CN118229938B - Color-imparting method, device, apparatus, medium and program product for point cloud model - Google Patents
- Publication number
- Publication number: CN118229938B (application CN202410659086.2A)
- Authority
- CN
- China
- Prior art keywords
- image
- point cloud
- point
- cloud model
- color
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts (G06T19/00 — Manipulating 3D models or images for computer graphics)
- G06T7/10 — Segmentation; Edge detection (G06T7/00 — Image analysis)
- G06T2207/10028 — Range image; Depth image; 3D point clouds (G06T2207/10 — Image acquisition modality)
- G06T2219/2012 — Colour editing, changing, or manipulating; Use of colour codes (G06T2219/20 — Indexing scheme for editing of 3D models)
Abstract
The application provides a method, a device, equipment, a medium and a program product for imparting color to a point cloud model, relating to the technical field of image processing. The method comprises the following steps: determining a segmented image based on an original panoramic image; performing image matching on the original panoramic image and the segmented image to obtain matching feature points, the matching feature points comprising first feature points determined based on the original panoramic image and second feature points determined based on the segmented image, the first feature points being matched in one-to-one correspondence with the second feature points; determining a first conversion matrix between the original panoramic image and the segmented image based on the first and second feature points; and coloring the point cloud model based on the segmented image, the first conversion matrix and a second conversion matrix, the second conversion matrix being a conversion matrix between the original panoramic image and the point cloud data. The method, device, equipment, medium and program product for imparting color to a point cloud model can reduce production cost.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, a medium, and a program product for imparting color to a point cloud model.
Background
LiDAR point cloud data contains high-precision coordinate information, but the laser band carries no color information. The purpose of point cloud coloring is therefore to color the point cloud by adding color attribute information, providing a better visual effect.
At present, major point cloud production companies have launched integrated acquisition products, through which a true-color point cloud model of a scene can be obtained efficiently using three-dimensional coloring technology. However, this approach relies too heavily on the images acquired by the panoramic camera during shooting; the reusability of the point cloud data is low, and the same area must be re-acquired and re-modeled for different periods, which increases acquisition cost.
Disclosure of Invention
The application provides a method, a device, equipment, a medium and a program product for imparting color to a point cloud model, which address the high cost of coloring a point cloud model in the prior art.
In a first aspect, the present application provides a method for imparting color to a point cloud model, comprising:
Determining a segmented image based on the original panoramic image;
Performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise a first feature point and a second feature point; the first feature point is determined based on the original panoramic image; the second feature point is determined based on the segmented image; the first feature points are matched in one-to-one correspondence with the second feature points;
Determining a first conversion matrix between the original panoramic image and the segmented image based on the first feature point and the second feature point;
Coloring the point cloud model based on the segmented image, the first conversion matrix and a second conversion matrix; the second conversion matrix is a conversion matrix between the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to a point of the point cloud data.
Optionally, the second conversion matrix is determined by calibration of the original panoramic image and the point cloud data;
the color adding of the point cloud model based on the segmented image, the first conversion matrix and the second conversion matrix comprises the following steps:
determining a conversion relation between the segmented image and the point cloud data based on the first conversion matrix and the second conversion matrix;
and based on the conversion relation, coloring RGB values of each point on the segmented image to each point of the corresponding point cloud model.
Optionally, the performing image matching on the original panoramic image and the segmented image to obtain matching feature points includes:
Respectively extracting image features of the original panoramic image and the segmented image based on a feature extraction algorithm to obtain image feature points;
and carrying out feature matching on the image feature points to obtain the matching feature points.
Optionally, the determining the segmented image based on the original panoramic image includes:
and based on the imaging principle of the original panoramic image, carrying out projection segmentation on the original panoramic image to obtain the segmented image.
Optionally, the method further comprises:
simultaneously acquiring real images in multiple directions;
and splicing the real images in the multiple directions into the original panoramic image.
Optionally, the method further comprises:
Determining the projection area of the segmented image through boundary points;
Judging the positional relationship between the first feature point and the projection area by a maximum-area method to obtain a judgment result;
And determining the quality of the image matching according to the judgment result.
In a second aspect, the present application further provides a color-imparting device for a point cloud model, including:
The first determining module is used for determining a segmented image based on the original panoramic image;
The matching module is used for performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise a first feature point and a second feature point; the first feature point is determined based on the original panoramic image; the second feature point is determined based on the segmented image; the first feature points are matched in one-to-one correspondence with the second feature points;
The second determining module is used for determining a first conversion matrix of the original panoramic image and the segmented image based on the first characteristic points and the second characteristic points;
The color-imparting module is used for imparting color to the point cloud model based on the segmented image, the first conversion matrix and the second conversion matrix; the second conversion matrix is a conversion matrix between the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to a point of the point cloud data.
In a third aspect, the application also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to the first aspect when executing the program.
In a fourth aspect, the application also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to the first aspect.
In a fifth aspect, the application also provides a computer program product comprising a computer program which, when executed by a processor, implements the method according to the first aspect.
According to the method, device, equipment, medium and program product for imparting color to a point cloud model, the first conversion matrix is determined by performing image matching between the original panoramic image and the segmented image, and the point cloud model is then colored using the segmented image, the first conversion matrix and the second conversion matrix. For elements such as buildings and terrain, the point cloud model does not change frequently; the method provided by the application therefore only requires collecting panoramic images of the target study area, without re-collecting point cloud data or re-modeling the point cloud, and can color an already-built or yet-to-be-built point cloud model. This realizes multiplexing of the point cloud data and reduces the cost of street view collection and urban elevation collection. The method can also be applied to urban three-dimensional model construction, oblique photography model construction and the like, optimizing the point cloud model update mode, improving data processing efficiency and reducing production cost.
Drawings
In order to more clearly illustrate the technical solutions of the application or of the prior art, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. It is apparent that the drawings described below show some embodiments of the application, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic flow chart of a method for imparting color to a point cloud model;
FIG. 2 is a schematic diagram of a color-imparting method of a point cloud model provided by the application;
FIG. 3 is a schematic diagram of a color-imparting device for a point cloud model according to the present application;
Fig. 4 is a schematic structural diagram of an electronic device provided by the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Fig. 1 is a schematic flow chart of a method for imparting color to a point cloud model according to an embodiment of the present application. Referring to fig. 1, an embodiment of the present application provides a method for imparting color to a point cloud model, whose execution subject may be an electronic device, for example a controller; the method may include:
Step 110, determining a segmented image based on the original panoramic image;
Step 120, performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise first feature points and second feature points; the first feature points are determined based on the original panoramic image; the second feature points are determined based on the segmented image; the first feature points are matched in one-to-one correspondence with the second feature points;
Step 130, determining a first transformation matrix of the original panoramic image and the segmented image based on the first feature point and the second feature point;
Step 140, coloring the point cloud model based on the segmented image, the first conversion matrix and the second conversion matrix; the second transformation matrix is a transformation matrix of the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; each point of the point cloud model corresponds to each point of the point cloud data.
In step 110, the controller may acquire an original panoramic image (panoramic sphere) of the target study area, and segment the original panoramic image to obtain a segmented image. For example, the original panoramic image may be segmented using a threshold-based segmentation method, an edge-based segmentation method, a region-based segmentation method, or the like, to obtain a segmented image.
In step 120, the controller may determine the resolution of the original panoramic image. The image resolution determines the number and quality of the feature points extracted, so the segmented image should also meet the resolution requirement in order to keep feature point extraction stable after the image undergoes compression, stretching and other transformations. In addition, before image matching, the controller can adjust image brightness by computing the mean gray value of the original panoramic image and of the segmented image, improving the robustness of feature point extraction. The controller can then perform image matching between the original panoramic image and the segmented image (for example, by brute-force matching or a nearest-neighbor search algorithm) to obtain matching feature points. The matching feature points comprise first feature points, determined based on the original panoramic image, and second feature points, determined based on the segmented image; the first feature points are matched in one-to-one correspondence with the second feature points.
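The brightness adjustment by mean gray value mentioned above can be sketched as follows — a minimal numpy example that assumes a simple mean-scaling scheme, since the patent does not fix the exact adjustment formula; `match_brightness` is a hypothetical helper name:

```python
import numpy as np

def match_brightness(src: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Scale the gray values of `src` so its mean matches that of `ref`."""
    src_mean = float(src.mean())
    ref_mean = float(ref.mean())
    if src_mean == 0.0:
        return src.copy()  # avoid division by zero on an all-black image
    adjusted = src.astype(np.float64) * (ref_mean / src_mean)
    return np.clip(adjusted, 0, 255).astype(np.uint8)
```

Equalizing mean brightness this way keeps gradient-based descriptors comparable between the panorama and its segmented tiles without altering relative contrast.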
In step 130, the controller may calculate the conversion relationship between the original panoramic image and the segmented image from the matched first and second feature points, and record it as the first conversion matrix X.
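Estimating the conversion matrix X from matched point pairs can be illustrated by the least-squares sketch below. The patent does not specify the matrix form, so an affine model is assumed here; `estimate_affine` is an illustrative name:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares estimate of a 2x3 affine matrix A such that
    A @ [x, y, 1] approximates (x', y') for every matched pair."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    ones = np.ones((len(src), 1))
    design = np.hstack([src, ones])                        # N x 3
    coeffs, *_ = np.linalg.lstsq(design, dst, rcond=None)  # 3 x 2
    return coeffs.T                                        # 2 x 3 affine
```

With three or more non-collinear matches the system is determined; extra matches are averaged out by the least-squares fit, which gives some robustness to small localization errors.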
In step 140, the controller may color the point cloud model based on the segmented image, the first conversion matrix X, and the second conversion matrix Y. The second transformation matrix Y is a transformation matrix of the original panoramic image and the point cloud data. Specifically, the controller may acquire color attributes of the divided image, and color is imparted to the point cloud model through the first conversion matrix and the second conversion matrix.
According to the point cloud model color-imparting method provided by the embodiment of the application, the first conversion matrix is determined by performing image matching between the original panoramic image and the segmented image, and the point cloud model is then colored using the segmented image, the first conversion matrix and the second conversion matrix. For elements such as buildings and terrain, the point cloud model does not change frequently; the method provided by the application therefore only requires collecting panoramic images of the target study area, without re-collecting point cloud data or re-modeling the point cloud, and can color an already-built or yet-to-be-built point cloud model. This realizes multiplexing of the point cloud data and reduces the cost of street view collection and urban elevation collection. The method can also be applied to urban three-dimensional model construction, oblique photography model construction and the like, optimizing the point cloud model update mode, improving data processing efficiency and reducing production cost.
Fig. 2 is a schematic diagram of the color-imparting method for a point cloud model provided by the application; it can be referred to together with the foregoing description.
In one embodiment, the second transformation matrix is determined by calibration of the original panoramic image and the point cloud data;
Coloring the point cloud model based on the segmented image, the first conversion matrix and the second conversion matrix comprises:
Determining a conversion relation between the segmented image and the point cloud data based on the first conversion matrix and the second conversion matrix;
Based on the conversion relation, the RGB values of each point on the divided image are colored to each point of the corresponding point cloud model.
In the application, the second conversion matrix Y between the original panoramic image and the point cloud data can be obtained through calibration. The controller may determine the conversion relationship between the segmented image and the point cloud data based on the first conversion matrix X and the second conversion matrix Y. For example, the point cloud data coordinate corresponding to any point P(x, y) on the segmented image is P′ = Y·X·P, with P taken in homogeneous form. Since each point of the point cloud model corresponds to a point of the point cloud data, the controller can, through this conversion relationship, color the RGB value of each point on the segmented image onto the corresponding point of the point cloud model, completing the update of the point cloud model.
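The chained conversion and coloring step can be sketched as follows. This assumes X and Y are 3x3 homogeneous matrices (the patent does not state their dimensions), and `colorize_points` is a hypothetical helper:

```python
import numpy as np

def colorize_points(seg_image, X, Y):
    """Map each pixel (u, v) of the segmented image into point-cloud
    coordinates via the chained homogeneous transform Y @ X, and pair
    the resulting coordinate with the pixel's RGB value."""
    h, w = seg_image.shape[:2]
    M = Y @ X  # segmented image -> panorama -> point cloud
    colored = []
    for v in range(h):
        for u in range(w):
            p = M @ np.array([u, v, 1.0])
            p = p / p[2]  # back from homogeneous coordinates
            rgb = tuple(int(c) for c in seg_image[v, u])
            colored.append(((p[0], p[1]), rgb))
    return colored
```

Composing the two matrices once (M = Y·X) avoids transforming every pixel through the panorama explicitly, which is the practical benefit of chaining the calibrated matrix Y with the per-image matrix X.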
According to the color-imparting method for the point cloud model, provided by the embodiment of the application, the color is imparted to the point cloud model based on the segmented image, the first conversion matrix and the second conversion matrix, and only the panoramic image of the target research area is required to be acquired, and the point cloud data and the point cloud modeling are not required to be acquired again, so that the color is imparted to the point cloud model which is built or not built, the multiplexing of the point cloud data is realized, and the production cost is reduced.
In one embodiment, performing image matching on the original panoramic image and the segmented image to obtain matching feature points, including:
respectively extracting image features of the original panoramic image and the segmented image based on a feature extraction algorithm to obtain image feature points;
and performing feature matching on the image feature points to obtain matching feature points.
The controller can extract image features of the original panoramic image and of the segmented image using a feature extraction algorithm, in order to perform image feature matching. Specifically, feature extraction algorithms may include the Scale-Invariant Feature Transform (SIFT), Speeded-Up Robust Features (SURF), Harris corner detection, and the like.
After the image features are extracted, the image feature points of the original panoramic image and the segmented image are obtained. The controller can obtain the matching feature points by performing feature matching on the image feature points, specifically by nearest-neighbor computation over the multi-dimensional feature vectors.
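The nearest-neighbor matching over descriptor vectors can be illustrated by a brute-force numpy sketch. The ratio test shown is a common refinement assumed here for illustration, not something the patent prescribes:

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Brute-force nearest-neighbour matching of descriptor rows,
    keeping a match only when the best distance is clearly smaller
    than the second-best (Lowe-style ratio test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best = int(order[0])
        second = int(order[1]) if len(order) > 1 else None
        if second is None or dists[best] < ratio * dists[second]:
            matches.append((i, best))
    return matches
```

For large descriptor sets a k-d tree or approximate nearest-neighbor index would replace the O(N·M) loop, but the acceptance criterion stays the same.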
According to the point cloud model color-imparting method provided by the embodiment of the application, the original panoramic image and the segmented image are subjected to image matching through feature extraction and feature matching to obtain the matched feature points, so that the first conversion matrix can be determined, the point cloud model is further colored, and data preparation is provided for the point cloud color imparting, so that the multiplexing of the point cloud data is realized, and the production cost is reduced.
In one embodiment, determining a segmented image based on the original panoramic image includes:
based on the imaging principle of the original panoramic image, the original panoramic image is subjected to projection segmentation to obtain a segmented image.
The controller can perform projection segmentation on the original panoramic image according to its imaging principle (the panoramic sphere) to obtain segmented images. Specifically, the controller may project with the image coordinate (0, 0) as the starting point, the y axis along the 0° meridian direction and the x axis along the equator of the panoramic sphere, and divide the projected original panoramic image into a plurality of segmented images bounded by different longitudes and latitudes. Projecting the segmented images in this manner supports panoramic image segmentation of arbitrary size (i.e., the segmentation size changes with the projection range).
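The longitude/latitude-bounded segmentation can be sketched for an equirectangular panorama, where image columns map to longitude (the equator direction) and rows to latitude; the tile counts are illustrative parameters, not values from the patent:

```python
import numpy as np

def split_panorama(pano, n_lon, n_lat):
    """Split an equirectangular panorama into n_lat x n_lon tiles
    bounded by equal longitude/latitude steps."""
    h, w = pano.shape[:2]
    tiles = []
    for i in range(n_lat):          # latitude bands (rows)
        for j in range(n_lon):      # longitude bands (columns)
            r0, r1 = i * h // n_lat, (i + 1) * h // n_lat
            c0, c1 = j * w // n_lon, (j + 1) * w // n_lon
            tiles.append(pano[r0:r1, c0:c1])
    return tiles
```

Changing `n_lon` and `n_lat` changes the size of each segmented image, which matches the arbitrary-size segmentation described above.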
According to the color-imparting method for the point cloud model, provided by the embodiment of the application, the original panoramic image is subjected to projection segmentation to obtain the segmented image, so that the image matching precision can be ensured, and the color-imparting accuracy of the point cloud can be ensured.
In one embodiment, the point cloud model color-imparting method further comprises:
simultaneously acquiring real images in multiple directions;
and splicing the real images in multiple directions into an original panoramic image.
The application may take Ladybug panoramic images as an example. Ladybug panoramic imaging systems are an industry standard for panoramic cameras and are widely used for geographic information system (GIS) data acquisition, street view map making, and the like. The controller can simultaneously collect real images in 6 directions, stitch them, and output the stitched original panoramic image (panoramic sphere).
According to the color-imparting method for the point cloud model, the original panoramic image is obtained by collecting and splicing the real images in multiple directions, so that data preparation is provided for color imparting of the point cloud, multiplexing of the point cloud data is ensured, and therefore production cost is reduced.
In one embodiment, the point cloud model color-imparting method further comprises:
Determining the projection area of the segmented image through the boundary points;
judging the positional relationship between the first feature point and the projection area by a maximum-area method to obtain a judgment result;
And determining the quality of image matching according to the judgment result.
After image matching, the controller may further check the quality of the image feature point matching. The controller can determine the projection areas of the original panoramic image and the segmented image from their image boundary points, and then project the feature points. Assuming the feature point coordinates of image A project onto image B at (x, y), the controller can judge whether a feature point of image A lies outside the projection area of image B by a maximum-area method (for any point falling in the plane, compute the area of the figure formed by that point and the four corner points of the plane region), and thereby compute the proportion of points falling inside the projection area; the higher the proportion, the higher the image matching quality.
Specifically, the controller can determine the projection area of the segmented image from its boundary points, judge the positional relationship between the first feature points and that projection area by the maximum-area method, and determine the image matching quality from the judgment result. Likewise, the projection area of the original panoramic image can be determined from its boundary points, the positional relationship between the second feature points and that projection area judged by the maximum-area method, and the image matching quality determined from the judgment result.
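The maximum-area test can be sketched in plain Python: a point lies inside a quadrilateral exactly when the four triangles it forms with consecutive corner pairs sum to the quadrilateral's own area. This sketch assumes a convex projection region, and the helper names are illustrative:

```python
def tri_area(a, b, c):
    """Unsigned area of the triangle (a, b, c) via the cross product."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def inside_quad(p, quad):
    """Maximum-area test: p is inside the quadrilateral `quad`
    (4 corners in order) iff the triangles (p, corner_k, corner_{k+1})
    sum to the quadrilateral's own area; outside points give a larger sum."""
    quad_area = tri_area(*quad[:3]) + tri_area(quad[0], quad[2], quad[3])
    total = sum(tri_area(p, quad[k], quad[(k + 1) % 4]) for k in range(4))
    return abs(total - quad_area) < 1e-9
```

Counting how many projected feature points pass this test gives the in-area proportion used to score the matching quality.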
According to the point cloud model color-imparting method provided by the embodiment of the application, the quality of image matching is checked: when the matching quality is qualified, point cloud coloring proceeds directly; when it is unqualified, matching is performed again until it is qualified, and only then is the point cloud colored. This further ensures the accuracy of the point cloud coloring.
The color-imparting device for a point cloud model provided by the application is described below, and the color-imparting device for a point cloud model described below and the color-imparting method for a point cloud model described above can be referred to correspondingly.
Fig. 3 is a schematic structural diagram of a color-imparting device for a point cloud model according to an embodiment of the present application. Referring to fig. 3, a color-imparting device for a point cloud model according to an embodiment of the present application may include:
A first determining module 310, configured to determine a segmented image based on the original panoramic image;
The matching module 320 is configured to perform image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise a first feature point and a second feature point; the first feature point is determined based on the original panoramic image; the second feature point is determined based on the segmented image; the first feature points are matched in one-to-one correspondence with the second feature points;
A second determining module 330, configured to determine a first transformation matrix of the original panoramic image and the segmented image based on the first feature point and the second feature point;
A color-imparting module 340 for imparting color to the point cloud model based on the divided image, the first conversion matrix, and the second conversion matrix; the second transformation matrix is a transformation matrix of the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to each point of the point cloud data.
According to the point cloud model color-imparting device provided by the embodiment of the application, the first conversion matrix is determined by performing image matching between the original panoramic image and the segmented image, and the point cloud model is then colored using the segmented image, the first conversion matrix and the second conversion matrix. For elements such as buildings and terrain, the point cloud model does not change frequently; the device provided by the application therefore only requires collecting panoramic images of the target study area, without re-collecting point cloud data or re-modeling the point cloud, and can color an already-built or yet-to-be-built point cloud model. This realizes multiplexing of the point cloud data and reduces the cost of street view collection and urban elevation collection. The device can also be applied to urban three-dimensional model construction, oblique photography model construction and the like, optimizing the point cloud model update mode, improving data processing efficiency and reducing production cost.
Specifically, the color imparting device for point cloud model provided by the embodiment of the present application can implement all the method steps implemented by the method embodiment in which the execution body is a controller, and can achieve the same technical effects, and detailed descriptions of the same parts and beneficial effects as those of the method embodiment in the embodiment are omitted herein.
Fig. 4 illustrates a physical schematic diagram of an electronic device. As shown in Fig. 4, the electronic device may include: a processor 410, a communication interface 420, a memory 430, and a communication bus 440, wherein the processor 410, the communication interface 420, and the memory 430 communicate with each other via the communication bus 440. The processor 410 may invoke logic instructions in the memory 430 to perform the point cloud model color-imparting method, which includes, for example:
Determining a segmented image based on the original panoramic image;
Performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise first feature points and second feature points; the first feature points are determined based on the original panoramic image; the second feature points are determined based on the segmented image; and the first feature points are matched in one-to-one correspondence with the second feature points;
Determining a first conversion matrix between the original panoramic image and the segmented image based on the first feature points and the second feature points;
Adding color to the point cloud model based on the segmented image, the first conversion matrix, and the second conversion matrix; the second conversion matrix is a conversion matrix between the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to a point of the point cloud data.
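The first conversion matrix in the steps above relates pixel coordinates in the original panoramic image to pixel coordinates in the segmented image via matched feature points. As an illustrative sketch only (not the patented implementation), such a 3x3 matrix can be recovered from four or more point correspondences with the classic direct linear transform (DLT); the function name and the NumPy-only setup are assumptions:

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 matrix H with dst ~ H @ src via the DLT method.

    src_pts, dst_pts: (N, 2) arrays of matched pixel coordinates, N >= 4.
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.array(rows)
    # The solution is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

# Toy example: four matched corners related by a pure translation (+10, +5).
src = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
dst = src + np.array([10.0, 5.0])
H = estimate_homography(src, dst)
```

In practice a robust estimator (for example RANSAC over the matched feature points) would be used instead of a plain least-squares fit, since panoramic feature matches typically contain outliers.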
Further, the logic instructions in the memory 430 may be implemented in the form of software functional units and, when sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In another aspect, the present application also provides a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of the point cloud model color-imparting method provided by the above methods, which includes, for example:
Determining a segmented image based on the original panoramic image;
Performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise first feature points and second feature points; the first feature points are determined based on the original panoramic image; the second feature points are determined based on the segmented image; and the first feature points are matched in one-to-one correspondence with the second feature points;
Determining a first conversion matrix between the original panoramic image and the segmented image based on the first feature points and the second feature points;
Adding color to the point cloud model based on the segmented image, the first conversion matrix, and the second conversion matrix; the second conversion matrix is a conversion matrix between the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to a point of the point cloud data.
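The first of the steps above, determining a segmented image from the original panoramic image, is commonly done by reprojecting an equirectangular panorama into perspective views according to the panorama's imaging principle. The following NumPy sketch illustrates that idea under stated assumptions; the function name, axis conventions, and nearest-neighbour sampling are illustrative choices, not the application's actual algorithm:

```python
import numpy as np

def equirect_to_perspective(pano, fov_deg, yaw_deg, size):
    """Cut one pinhole-camera view out of an equirectangular panorama.

    pano: (H, W, 3) uint8 equirectangular image; yaw_deg rotates the view
    about the vertical axis. Nearest-neighbour sampling for brevity.
    """
    h, w = pano.shape[:2]
    f = (size / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels
    ys, xs = np.mgrid[0:size, 0:size]
    # Camera ray for each output pixel (z forward, x right, y down).
    x = xs - size / 2
    y = ys - size / 2
    z = np.full_like(x, f, dtype=float)
    yaw = np.radians(yaw_deg)
    xr = x * np.cos(yaw) + z * np.sin(yaw)
    zr = -x * np.sin(yaw) + z * np.cos(yaw)
    lon = np.arctan2(xr, zr)                 # longitude in [-pi, pi]
    lat = np.arctan2(y, np.hypot(xr, zr))    # latitude in [-pi/2, pi/2]
    # Map spherical coordinates back to panorama pixel indices.
    u = ((lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    v = ((lat / np.pi + 0.5) * h).astype(int).clip(0, h - 1)
    return pano[v, u]
```

Several such views at different yaw angles together cover the panorama, giving the set of segmented images that are then matched against the original.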
In yet another aspect, the present application further provides a computer program product comprising a computer program; the computer program may be stored on a non-transitory computer-readable storage medium, and when executed by a processor, the computer program is capable of performing the steps of the point cloud model color-imparting method provided by the methods described above, which includes, for example:
Determining a segmented image based on the original panoramic image;
Performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise first feature points and second feature points; the first feature points are determined based on the original panoramic image; the second feature points are determined based on the segmented image; and the first feature points are matched in one-to-one correspondence with the second feature points;
Determining a first conversion matrix between the original panoramic image and the segmented image based on the first feature points and the second feature points;
Adding color to the point cloud model based on the segmented image, the first conversion matrix, and the second conversion matrix; the second conversion matrix is a conversion matrix between the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to a point of the point cloud data.
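The final coloring step composes the two matrices into a conversion relation from point cloud coordinates to segmented-image pixels and copies an RGB value per point. A minimal NumPy sketch, assuming a hypothetical 3x4 second conversion matrix `P2` projecting point cloud coordinates to panorama pixels and a 3x3 first conversion matrix `H1` mapping panorama pixels into the segmented image (both names and shapes are illustrative, not the application's definitions):

```python
import numpy as np

def colorize_points(points, seg_image, H1, P2):
    """Assign an RGB value to every 3-D point.

    points: (N, 3) point cloud coordinates.
    P2: (3, 4) matrix projecting homogeneous points to panorama pixels.
    H1: (3, 3) matrix mapping panorama pixels to segmented-image pixels.
    Points projecting outside the segmented image keep the color black.
    """
    n = len(points)
    homog = np.hstack([points, np.ones((n, 1))])   # (N, 4) homogeneous points
    pano_px = (P2 @ homog.T).T                     # homogeneous panorama pixels
    seg_px = (H1 @ pano_px.T).T
    seg_px = seg_px[:, :2] / seg_px[:, 2:3]        # dehomogenize to (u, v)
    h, w = seg_image.shape[:2]
    u = np.round(seg_px[:, 0]).astype(int)
    v = np.round(seg_px[:, 1]).astype(int)
    colors = np.zeros((n, 3), dtype=seg_image.dtype)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors[inside] = seg_image[v[inside], u[inside]]
    return colors
```

Because each point of the point cloud model corresponds to a point of the point cloud data, the returned per-point colors can be written directly onto the model's points.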
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the solution without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general hardware platform, or, of course, by means of hardware. Based on this understanding, the foregoing technical solution, in essence, or the part contributing to the prior art, may be embodied in the form of a software product stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods described in the respective embodiments or in some parts of the embodiments.
In addition, it should be noted that the terms "first", "second", and the like in the embodiments of the present application are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. Moreover, the objects distinguished by "first" and "second" are generally of one type, and the number of objects is not limited; for example, the first object may be one or more.
In the embodiments of the present application, the term "and/or" describes the association relationship of associated objects and indicates that three relationships can exist; for example, "A and/or B" can mean: A exists alone, A and B both exist, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship.
In the embodiments of the present application, "determining B based on A" means that A is taken into account when determining B. This is not limited to "B is determined based on A alone"; it also includes "B is determined based on A and C", "B is determined based on A, C, and E", "C is determined based on A, and B is further determined based on C", and the like. Additionally, A may be a condition for determining B; for example, "when A satisfies a first condition, B is determined using a first method"; or "when A satisfies a second condition, B is determined"; or "when A satisfies a third condition, B is determined based on a first parameter". Of course, A may also be a condition under which a factor for determining B is obtained; for example, "when A satisfies the first condition, C is determined using the first method, and B is further determined based on C".
The term "plurality" in the embodiments of the present application means two or more; other quantifiers are interpreted similarly.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present application, not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. A method for imparting color to a point cloud model, comprising:
Determining a segmented image based on the original panoramic image;
Performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise first feature points and second feature points; the first feature points are determined based on the original panoramic image; the second feature points are determined based on the segmented image; and the first feature points are matched in one-to-one correspondence with the second feature points;
Determining a first conversion matrix between the original panoramic image and the segmented image based on the first feature points and the second feature points;
Adding color to the point cloud model based on the segmented image, the first conversion matrix, and the second conversion matrix; the second conversion matrix is a conversion matrix between the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to a point of the point cloud data.
2. The method according to claim 1, wherein the second conversion matrix is determined by calibrating the original panoramic image with the point cloud data;
the adding color to the point cloud model based on the segmented image, the first conversion matrix, and the second conversion matrix comprises:
determining a conversion relation between the segmented image and the point cloud data based on the first conversion matrix and the second conversion matrix;
and based on the conversion relation, assigning the RGB value of each point on the segmented image to the corresponding point of the point cloud model.
3. The method of color-imparting to a point cloud model according to claim 1, wherein said image matching the original panoramic image with the segmented image to obtain matching feature points comprises:
Respectively extracting image features of the original panoramic image and the segmented image based on a feature extraction algorithm to obtain image feature points;
and carrying out feature matching on the image feature points to obtain the matching feature points.
4. The method of color-imparting a point cloud model of claim 1, wherein the determining a segmented image based on the original panoramic image comprises:
and based on the imaging principle of the original panoramic image, carrying out projection segmentation on the original panoramic image to obtain the segmented image.
5. The method of imparting color to a point cloud model of claim 1, further comprising:
simultaneously acquiring real images in multiple directions;
and stitching the real images in the multiple directions into the original panoramic image.
6. The method of imparting color to a point cloud model of any one of claims 1 to 5, further comprising:
Determining the projection area of the segmented image through boundary points;
Judging the positional relation between the first feature points and the projection area by a maximum-area method to obtain a judgment result;
And determining the quality of the image matching according to the judgment result.
7. A point cloud model color-imparting device, comprising:
The first determining module is used for determining a segmented image based on the original panoramic image;
The matching module is used for performing image matching on the original panoramic image and the segmented image to obtain matching feature points; the matching feature points comprise first feature points and second feature points; the first feature points are determined based on the original panoramic image; the second feature points are determined based on the segmented image; and the first feature points are matched in one-to-one correspondence with the second feature points;
The second determining module is used for determining a first conversion matrix of the original panoramic image and the segmented image based on the first characteristic points and the second characteristic points;
The color-imparting module is used for imparting color to the point cloud model based on the segmented image, the first conversion matrix, and the second conversion matrix; the second conversion matrix is a conversion matrix between the original panoramic image and the point cloud data; the point cloud model is determined based on the point cloud data; and each point of the point cloud model corresponds to a point of the point cloud data.
8. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the point cloud model color-imparting method of any one of claims 1 to 6 when executing the program.
9. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the point cloud model color-imparting method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the point cloud model color-imparting method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410659086.2A CN118229938B (en) | 2024-05-27 | 2024-05-27 | Color-imparting method, device, apparatus, medium and program product for point cloud model |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118229938A CN118229938A (en) | 2024-06-21 |
CN118229938B true CN118229938B (en) | 2024-08-02 |
Family
ID=91498162
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410659086.2A Active CN118229938B (en) | 2024-05-27 | 2024-05-27 | Color-imparting method, device, apparatus, medium and program product for point cloud model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118229938B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114677435A (en) * | 2021-07-20 | 2022-06-28 | 武汉海云空间信息技术有限公司 | Point cloud panoramic fusion element extraction method and system |
CN116596741A (en) * | 2023-04-10 | 2023-08-15 | 北京城市网邻信息技术有限公司 | Point cloud display diagram generation method and device, electronic equipment and storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112837419B (en) * | 2021-03-04 | 2022-06-24 | 浙江商汤科技开发有限公司 | Point cloud model construction method, device, equipment and storage medium |
CN115512055A (en) * | 2022-09-13 | 2022-12-23 | 武汉大学 | Method and device for performing indoor structure three-dimensional reconstruction based on two-dimensional video and computer equipment |
CN116597168B (en) * | 2023-07-18 | 2023-11-17 | 齐鲁空天信息研究院 | Matching method, device, equipment and medium of vehicle-mounted laser point cloud and panoramic image |
CN117115211A (en) * | 2023-08-02 | 2023-11-24 | 浙江大华技术股份有限公司 | Point cloud coloring method, point cloud coloring apparatus, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||