CN117746055A - Algorithm for extracting stereo feature based on 3D point cloud data - Google Patents


Info

Publication number: CN117746055A
Application number: CN202311791458.9A
Authority: CN (China)
Prior art keywords: point cloud, point cloud data, points, workpiece
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 曹金刚, 李�荣
Current Assignee: Jiangsu Xingshen Technology Group Co ltd
Original Assignee: Jiangsu Xingshen Technology Group Co ltd
Application filed by Jiangsu Xingshen Technology Group Co ltd

Abstract

The invention discloses an algorithm for extracting three-dimensional features based on 3D point cloud data, relating to the technical field of point cloud data processing.

Description

Algorithm for extracting stereo feature based on 3D point cloud data
Technical Field
The invention relates to the technical field of point cloud data processing, in particular to an algorithm for extracting three-dimensional features based on 3D point cloud data.
Background
With the development of new-generation manufacturing technology, judging workpiece quality manually during production can no longer keep pace with production requirements, and machine-based quality inspection has gradually become the dominant mode of quality inspection.
An existing point-cloud-based workpiece quality detection method scans the produced workpiece, automatically marks the edge contour points of the target workpiece after its point cloud data are acquired, builds an edge contour model of the target workpiece from the edge contour information collected at different angles, and overlays it on the design model of the target workpiece. The volume ratio of missing or protruding portions of the model is recorded to judge whether the workpiece quality is acceptable. Such data-driven judgment is more accurate, and quality inspection precision is greatly improved.
However, this method still has problems: the acquired point cloud data of the target workpiece contain noise interference points and a large number of redundant points, so the edge contour information obtained by marking contour points directly on the raw data tends to deviate considerably from the actual workpiece contour. The acquired point cloud data must therefore be cleaned to reduce interference points, and the accuracy of the edge contour model still needs to be improved.
Disclosure of Invention
In order to overcome the above-mentioned drawbacks of the prior art, an embodiment of the present invention provides an algorithm for extracting stereo features based on 3D point cloud data, so as to solve the problems set forth in the above-mentioned background art.
In order to achieve the above purpose, the present invention provides the following technical solutions: an algorithm for extracting three-dimensional features based on 3D point cloud data comprises the following steps:
s1: scanning the target workpiece by using a laser scanning instrument to obtain original workpiece point cloud data;
s2: calculating the principal component direction of the point cloud data based on a principal component analysis principle and establishing a point cloud space coordinate system;
s3: inputting each point in the target workpiece point cloud data, performing neighborhood statistics on each point, removing large-scale outlier point cloud data after calculating the average Euclidean distance from each point to its neighborhood points, and smoothing the single-view point cloud data of the target workpiece with moving least squares after the outliers are removed, to obtain a multi-view smooth curved surface of the target workpiece;
s4: dividing the source workpiece point cloud data and the processed target workpiece point cloud data by adopting a DBSCAN clustering algorithm based on density;
s5: introducing an angle threshold value to the segmented source workpiece point cloud data and target workpiece point cloud data to extract contour points, and taking the contour points extracted from the source workpiece point cloud data and the target workpiece point cloud data as characteristic points for matching the two pieces of point cloud data;
s6: performing corresponding matching based on the characteristic points extracted from the source workpiece point cloud data and the target workpiece point cloud data;
s7: recording the number of point clouds and the point cloud space volume before and after removal of the large-scale outlier point cloud data and smoothing, the source workpiece point cloud space volume and number of point clouds, and the point cloud data cleaning time; recording the number of feature points and the number of point clouds extracted from the source workpiece and target workpiece point cloud data sets; and recording the number of correctly matched feature points;
s8: calculating a point cloud data fitting degree coefficient, a point cloud data correction efficiency index, an average extraction rate of characteristic points and characteristic point matching accuracy based on the recorded data;
s9: and calculating a feature extraction accuracy index based on the point cloud data correction efficiency index, the feature point average extraction rate and the feature point matching accuracy.
Preferably, the specific steps for constructing the point cloud coordinate system in step S2 are as follows:
s21, counting the number of three-dimensional points in the point cloud set, denoted n_a; for a point X = (x, y, z)^T ∈ R³ in the cloud, the number of points contained in its neighborhood is denoted n_b, n_b ≤ n_a; the points in the point cloud set are denoted p_ai(x_ai, y_ai, z_ai), i = 1, …, n_a, and the centroid of the point cloud is denoted o(x_o, y_o, z_o), where:
x_o = (1/n_a) Σ x_ai,  y_o = (1/n_a) Σ y_ai,  z_o = (1/n_a) Σ z_ai;
s22, calculating the covariances of the X, Y and Z coordinate values of the points in the point cloud neighborhood, denoted cov(X,X), cov(X,Y), cov(X,Z), cov(Y,X), cov(Y,Y), cov(Y,Z), cov(Z,X), cov(Z,Y) and cov(Z,Z), where, for example:
cov(X,Y) = (1/n_b) Σ (x_ai − X_a)(y_ai − Y_a),
and X_a, Y_a, Z_a are respectively the averages of the X, Y and Z coordinate values of the points in the point cloud neighborhood;
s23, assembling the obtained covariances into the covariance matrix D ∈ R^(3×3):
D = [ cov(X,X) cov(X,Y) cov(X,Z); cov(Y,X) cov(Y,Y) cov(Y,Z); cov(Z,X) cov(Z,Y) cov(Z,Z) ];
since the covariance matrix is symmetric positive definite, it has three eigenvalues λ_1 ≥ λ_2 ≥ λ_3, whose corresponding eigenvectors are denoted V_1, V_2, V_3;
s24, taking the centroid coordinate o as the origin of the target workpiece point cloud space coordinate system and the eigenvectors V_1, V_2, V_3 as its direction vectors, the point cloud space coordinate system of the target workpiece is determined; coordinate points in this system are denoted p_bi(x_bi, y_bi, z_bi).
Preferably, the specific step of eliminating the abnormal distance distribution point cloud data in the step S3 is as follows:
s311, inputting a target point cloud coordinate p_bi and searching for its neighboring points, with the search radius denoted r_a;
s312, calculating the Euclidean distance d_aj between the target point cloud coordinate p_bi and each remaining point cloud coordinate p_bj within the search radius:
d_aj = √((x_bi − x_bj)² + (y_bi − y_bj)² + (z_bi − z_bj)²);
s313, collecting the Euclidean distances d_cj between each target point cloud coordinate and its nearest neighboring points within the search radius, averaging them to obtain that point's average distance d_ci, and then calculating the overall average Euclidean distance d_e and standard deviation σ_a:
d_e = (1/n_a) Σ d_ci,  σ_a = √((1/n_a) Σ (d_ci − d_e)²);
s314, setting the rejection threshold T_a:
T_a = d_e + μ_a·σ_a, where μ_a is an adjustment parameter;
s315, eliminating from the point cloud the points whose average distance satisfies d_ci > T_a.
Preferably, in step S3, the specific steps of smoothing the single view point cloud data after removing the large-scale outlier are as follows:
s321, inputting target point cloud coordinates to search neighbor points of the target point cloud coordinates;
s322, constructing a fitting surface constraint equation set based on the searched neighboring point coordinates;
s323, solving the constraint equation set to obtain a best fit curved surface at the cloud coordinates of the target point, and replacing points in the point cloud data with points on the curved surface;
s324, repeating the process until the whole point cloud is smoothed point by point to obtain a smooth curved surface.
Preferably, the specific steps of feature point extraction in step S5 are as follows:
s51, projecting the target point p_b and its n_c neighboring points within the search radius onto the least-squares plane constructed from those points; the nearest neighbor of the target point is denoted p_t1, the vector p_b p_t1 is taken as the reference vector, and each remaining neighboring point p_tj gives a vector p_b p_tj;
s52, calculating the angle α_j between the vectors p_b p_t1 and p_b p_tj, α_j ∈ [0, π], together with their cross product vector n_j = p_b p_t1 × p_b p_tj, the first of which is taken as the reference direction; when n_j points to the same side as the reference direction, α_j is kept unchanged, and when it points to the opposite side, α_j is replaced by 2π − α_j;
s53, arranging the vector angle sequence A = {α_1, α_2, …, α_nc} in ascending order and adding the two extreme angles to obtain the new sequence A = {0, α_1, α_2, …, α_nc, 2π}; the angle θ_i between each pair of adjacent line segments is calculated as θ_i = α_(i+1) − α_i;
s54, judging whether the target point is a boundary contour point according to whether any θ_i exceeds the set threshold, and marking the contour points so determined as feature points.
Preferably, the specific data processing procedure in step S8 is as follows:
s81, from the recorded number of point clouds m_a and point cloud space volume V_a before removal of the large-scale outlier point cloud data and smoothing, together with the source workpiece point cloud distribution space volume V_c and number of point clouds m_c, calculating the point cloud data fitting degree ε_a; from the recorded number of point clouds m_b and space volume V_b after removal of the large-scale outlier point cloud data and smoothing, together with V_c and m_c, calculating the point cloud data fitting degree ε_b;
s82, from the point cloud data distribution fitting degrees before and after removal of the large-scale outlier point cloud data and smoothing, together with the point cloud data cleaning time t_a, calculating the point cloud data impurity removal efficiency index y_a, where c_a is an empirical index, c_a > 0;
s83, from the number of feature points f_a and the number of point clouds f_c of a target workpiece point cloud data set, calculating the feature point extraction rate γ_a = f_a / f_c;
s84, summarizing the feature point extraction rates γ_ai of the different target workpiece point cloud data sets and calculating the feature point average extraction rate γ_e as their mean, γ_e = (1/n) Σ γ_ai;
s85, from the number of correctly matched feature points f_b and the number of feature points f_d extracted from the source workpiece point cloud data, calculating the feature point matching accuracy z_a, where c_1, c_2 and c_3 are adjustment coefficients, c_1 > 0, c_2 > 0, c_3 > 0.
Preferably, in step S9, the feature extraction accuracy index U_a is calculated as a weighted combination of the point cloud data correction efficiency index, the feature point average extraction rate and the feature point matching accuracy, where l_1, l_2 and l_3 are the proportional coefficients of the corresponding influencing factors, l_1 > 0, l_2 > 0, l_3 > 0.
The invention has the technical effects and advantages that:
according to the method, after the original workpiece point cloud data are obtained, the principal component direction of the point cloud data is calculated based on a principal component analysis principle, and a point cloud space coordinate system is established, the coordinate system can well reflect the shape of a target workpiece, meanwhile, data support is provided for the subsequent data processing process, after the target workpiece point cloud space coordinate is determined, large-scale outlier point cloud data are removed after the average Euclidean distance from a target point to a neighborhood point is calculated, and the target workpiece single-view point cloud data are subjected to smooth processing by adopting least square of movement to obtain a target workpiece multi-view smooth curved surface, so that the influence of noise interference points is reduced, the fitting degree of the target workpiece point cloud data and actual parameters of the target workpiece is improved, and the accuracy of the subsequent feature point extraction and feature point matching is improved.
Drawings
FIG. 1 is a process step diagram of the present invention;
fig. 2 is a block diagram of the system architecture of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment as shown in fig. 1 provides an algorithm for extracting stereo features based on 3D point cloud data, which includes the following steps:
s1: scanning the target workpiece by using a laser scanning instrument to obtain original workpiece point cloud data;
In this embodiment, it should be specifically noted that the laser scanning apparatus used in step S1 may be a grating three-dimensional scanner. The optical method adopted is the structured-light method: a specific line grating is projected, and the three-dimensional information of the target workpiece at different viewing angles is calculated from the deformation of the line grating on the workpiece surface at those angles.
S2: calculating the principal component direction of the point cloud data based on a principal component analysis principle and establishing a point cloud space coordinate system;
further, the specific steps for constructing the point cloud coordinate system in the step S2 are as follows:
s21, counting the number of three-dimensional points in the point cloud set, denoted n_a; for a point X = (x, y, z)^T ∈ R³ in the cloud, the number of points contained in its neighborhood is denoted n_b, n_b ≤ n_a; the points in the point cloud set are denoted p_ai(x_ai, y_ai, z_ai), i = 1, …, n_a, and the centroid of the point cloud is denoted o(x_o, y_o, z_o), where:
x_o = (1/n_a) Σ x_ai,  y_o = (1/n_a) Σ y_ai,  z_o = (1/n_a) Σ z_ai;
s22, calculating the covariances of the X, Y and Z coordinate values of the points in the point cloud neighborhood, denoted cov(X,X), cov(X,Y), cov(X,Z), cov(Y,X), cov(Y,Y), cov(Y,Z), cov(Z,X), cov(Z,Y) and cov(Z,Z), where, for example:
cov(X,Y) = (1/n_b) Σ (x_ai − X_a)(y_ai − Y_a),
and X_a, Y_a, Z_a are respectively the averages of the X, Y and Z coordinate values of the points in the point cloud neighborhood;
s23, assembling the obtained covariances into the covariance matrix D ∈ R^(3×3):
D = [ cov(X,X) cov(X,Y) cov(X,Z); cov(Y,X) cov(Y,Y) cov(Y,Z); cov(Z,X) cov(Z,Y) cov(Z,Z) ];
since the covariance matrix is symmetric positive definite, it has three eigenvalues λ_1 ≥ λ_2 ≥ λ_3, whose corresponding eigenvectors are denoted V_1, V_2, V_3;
s24, taking the centroid coordinate o as the origin of the target workpiece point cloud space coordinate system and the eigenvectors V_1, V_2, V_3 as its direction vectors, the point cloud space coordinate system of the target workpiece is determined; coordinate points in this system are denoted p_bi(x_bi, y_bi, z_bi).
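Steps S21-S24 amount to a principal component analysis of the cloud. A minimal NumPy sketch (the function name and the descending eigenvalue ordering are illustrative choices, not taken from the patent):

```python
import numpy as np

def build_pca_frame(points):
    """Build a point cloud coordinate frame by PCA, as in steps S21-S24.

    points: (n_a, 3) array. Returns (origin, axes, transformed points),
    where the rows of `axes` are the eigenvectors V1, V2, V3 of the
    covariance matrix D, sorted by descending eigenvalue.
    """
    o = points.mean(axis=0)                  # centroid o(x_o, y_o, z_o)
    centered = points - o
    D = centered.T @ centered / len(points)  # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(D)     # eigh: for symmetric matrices
    order = np.argsort(eigvals)[::-1]        # descending eigenvalues
    axes = eigvecs[:, order].T               # rows: V1, V2, V3
    p_b = centered @ axes.T                  # coordinates p_bi in the new frame
    return o, axes, p_b
```

`np.linalg.eigh` is used because D is symmetric, which guarantees real eigenvalues and orthonormal eigenvectors for the coordinate axes.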
S3: inputting each point in the target workpiece point cloud data, carrying out neighborhood statistics on each point, removing large-scale outlier point cloud data after calculating the average Euclidean distance from the point to the neighborhood point, and carrying out smoothing treatment on the target workpiece single-view point cloud data after removing the large-scale outlier point cloud data by adopting moving least square to obtain a target workpiece multi-view smooth curved surface;
further, the specific step of eliminating the abnormal distance distribution point cloud data in the step S3 is as follows:
s311, inputting a target point cloud coordinate p_bi and searching for its neighboring points, with the search radius denoted r_a;
s312, calculating the Euclidean distance d_aj between the target point cloud coordinate p_bi and each remaining point cloud coordinate p_bj within the search radius:
d_aj = √((x_bi − x_bj)² + (y_bi − y_bj)² + (z_bi − z_bj)²);
s313, collecting the Euclidean distances d_cj between each target point cloud coordinate and its nearest neighboring points within the search radius, averaging them to obtain that point's average distance d_ci, and then calculating the overall average Euclidean distance d_e and standard deviation σ_a:
d_e = (1/n_a) Σ d_ci,  σ_a = √((1/n_a) Σ (d_ci − d_e)²);
s314, setting the rejection threshold T_a:
T_a = d_e + μ_a·σ_a, where μ_a is an adjustment parameter;
s315, eliminating from the point cloud the points whose average distance satisfies d_ci > T_a.
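The statistical rejection of steps S311-S315 can be sketched as follows; a k-nearest-neighbour query stands in for the fixed search radius r_a, and the parameter names are illustrative only:

```python
import numpy as np

def remove_outliers(points, k=4, mu_a=1.0):
    """Statistical outlier removal in the spirit of steps S311-S315.

    For each point, compute the mean Euclidean distance d_ci to its k
    nearest neighbours; remove points with d_ci > T_a = d_e + mu_a * sigma_a,
    where d_e and sigma_a are the mean and standard deviation of all d_ci.
    """
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))            # pairwise distances d_aj
    dist_sorted = np.sort(dist, axis=1)[:, 1:k+1]  # k nearest, excluding self
    d_ci = dist_sorted.mean(axis=1)                # mean neighbour distance
    d_e, sigma_a = d_ci.mean(), d_ci.std()
    T_a = d_e + mu_a * sigma_a                     # rejection threshold
    return points[d_ci <= T_a]
```

A small μ_a rejects more aggressively; in practice the brute-force distance matrix would be replaced by a k-d tree query for large clouds.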
Further, in the step S3, the specific steps of smoothing the single view point cloud data after the large-scale outlier is removed are as follows:
s321, inputting target point cloud coordinates to search neighbor points of the target point cloud coordinates;
s322, constructing a fitting surface constraint equation set based on the searched neighboring point coordinates;
s323, solving the constraint equation set to obtain a best fit curved surface at the cloud coordinates of the target point, and replacing points in the point cloud data with points on the curved surface;
s324, repeating the process until the whole point cloud is smoothed point by point to obtain a smooth curved surface.
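The smoothing loop of steps S321-S324 might look like the following sketch; the patent does not give its fitting-surface constraint equations, so a degree-1 (plane) least-squares fit is assumed here:

```python
import numpy as np

def smooth_mls(points, radius):
    """Moving-least-squares style smoothing, sketching steps S321-S324.

    Each point is replaced by its projection onto the least-squares plane
    fitted to its neighbours within `radius` (a plane is the simplest
    fitting surface; the patent's actual surface model is not specified).
    """
    smoothed = points.copy()
    for i, p in enumerate(points):
        nbrs = points[np.linalg.norm(points - p, axis=1) <= radius]
        if len(nbrs) < 3:
            continue                        # too few points to fit a plane
        c = nbrs.mean(axis=0)
        _, _, vt = np.linalg.svd(nbrs - c)  # smallest singular direction
        normal = vt[-1]                     # = least-squares plane normal
        smoothed[i] = p - np.dot(p - c, normal) * normal
    return smoothed
```

Each point is processed against the *original* cloud, so the pass is order-independent, matching the point-by-point repetition described in step S324.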
S4: dividing the source workpiece point cloud data and the processed target workpiece point cloud data by adopting a DBSCAN clustering algorithm based on density;
in this embodiment, it should be specifically noted that, in the step S4, the specific steps of performing the segmentation processing on the point cloud data by using the density-based DBSCAN clustering algorithm are as follows:
s41, establishing the source workpiece point cloud data set Q = {q_1, q_2, …, q_n} and the target workpiece point cloud data set P_b = {p_b1, p_b2, …, p_bn};
S42, for point cloud data Q and P needing to be processed b Giving a parameter MinPts and a parameter Eps;
s43, establishing a k-d tree data structure of the point cloud data;
s44, randomly selecting a point q_i from the source point cloud data set Q and a point p_bi from the target point cloud data set P_b, and judging through the parameters MinPts and Eps whether the point q_i and the point p_bi are core points;
s45, if the point q_i and the point p_bi are core points, finding the sets of points reachable from them according to the density-reachability requirement, denoted Q_1 and P_b1 respectively;
s46, repeating step S44 and step S45 until all the points in the given data sets have been visited, obtaining the clustered point sets Q_1, Q_2, …, Q_n and P_b1, P_b2, …, P_bn.
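A minimal, brute-force version of the DBSCAN procedure in steps S41-S46 (omitting the k-d tree acceleration of step S43) could read:

```python
import numpy as np

def dbscan(points, eps, min_pts):
    """Density-based DBSCAN clustering, as used in steps S41-S46.

    Returns one label per point: 0, 1, ... for clusters, -1 for noise.
    Region queries are brute force; a k-d tree (step S43) would speed
    them up for large clouds.
    """
    n = len(points)
    labels = np.full(n, -1)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    neighbours = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    visited = np.zeros(n, bool)
    cluster = 0
    for i in range(n):
        if visited[i] or len(neighbours[i]) < min_pts:
            continue                            # not an unvisited core point
        # grow a new cluster from core point i by density reachability
        stack, visited[i] = [i], True
        while stack:
            j = stack.pop()
            labels[j] = cluster
            if len(neighbours[j]) >= min_pts:   # j is a core point: expand
                for k in neighbours[j]:
                    if not visited[k]:
                        visited[k] = True
                        stack.append(k)
        cluster += 1
    return labels
```

Border points (inside a core point's eps-ball but not core themselves) are labelled with the reaching cluster but not expanded, and points never reached stay noise.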
S5: introducing an angle threshold value to the segmented source workpiece point cloud data and target workpiece point cloud data to extract contour points, and taking the contour points extracted from the source workpiece point cloud data and the target workpiece point cloud data as characteristic points for matching the two pieces of point cloud data;
further, the specific steps of feature point extraction in step S5 are as follows:
s51, projecting the target point p_b and its n_c neighboring points within the search radius onto the least-squares plane constructed from those points; the nearest neighbor of the target point is denoted p_t1, the vector p_b p_t1 is taken as the reference vector, and each remaining neighboring point p_tj gives a vector p_b p_tj;
s52, calculating the angle α_j between the vectors p_b p_t1 and p_b p_tj, α_j ∈ [0, π], together with their cross product vector n_j = p_b p_t1 × p_b p_tj, the first of which is taken as the reference direction; when n_j points to the same side as the reference direction, α_j is kept unchanged, and when it points to the opposite side, α_j is replaced by 2π − α_j;
s53, arranging the vector angle sequence A = {α_1, α_2, …, α_nc} in ascending order and adding the two extreme angles to obtain the new sequence A = {0, α_1, α_2, …, α_nc, 2π}; the angle θ_i between each pair of adjacent line segments is calculated as θ_i = α_(i+1) − α_i;
s54, judging whether the target point is a boundary contour point according to whether any θ_i exceeds the set threshold, and marking the contour points so determined as feature points.
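The angle-gap boundary test of steps S51-S54 can be sketched as below; computing the full [0, 2π) angle of each projected neighbour with `arctan2` replaces the reference-vector-plus-cross-product sign rule, which is an equivalent simplification:

```python
import numpy as np

def is_boundary_point(p_b, neighbours, theta_max=np.pi / 2):
    """Angle-gap boundary test sketching steps S51-S54.

    Projects the neighbours of p_b onto their least-squares plane, measures
    the angle of each projected neighbour around p_b, sorts the angles, and
    flags p_b as a boundary contour point when the largest gap between
    consecutive angles exceeds the threshold theta_max.
    """
    c = neighbours.mean(axis=0)
    _, _, vt = np.linalg.svd(neighbours - c)
    normal = vt[-1]                       # least-squares plane normal
    u = vt[0]                             # in-plane orthonormal basis (u, v)
    v = np.cross(normal, u)
    d = neighbours - p_b
    d = d - np.outer(d @ normal, normal)  # project vectors onto the plane
    alpha = np.sort(np.arctan2(d @ v, d @ u) % (2 * np.pi))
    # circular gaps between consecutive angles, including the wrap-around
    gaps = np.diff(np.concatenate([alpha, [alpha[0] + 2 * np.pi]]))
    return gaps.max() > theta_max
```

An interior point has neighbours all around it (small gaps), while a boundary point has an empty angular sector wider than θ_max.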
S6: performing corresponding matching based on the characteristic points extracted from the source workpiece point cloud data and the target workpiece point cloud data;
s7: recording the number of point clouds and the point cloud space volume before and after removal of the large-scale outlier point cloud data and smoothing, the source workpiece point cloud space volume and number of point clouds, and the point cloud data cleaning time; recording the number of feature points and the number of point clouds extracted from the target workpiece point cloud data set; and recording the number of correctly matched feature points;
s8: calculating a point cloud data fitting degree coefficient, a point cloud data correction efficiency index, an average extraction rate of characteristic points and characteristic point matching accuracy based on the recorded data;
further, the specific data processing procedure in step S8 is as follows:
s81, from the recorded number of point clouds m_a and point cloud space volume V_a before removal of the large-scale outlier point cloud data and smoothing, together with the source workpiece point cloud distribution space volume V_c and number of point clouds m_c, calculating the point cloud data fitting degree ε_a; from the recorded number of point clouds m_b and space volume V_b after removal of the large-scale outlier point cloud data and smoothing, together with V_c and m_c, calculating the point cloud data fitting degree ε_b;
s82, from the point cloud data distribution fitting degrees before and after removal of the large-scale outlier point cloud data and smoothing, together with the point cloud data cleaning time t_a, calculating the point cloud data impurity removal efficiency index y_a, where c_a is an empirical index, c_a > 0;
s83, from the number of feature points f_a and the number of point clouds f_c of a target workpiece point cloud data set, calculating the feature point extraction rate γ_a = f_a / f_c;
s84, summarizing the feature point extraction rates γ_ai of the different target workpiece point cloud data sets and calculating the feature point average extraction rate γ_e as their mean, γ_e = (1/n) Σ γ_ai;
s85, from the number of correctly matched feature points f_b and the number of feature points f_d extracted from the source workpiece point cloud data, calculating the feature point matching accuracy z_a, where c_1, c_2 and c_3 are adjustment coefficients, c_1 > 0, c_2 > 0, c_3 > 0.
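The extraction-rate quantities of steps S83-S84 can be computed directly; the formula images of the original filing are not reproduced here, so the forms γ_a = f_a / f_c and the arithmetic mean for γ_e are assumptions:

```python
def extraction_rates(feature_counts, point_counts):
    """Per-set feature point extraction rates and their average (S83-S84).

    Assumes gamma_a = f_a / f_c for each point cloud data set and that the
    average extraction rate gamma_e is the arithmetic mean of the gamma_ai;
    the patent's own formulas are not available in this text.
    """
    gammas = [f_a / f_c for f_a, f_c in zip(feature_counts, point_counts)]
    gamma_e = sum(gammas) / len(gammas)
    return gammas, gamma_e
```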
S9: and calculating a feature extraction accuracy index based on the point cloud data correction efficiency index, the feature point average extraction rate and the feature point matching accuracy.
Further, in step S9, the feature extraction accuracy index U_a is calculated as a weighted combination of the point cloud data correction efficiency index, the feature point average extraction rate and the feature point matching accuracy, where l_1, l_2 and l_3 are the proportional coefficients of the corresponding influencing factors, l_1 > 0, l_2 > 0, l_3 > 0.
The embodiment of fig. 2 provides an execution system for the algorithm for extracting three-dimensional features based on 3D point cloud data. The system comprises an original workpiece scanning module, a source workpiece point cloud data acquisition module, a point cloud space coordinate system construction module, a point cloud data cleaning module, a point cloud data segmentation module, a three-dimensional feature point extraction module, a feature point matching module, a data processing module, a feature extraction precision index calculation module and a database. The original workpiece scanning module, the point cloud space coordinate system construction module, the point cloud data cleaning module, the point cloud data segmentation module, the three-dimensional feature point extraction module and the feature point matching module are connected in sequence. The source workpiece point cloud data acquisition module is connected with the point cloud data segmentation module and the three-dimensional feature point extraction module. The source workpiece point cloud data acquisition module, the point cloud space coordinate system construction module, the point cloud data cleaning module, the three-dimensional feature point extraction module and the feature point matching module are all connected with the data processing module, and the data processing module is connected with the feature extraction precision index calculation module.
And the original workpiece scanning module scans the target workpiece by using a laser scanning instrument to obtain original workpiece point cloud distribution data, the source workpiece point cloud space volume and the point cloud quantity.
And the source workpiece point cloud data acquisition module acquires source workpiece point cloud data corresponding to the target workpiece in a networking manner.
The point cloud space coordinate system construction module calculates the principal component direction of the point cloud data based on principal component analysis principle, establishes a point cloud space coordinate system, and records the point cloud quantity and the space volume.
The point cloud data cleaning module cleans point cloud data of a target workpiece, removes large-scale outlier point cloud data, and then performs smoothing treatment on single-view point cloud data of the target workpiece by using moving least square to obtain a multi-view smooth curved surface of the target workpiece, and automatically records the number of point clouds, the space volume and the cleaning time of the point cloud data after cleaning.
And the point cloud data segmentation module is used for carrying out segmentation processing on the source workpiece point cloud data and the processed target workpiece point cloud data by adopting a DBSCAN clustering algorithm based on density.
The three-dimensional feature point extraction module extracts contour points of the segmented source workpiece point cloud data and the segmented target workpiece point cloud data according to an angle threshold, and automatically records the number of feature points extracted from different source workpiece point cloud data sets and different target workpiece point cloud data sets and the number of corresponding point clouds.
And the characteristic point matching module correspondingly matches the characteristic points extracted based on the source workpiece point cloud data and the target workpiece point cloud data.
The data processing module calculates a point cloud data fitting degree coefficient, a point cloud data correction efficiency index, an average extraction rate of characteristic points and characteristic point matching accuracy based on the recorded data.
The feature extraction precision index calculation module calculates a feature extraction precision index based on the point cloud data correction efficiency index, the average feature point extraction rate and the feature point matching accuracy.
The database is used for storing all module data in the system.
Finally, it should be noted that the foregoing description covers only preferred embodiments of the invention and is not intended to limit it; any modifications, equivalents and alternatives falling within the spirit and principles of the invention are intended to be included within its scope.

Claims (7)

1. An algorithm for extracting three-dimensional features based on 3D point cloud data, characterized in that it comprises the following steps:
S1: scanning the target workpiece with a laser scanning instrument to obtain original workpiece point cloud data;
S2: calculating the principal component directions of the point cloud data based on the principal component analysis principle and establishing a point cloud space coordinate system;
S3: inputting each point in the target workpiece point cloud data, performing neighborhood statistics on each point, removing large-scale outlier point cloud data after calculating the average Euclidean distance from each point to its neighborhood points, and smoothing the target workpiece single-view point cloud data by moving least squares after the large-scale outlier removal to obtain a multi-view smooth curved surface of the target workpiece;
S4: segmenting the source workpiece point cloud data and the processed target workpiece point cloud data with the density-based DBSCAN clustering algorithm;
S5: introducing an angle threshold to extract contour points from the segmented source workpiece and target workpiece point cloud data, and taking the extracted contour points as the feature points for matching the two point clouds;
S6: performing correspondence matching based on the feature points extracted from the source workpiece and target workpiece point cloud data;
S7: recording the number of points and the point cloud space volume before and after removing the large-scale outlier point cloud data and smoothing, the source workpiece point cloud count and space volume, and the point cloud data cleaning time; recording the number of feature points and the number of points in the source workpiece and target workpiece point cloud data sets; and recording the number of correctly matched feature points;
S8: calculating the point cloud data fitting degree coefficient, the point cloud data correction efficiency index, the average feature point extraction rate and the feature point matching accuracy from the recorded data;
S9: calculating a feature extraction accuracy index from the point cloud data correction efficiency index, the average feature point extraction rate and the feature point matching accuracy.
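Step S4 relies on the standard density-based DBSCAN clustering algorithm to segment the point cloud. The patent does not give parameter values, so the `eps`/`min_pts` settings below are illustrative; this is a minimal self-contained sketch, not the claimed implementation:

```python
import numpy as np

def dbscan(points, eps=1.0, min_pts=3):
    """Minimal DBSCAN (step S4): density-based clustering that segments
    the point cloud; unreachable points keep the noise label -1.
    A sketch only -- O(n^2) distance matrix, not tuned for large clouds."""
    n = len(points)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        # skip points already labeled, and non-core points
        if labels[i] != -1 or (dist[i] <= eps).sum() < min_pts:
            continue
        labels[i] = cluster
        stack = [i]
        while stack:                        # grow the cluster from core points
            j = stack.pop()
            for k in np.where(dist[j] <= eps)[0]:
                if labels[k] == -1:
                    labels[k] = cluster
                    if (dist[k] <= eps).sum() >= min_pts:
                        stack.append(k)
        cluster += 1
    return labels

# Two well-separated triplets are recovered as two clusters.
cloud = np.array([[0., 0], [0.2, 0], [0, 0.2], [5, 5], [5.2, 5], [5, 5.2]])
labels = dbscan(cloud, eps=0.5, min_pts=3)
print(labels)  # [0 0 0 1 1 1]
```

In practice a k-d tree replaces the full distance matrix, but the clustering logic is the same.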
2. The algorithm for extracting stereo features based on 3D point cloud data according to claim 1, wherein the specific steps for constructing the point cloud coordinate system in step S2 are as follows:
S21: counting the number n_a of three-dimensional points in the point cloud set; for a point X = (x, y, z)^T ∈ R^3, the number of points contained in its neighborhood is denoted n_b, n_b ≤ n_a; the points in the point cloud set are denoted p_ai(x_ai, y_ai, z_ai), and the centroid of the point cloud is denoted o(x_o, y_o, z_o), where: x_o = (1/n_a)·Σ_{i=1}^{n_a} x_ai, y_o = (1/n_a)·Σ_{i=1}^{n_a} y_ai, z_o = (1/n_a)·Σ_{i=1}^{n_a} z_ai;
S22: calculating the covariances of the X, Y and Z coordinate values of the points in the point cloud neighborhood, denoted cov(X,X), cov(X,Y), cov(X,Z), cov(Y,X), cov(Y,Y), cov(Y,Z), cov(Z,X), cov(Z,Y) and cov(Z,Z), e.g. cov(X,Y) = (1/n_b)·Σ_{i=1}^{n_b}(x_ai − X_a)(y_ai − Y_a), where X_a, Y_a and Z_a are respectively the means of the X, Y and Z coordinate values of the points in the point cloud neighborhood;
S23: assembling the obtained covariances into the covariance matrix D ∈ R^{3×3}: D = [cov(X,X) cov(X,Y) cov(X,Z); cov(Y,X) cov(Y,Y) cov(Y,Z); cov(Z,X) cov(Z,Y) cov(Z,Z)]; since the covariance matrix is symmetric positive definite, it follows from the properties of symmetric positive definite matrices that it has three eigenvalues λ_1 ≥ λ_2 ≥ λ_3 > 0, with corresponding eigenvectors denoted V_1, V_2, V_3;
S24: taking the centroid coordinate o as the origin of the target workpiece point cloud space coordinate system and the eigenvectors V_1, V_2, V_3 as its direction vectors, thereby determining the point cloud space coordinate system of the target workpiece; coordinate points in this system are denoted p_bi(x_bi, y_bi, z_bi).
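The centroid-and-covariance construction of steps S21–S24 can be sketched with NumPy; the function name and the descending eigenvalue ordering are assumptions, not part of the claim:

```python
import numpy as np

def build_point_cloud_frame(points):
    """Point cloud coordinate system per steps S21-S24 (a sketch):
    centroid as origin, covariance eigenvectors as axis directions."""
    o = points.mean(axis=0)                      # centroid o (S21)
    centered = points - o
    D = centered.T @ centered / len(points)      # 3x3 covariance matrix (S22-S23)
    eigvals, eigvecs = np.linalg.eigh(D)         # symmetric => real eigenpairs
    order = np.argsort(eigvals)[::-1]            # sort descending (assumed order)
    return o, eigvals[order], eigvecs[:, order]  # origin, lambda_1..3, V_1..V_3

# Points spread mainly along x: the first axis aligns with x.
pts = np.array([[0., 0, 0], [1, 0, 0], [2, 0, 0], [3, 0.1, 0], [4, -0.1, 0]])
origin, lams, V = build_point_cloud_frame(pts)
print(origin)                          # centroid of the cloud
print(lams[0] >= lams[1] >= lams[2])   # True
```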
3. The algorithm for extracting stereo features based on 3D point cloud data according to claim 1, wherein the specific steps for eliminating the point cloud data with abnormal distance distribution in step S3 are as follows:
S311: inputting a target point cloud coordinate p_bi and searching for its neighboring points, the search radius being denoted r_a;
S312: calculating the Euclidean distance d_aj between the target point cloud coordinate p_bi and every other point cloud coordinate p_bj within the search radius: d_aj = sqrt((x_bi − x_bj)² + (y_bi − y_bj)² + (z_bi − z_bj)²);
S313: after collecting the Euclidean distances d_ci between each target point cloud coordinate and its nearest neighboring points within the search radius, calculating the average Euclidean distance d_e and the standard deviation σ_a: d_e = (1/n_b)·Σ_{i=1}^{n_b} d_ci, σ_a = sqrt((1/n_b)·Σ_{i=1}^{n_b}(d_ci − d_e)²);
S314: setting the rejection threshold T_a: T_a = d_e + μ_a·σ_a, where μ_a is an adjustment parameter;
S315: eliminating the points in the point cloud whose average distance satisfies d_ci > T_a.
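Steps S311–S315 amount to statistical outlier removal: compare each point's mean distance to its radius neighbors against T_a = d_e + μ_a·σ_a. A brute-force NumPy sketch (the radius and μ_a values are illustrative):

```python
import numpy as np

def remove_outliers(points, r_a=3.0, mu_a=3.0):
    """Statistical outlier removal in the spirit of steps S311-S315:
    each point's mean Euclidean distance to neighbors within radius r_a
    is compared against the threshold T_a = d_e + mu_a * sigma_a."""
    diffs = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(-1))              # pairwise distances (S312)
    d_ci = np.empty(len(points))
    for i in range(len(points)):
        nbr = dist[i][(dist[i] > 0) & (dist[i] <= r_a)]
        d_ci[i] = nbr.mean() if len(nbr) else np.inf  # isolated points rejected
    finite = d_ci[np.isfinite(d_ci)]
    d_e, sigma_a = finite.mean(), finite.std()        # mean and std (S313)
    T_a = d_e + mu_a * sigma_a                        # rejection threshold (S314)
    return points[d_ci <= T_a]                        # eliminate d_ci > T_a (S315)

# A tight square of points plus one far outlier: the outlier is dropped.
cloud = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [20., 20, 20]])
print(len(remove_outliers(cloud)))  # 4
```

A k-d tree neighbor search (as in PCL's statistical outlier removal) replaces the O(n²) distance matrix for real clouds.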
4. The algorithm for extracting stereo features based on 3D point cloud data according to claim 1, wherein the specific steps of smoothing the single-view point cloud data after the large-scale outliers are removed in step S3 are as follows:
S321: inputting a target point cloud coordinate and searching for its neighboring points;
S322: constructing a fitting-surface constraint equation set based on the coordinates of the found neighboring points;
S323: solving the constraint equation set to obtain the best-fit curved surface at the target point cloud coordinate, and replacing the points in the point cloud data with points on that surface;
S324: repeating the process point by point until the whole point cloud has been smoothed, obtaining a smooth curved surface.
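Steps S321–S324 describe moving-least-squares smoothing. The sketch below simplifies the fitted surface to a local least-squares plane z = ax + by + c (the patent fits a constrained surface; the plane keeps the example short, and the radius value is illustrative):

```python
import numpy as np

def mls_smooth(points, radius=1.5):
    """Simplified MLS-style smoothing per steps S321-S324: for each point,
    fit a least-squares plane to its neighborhood and project the point
    onto it (a sketch of the fuller constrained-surface fit)."""
    smoothed = points.copy()
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        nbr = points[d <= radius]                             # S321: neighbors
        A = np.c_[nbr[:, 0], nbr[:, 1], np.ones(len(nbr))]
        coef, *_ = np.linalg.lstsq(A, nbr[:, 2], rcond=None)  # S322-S323: solve
        smoothed[i, 2] = coef @ [p[0], p[1], 1.0]             # point on surface
    return smoothed  # S324: applied point by point over the whole cloud

# Noisy samples of the plane z = 0 are pulled toward it.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 3.0, size=(60, 2))
noisy = np.c_[xy, rng.normal(0.0, 0.05, 60)]
out = mls_smooth(noisy)
print(abs(out[:, 2]).mean() < abs(noisy[:, 2]).mean())
```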
5. The algorithm for extracting stereo features based on 3D point cloud data according to claim 1, wherein the specific steps of feature point extraction in step S5 are as follows:
S51: projecting the target point p_b and the points within its search radius onto the constructed least squares plane, the number of projected neighborhood points being n_c; denoting the nearest neighborhood point of the target point as p_ti and taking p_b p_ti as the reference vector, and forming the vector p_b p_tj for every other point p_tj in the neighborhood;
S52: calculating the angle α between the vectors p_b p_ti and p_b p_tj, α ∈ [0, π], together with their cross product vector, and taking the first cross product as the reference vector; when a cross product lies on the same side as the reference vector (their dot product is non-negative), α_j is kept unchanged; otherwise α_j = 2π − α_j;
S53: sorting the vector angle sequence A = {α_1, α_2, ……, α_nc} in increasing order and adding the two extreme angles to obtain the new angle sequence A = {0, α_1, α_2, ……, α_nc, 2π}; the angle θ_i between each pair of adjacent line segments is calculated as θ_i = α_{i+1} − α_i;
S54: judging whether θ_i exceeds the set threshold to decide whether the target point is a boundary contour point, and marking the identified contour points as feature points.
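The angle-gap test of steps S51–S54 can be sketched in 2D, assuming the neighborhood has already been projected onto the least-squares plane (the threshold value and function name are illustrative):

```python
import numpy as np

def is_boundary_point(p_b, neighbors, theta_max=np.pi / 2):
    """Angle-gap boundary test per steps S51-S54 (planar sketch): sort the
    angles of the vectors from p_b to its neighbors; a gap between adjacent
    angles larger than theta_max marks p_b as a boundary contour point."""
    vecs = neighbors - p_b
    angles = np.sort(np.arctan2(vecs[:, 1], vecs[:, 0]) % (2 * np.pi))
    gaps = np.diff(np.r_[angles, angles[0] + 2 * np.pi])  # include wrap-around
    return gaps.max() > theta_max

# Interior point: neighbors all around it -> not a boundary point.
ring = np.array([[np.cos(t), np.sin(t)]
                 for t in np.linspace(0, 2 * np.pi, 8, endpoint=False)])
print(is_boundary_point(np.zeros(2), ring))   # False
# Edge point: neighbors only on one side -> boundary point.
half = ring[ring[:, 0] > 0.5]
print(is_boundary_point(np.zeros(2), half))   # True
```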
6. The algorithm for extracting stereo features based on 3D point cloud data according to claim 1, wherein the specific data processing procedure in step S8 is as follows:
S81: from the recorded number of points m_a and point cloud space volume V_a before removing the large-scale outlier point cloud data and smoothing, together with the source workpiece point cloud distribution space volume V_c and number of points m_c, calculating the point cloud data fitting degree ε_a as the ratio of the point cloud densities: ε_a = (m_a/V_a)/(m_c/V_c); likewise, from the recorded number of points m_b and space volume V_b after removing the large-scale outlier point cloud data and smoothing, together with V_c and m_c, calculating the point cloud data fitting degree ε_b = (m_b/V_b)/(m_c/V_c);
S82: from the point cloud data distribution fitting degrees before and after removing the large-scale outlier point cloud data and smoothing, and the point cloud data cleaning time t_a, calculating the point cloud data impurity removal efficiency index y_a = c_a·(ε_b − ε_a)/t_a, where c_a is an empirical index, c_a > 0;
S83: from the number f_a of feature points extracted from a target point cloud data set and its number of points f_c, calculating the feature point extraction rate γ_a = f_a/f_c;
S84: summarizing the feature point extraction rates γ_ai of the different point cloud data sets of the target workpiece and calculating the average feature point extraction rate γ_e = (1/n)·Σ_{i=1}^{n} γ_ai, where n is the number of data sets;
S85: from the number f_b of correctly matched feature points and the number f_d of feature points extracted from the source workpiece point cloud data, calculating the feature point matching accuracy z_a with the adjustment coefficients c_1, c_2, c_3, where c_1 > 0, c_2 > 0, c_3 > 0.
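The bookkeeping of S83–S84 reduces to simple ratios; a minimal sketch follows. The density-ratio form of the fitting degree in S81 is an assumption on my part, since the original formula is rendered only as an image in the source:

```python
def extraction_rate(f_a, f_c):
    """Feature point extraction rate of one data set (S83): gamma_a = f_a / f_c."""
    return f_a / f_c

def average_extraction_rate(rates):
    """Average feature point extraction rate over all data sets (S84)."""
    return sum(rates) / len(rates)

def fitting_degree(m, V, m_c, V_c):
    """Point cloud fitting degree (S81) sketched as a density ratio --
    an assumed form, not the patent's reproduced formula."""
    return (m / V) / (m_c / V_c)

rates = [extraction_rate(120, 3000), extraction_rate(90, 2500)]
print(round(average_extraction_rate(rates), 3))  # 0.038
```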
7. The algorithm for extracting stereo features based on 3D point cloud data according to claim 1, wherein in step S9 the feature extraction accuracy index U_a is calculated from the point cloud data correction efficiency index, the average feature point extraction rate and the feature point matching accuracy as their weighted combination U_a = l_1·y_a + l_2·γ_e + l_3·z_a, where l_1, l_2, l_3 are the proportional coefficients of the corresponding influencing factors, l_1 > 0, l_2 > 0, l_3 > 0.
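A sketch of the combined index in claim 7. The weighted-sum form and the example weights are assumptions, since the claim only requires l_1, l_2, l_3 > 0 and the exact expression appears only as an image in the source:

```python
def feature_extraction_accuracy_index(y_a, gamma_e, z_a, l1=0.3, l2=0.3, l3=0.4):
    """Feature extraction accuracy index U_a (claim 7), sketched as a
    weighted sum of the efficiency index, average extraction rate and
    matching accuracy -- the form and weights are assumed."""
    return l1 * y_a + l2 * gamma_e + l3 * z_a

print(round(feature_extraction_accuracy_index(0.8, 0.038, 0.95), 4))  # 0.6314
```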
CN202311791458.9A 2023-12-25 2023-12-25 Algorithm for extracting stereo feature based on 3D point cloud data Pending CN117746055A (en)

Publications (1)

Publication Number Publication Date
CN117746055A true CN117746055A (en) 2024-03-22



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination