CN116883443A - Edge extraction method and device based on point cloud - Google Patents

Edge extraction method and device based on point cloud

Info

Publication number
CN116883443A
CN116883443A
Authority
CN
China
Prior art keywords
points
point cloud
feature
point
difference
Prior art date
Legal status
Pending
Application number
CN202310923324.1A
Other languages
Chinese (zh)
Inventor
Zhou Haoyuan (周浩源)
Current Assignee
Shenzhen Lingyun Shixun Technology Co ltd
Original Assignee
Shenzhen Lingyun Shixun Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Lingyun Shixun Technology Co ltd filed Critical Shenzhen Lingyun Shixun Technology Co ltd
Priority to CN202310923324.1A
Publication of CN116883443A
Legal status: Pending

Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis › G06T 7/10 Segmentation; Edge detection › G06T 7/13 Edge detection
    • G06T 7/00 Image analysis › G06T 7/10 Segmentation; Edge detection › G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/10 Image acquisition modality › G06T 2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

Some embodiments of the application provide a point cloud based edge extraction method and device. The method acquires a target point cloud containing feature points with different coordinates, and detects the neighborhood information of each feature point through a voxel neighbor search algorithm. A difference feature of each feature point is then obtained according to its neighborhood information; the difference feature comprises the displacement of the feature point after smoothing, and/or the maximum included angle between the adjacent projection vectors of the feature point. A discrimination threshold is calculated according to the difference features, and feature points whose difference feature is greater than the discrimination threshold are marked as edge points. Because the neighborhood information of each feature point is acquired through the voxel neighbor search algorithm, the efficiency of acquiring neighborhood information is improved; and because edge points in the point cloud are judged directly from the original point cloud data, either through smoothing displacement or through the included angles of adjacent projection vectors, the accuracy and overall effect of edge extraction are improved.

Description

Edge extraction method and device based on point cloud
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a point cloud-based edge extraction method and device.
Background
The 3D point cloud edge extraction technology is a basic point cloud processing algorithm that can be applied to scenes such as object contour extraction, object positioning and object measurement. A point cloud edge consists of the edge measurement points that express the features of an object, and is used to represent the object's geometric characteristics.
In order to accurately extract edge points from a large number of three-dimensional points, the edge points of a point cloud can be judged through a depth map: a depth map of the point cloud is generated based on the distance between each point and the camera plane, the height difference of each point is calculated according to the depth map, and points whose height difference is greater than a certain threshold are extracted as edge points. However, a certain loss is produced in the process of generating the depth map, which affects the accuracy of the depth map data and in turn reduces the accuracy of edge extraction.
Disclosure of Invention
The application provides a point cloud-based edge extraction method and device, which are used for solving the problem of low edge extraction accuracy.
In a first aspect, some embodiments of the present application provide a method for extracting an edge based on a point cloud, including:
acquiring a target point cloud, wherein the target point cloud comprises characteristic points with different coordinates;
detecting neighborhood information of the feature points through a voxel neighbor searching algorithm;
obtaining difference features of the feature points according to the neighborhood information, wherein the difference features comprise displacement of the feature points after smoothing treatment and/or the maximum included angle of adjacent projection vectors in the feature points;
calculating a discrimination threshold according to the difference characteristics;
and marking the feature points with the difference features larger than the discrimination threshold as edge points.
In some embodiments of the present application, detecting neighborhood information of the feature point by a voxel neighbor search algorithm includes: establishing a voxel grid based on the target point cloud, and recording index information of the voxel grid; calculating grid indexes of the feature points according to the index information; querying a neighborhood grid of the feature points through the grid index; and acquiring the characteristic point information of the neighborhood grid.
In some embodiments of the present application, obtaining the difference feature of the feature point according to the neighborhood information includes: performing smoothing processing on the target point cloud according to the neighborhood information to generate a smoothed point cloud; detecting first coordinates of target feature points in the smooth point cloud and second coordinates of target feature points in the target point cloud; and calculating a coordinate difference value between the first coordinate and the second coordinate.
In some embodiments of the present application, smoothing is performed on the target point cloud based on a bilateral filtering algorithm.
In some embodiments of the present application, obtaining the difference feature of the feature point according to the neighborhood information includes:
performing plane fitting on the target point cloud according to the neighborhood information, and removing the outliers from the neighborhood feature points through a preset distance threshold to generate a local fitting plane; projecting the feature points onto the local fitting plane to generate projection vectors of the feature points; calculating the included angle between every two adjacent projection vectors; and recording the maximum of these included angles.
In some embodiments of the present application, removing outliers from the neighborhood feature points through a preset distance threshold includes: querying the neighborhood feature points of the feature point according to the neighborhood information; detecting the target distance between each neighborhood feature point and the feature point; if the target distance is greater than the distance threshold, marking the neighborhood feature point as an outlier and eliminating it; and if the target distance is less than or equal to the distance threshold, marking the neighborhood feature point as an inlier, and fitting the local fitting plane according to the inliers.
In some embodiments of the present application, calculating a discrimination threshold according to the difference feature includes: acquiring the average value and standard deviation of the difference characteristics; and calculating a discrimination threshold according to the average value and the standard deviation.
In some embodiments of the present application, calculating the discrimination threshold according to the average value and the standard deviation includes: acquiring a threshold factor of the discrimination threshold; calculating the product of the threshold factor and the standard deviation; and calculating the sum of the product and the average value as the discrimination threshold.
In some embodiments of the present application, marking feature points whose difference feature is greater than the discrimination threshold as edge points includes: traversing the feature points of the target point cloud to obtain the difference feature of each feature point; if the difference feature is greater than the discrimination threshold, marking the feature point as an edge point; and if the difference feature is less than or equal to the discrimination threshold, marking the feature point as a non-edge point.
In a second aspect, some embodiments of the present application further provide an edge extraction device based on a point cloud, including a search module, a data processing module, and an edge point extraction module, where:
the search module is configured to acquire a target point cloud, wherein the target point cloud comprises characteristic points with different coordinates; detecting neighborhood information of the feature points through a voxel neighbor searching algorithm;
the data processing module is configured to acquire difference features of the feature points according to the neighborhood information, wherein the difference features comprise displacement of the feature points after the feature points are subjected to smoothing processing and/or the maximum included angle of adjacent projection vectors in the feature points; calculating a discrimination threshold according to the difference characteristics;
the edge point extraction module is configured to mark feature points for which the difference feature is greater than the discrimination threshold as edge points.
According to the above technical scheme, the point cloud based edge extraction method and device provided by some embodiments of the application can acquire a target point cloud containing feature points with different coordinates, and detect the neighborhood information of each feature point through a voxel neighbor search algorithm. A difference feature of each feature point is obtained according to its neighborhood information; the difference feature comprises the displacement of the feature point after smoothing, and/or the maximum included angle between the adjacent projection vectors of the feature point. A discrimination threshold is calculated according to the difference features, and feature points whose difference feature is greater than the discrimination threshold are marked as edge points. The method acquires the neighborhood information of each feature point through the voxel neighbor search algorithm, which improves the efficiency of acquiring neighborhood information; and it judges the edge points in the point cloud through the two modes of smoothing displacement and adjacent projection included angle, working directly on the original point cloud data, which improves both the accuracy and the overall effect of edge extraction.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of an edge extraction method based on point cloud according to some embodiments of the present application;
FIG. 2 is an exemplary diagram of a voxel grid and a neighborhood grid for voxel neighbor searching provided by some embodiments of the present application;
FIG. 3 is a flowchart of a method for obtaining a smoothed displacement according to some embodiments of the present application;
FIG. 4a is a schematic illustration of the effect of the inner folded edge points provided by some embodiments of the present application;
FIG. 4b is a schematic illustration of the effect of high curvature edge points provided by some embodiments of the present application;
FIG. 4c is a schematic illustration of the effect of outer contour boundary points according to some embodiments of the present application;
FIG. 5 is an exemplary diagram of the maximum included angle of adjacent projection vectors provided by some embodiments of the present application;
fig. 6 is a schematic diagram of an edge extraction method based on point cloud according to some embodiments of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of exemplary embodiments of the present application more apparent, the technical solutions of exemplary embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the exemplary embodiments of the present application, and it is apparent that the described exemplary embodiments are only some embodiments of the present application, not all embodiments.
All other embodiments, which can be made by a person skilled in the art without inventive effort, based on the exemplary embodiments shown in the present application are intended to fall within the scope of the present application. Furthermore, while the present disclosure has been described in terms of an exemplary embodiment or embodiments, it should be understood that each aspect of the disclosure may be separately implemented as a complete solution.
The point cloud refers to a characteristic point set expressing target spatial distribution and target surface characteristics under the same spatial reference system. For example, a point cloud obtained by a laser measurement principle comprises three-dimensional coordinates and laser reflection intensity; the point cloud obtained by the photogrammetry principle comprises three-dimensional coordinates and color information (RGB). Thus, the point cloud can be regarded as a three-dimensional feature point image. The attributes of the point cloud include: spatial resolution, point location accuracy, surface normal vector, etc.
A point cloud edge consists of the edge measurement points in the point cloud that express the features of an object, and is used to represent the object's geometric characteristics. In some embodiments, in order to accurately extract edge points from a large number of three-dimensional points, edge points of a point cloud may be determined through a depth map: a depth map of the point cloud is generated based on the distance from each point to the camera plane, the height difference of each point is calculated according to the depth map, and points whose height difference is greater than a certain threshold are extracted as edge points. However, a certain loss is produced in the process of generating the depth map, which affects the accuracy of the depth map data and results in a decrease in the accuracy of edge extraction.
Thus, to mitigate the loss caused by the depth map, in some embodiments a neighbor search may be performed based on a kd-tree (k-dimensional tree) algorithm to find the neighborhood feature points of each feature point in the point cloud. Projection vectors of the neighborhood feature points are then generated, and the edge points in the point cloud are judged through the included angles of the projection vectors. However, searching for neighborhood feature points based on the kd-tree algorithm consumes a long search time, which reduces the efficiency of edge point extraction and affects the edge extraction effect.
Based on the above application scenario, in order to solve the problem of low accuracy of edge extraction, some embodiments of the present application provide a point cloud based edge extraction method, as shown in fig. 1, including the following program steps:
s1: and acquiring a target point cloud.
The target point cloud comprises feature points with different coordinates; the feature points include edge points and non-edge points, and each feature point carries a corresponding pixel value. For example, in a point cloud obtained by laser measurement, the pixel value of a feature point is the laser reflection intensity of that point; in a point cloud obtained by photogrammetry, the pixel value of a feature point is the color information of that point. The target point cloud may be various types of point cloud data, such as a point cloud with outer contour boundary points or a point cloud with inner folded edge points.
S2: and detecting neighborhood information of the feature points through a voxel neighbor searching algorithm.
After the target point cloud is acquired, the neighborhood information of the feature point is needed to detect the difference between the feature point and the adjacent feature point. Therefore, neighborhood information of each feature point in the target point cloud is obtained through a voxel neighbor search algorithm. The neighborhood information is information of feature points in the neighborhood of the feature points. The time consumed by obtaining the neighborhood information can be reduced through the voxel neighbor search algorithm, and the overall efficiency of edge extraction is further improved.
Thus, in some embodiments, when the neighborhood information of the feature points is detected through the voxel neighbor search algorithm, a voxel grid is established based on the target point cloud and the index information of the voxel grid is recorded. That is, the voxel grid divides the feature points of the target point cloud into different grid cells, and the index information of these cells is recorded. After the index information is recorded, the grid index of each feature point is calculated according to the index information, and the neighborhood grid of the feature point is queried through the grid index. The neighborhood grid may have various sizes, such as a 3×3 neighborhood grid or a 5×5 neighborhood grid. The feature point information of the neighborhood grid is then acquired as the neighborhood information of the current feature point.
For example, as shown in fig. 2, each three-dimensional point in fig. 2 is a feature point of the target point cloud, the solid line portion is the voxel grid, and the broken line portion is the neighborhood grid; here the neighborhood size is 3×3. After the target point cloud is obtained, the length, width and height of the voxel grid cells are set according to the target point cloud, each point of the target point cloud is divided into a voxel grid cell, and the point indexes within each cell are recorded. The neighborhood grid of a feature point is queried according to the point index, and the information of each point in the neighborhood grid is acquired as the neighborhood information of the current feature point.
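As a concrete illustration of steps S1 and S2, the voxel neighbor search can be sketched in Python as below. This is a minimal sketch under assumptions: the function name `voxel_neighbor_search`, the dictionary-based voxel hash, and the fixed 3×3×3 (27-cell) neighborhood are illustrative choices, not the patent's exact implementation.

```python
import numpy as np

def voxel_neighbor_search(points, voxel_size):
    """Bucket points into a voxel hash grid, then gather each point's
    neighborhood as all points lying in the surrounding 3x3x3 voxels."""
    # Integer grid index of each point (the recorded "index information").
    idx = np.floor(points / voxel_size).astype(np.int64)
    grid = {}
    for i, key in enumerate(map(tuple, idx)):
        grid.setdefault(key, []).append(i)
    # Offsets covering the 27-cell neighborhood grid around each voxel.
    offsets = [(dx, dy, dz) for dx in (-1, 0, 1)
                            for dy in (-1, 0, 1)
                            for dz in (-1, 0, 1)]
    neighborhoods = []
    for key in map(tuple, idx):
        neigh = []
        for dx, dy, dz in offsets:
            neigh.extend(grid.get((key[0] + dx, key[1] + dy, key[2] + dz), []))
        neighborhoods.append(neigh)
    return neighborhoods
```

Each entry of the returned list gives the indices of all points (including the query point itself) whose voxels touch the query point's voxel, which is the neighborhood information consumed by the later steps.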
S3: and obtaining the difference characteristics of the characteristic points according to the neighborhood information.
After the neighborhood information of the feature points is detected, the difference feature of each feature point is obtained according to the neighborhood information, so that whether the feature point is an edge point of the point cloud can be judged through the difference feature. The difference feature comprises the displacement of the feature point after smoothing, and/or the maximum included angle between the adjacent projection vectors of the feature point. The embodiments of the application thus provide two ways of obtaining the difference feature, applicable to different edge extraction scenes. When the displacement after smoothing is used as the difference feature, curved boundaries and folded edges with large curvature can be detected; when the maximum included angle of adjacent projection vectors is used as the difference feature, the outer contour boundary and folded edges of the point cloud can be detected.
As shown in fig. 3, when the difference feature is the displacement after smoothing, in some embodiments the target point cloud is smoothed according to the neighborhood information to generate a smoothed point cloud. The smoothing may be performed multiple times to iterate over the point cloud data. A first coordinate of each target feature point in the smoothed point cloud and a second coordinate of the same feature point in the original target point cloud are detected, and the coordinate difference between the first coordinate and the second coordinate is calculated as the displacement after smoothing. Using this displacement as the difference feature is suitable for extracting high curvature edge points or folded edge points. Here, curvature is the rate of rotation of the tangent direction angle at a point on a curve with respect to arc length, and indicates how far the curve deviates from a straight line: the higher the curvature, the greater the degree of bending of the curve.
To improve the accuracy of the smoothing, in some embodiments the target point cloud is smoothed based on a bilateral filtering algorithm. Bilateral filtering is a nonlinear filtering method that combines the spatial proximity and the pixel value similarity of an image in a compromise process. Because bilateral filtering considers both spatial-domain information and grayscale similarity, it can achieve edge-preserving denoising, and has the characteristics of being simple, non-iterative and local.
For example, as shown in figs. 4a and 4b, when the edge points of the target point cloud are the inner folded edge points or the high curvature edge points shown in figs. 4a and 4b, the displacement after smoothing can be acquired as the difference feature. First, the neighborhood information is obtained through the voxel neighbor search algorithm, and bilateral filtering smoothing is performed on each feature point of the target point cloud according to the neighborhood information; the smoothing may be repeated several times, for example five rounds of bilateral filtering. Finally, the displacement of each feature point after the repeated iterative smoothing is recorded as the difference feature of that feature point.
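The smoothing-displacement difference feature can be illustrated as follows. Note the hedge: the patent smooths with bilateral filtering, whereas this sketch substitutes plain neighborhood-centroid (Laplacian-style) smoothing to keep the example short; the function name and default iteration count are assumptions for illustration.

```python
import numpy as np

def smoothing_displacement(points, neighborhoods, iterations=5):
    """Iteratively move each point toward the centroid of its neighborhood
    (a simplified stand-in for bilateral filtering), then return each
    point's total displacement from its original position."""
    smoothed = points.astype(float)
    for _ in range(iterations):
        updated = smoothed.copy()
        for i, neigh in enumerate(neighborhoods):
            if neigh:  # leave isolated points where they are
                updated[i] = smoothed[neigh].mean(axis=0)
        smoothed = updated
    # A large displacement suggests a high curvature or folded edge point.
    return np.linalg.norm(smoothed - points, axis=1)
```

Points on flat regions barely move under smoothing, while sharp edge points are pulled noticeably toward their neighbors, which is exactly the signal the displacement feature captures.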
It should be noted that the smoothing provided in the embodiments of the present application may also use smoothing algorithms such as trilateral filtering, linear interpolation, or MLS (moving least squares). The application is not limited in this regard.
As shown in fig. 5, when the difference feature is the maximum included angle of adjacent projection vectors, in some embodiments plane fitting is performed on the target point cloud according to the neighborhood information, and the outliers among the neighborhood feature points are removed through a preset distance threshold to generate a local fitting plane; that is, a plane is fitted to the feature points with the outliers eliminated. The feature points are projected onto the local fitting plane to generate their projection vectors, the included angle between every two adjacent projection vectors is calculated, and the maximum of these included angles is recorded as the difference feature of the feature point.
To facilitate the removal of outliers from the target point cloud, in some embodiments the neighborhood feature points of a feature point are queried according to the neighborhood information, and the target distance between each neighborhood feature point and the feature point is detected. If the target distance is greater than the distance threshold, the neighborhood feature point is marked as an outlier and eliminated; if the target distance is less than or equal to the distance threshold, the neighborhood feature point is marked as an inlier, and the local fitting plane is fitted according to the inliers. Removing the neighborhood points that lie too far from the current feature point ensures that the local fitting plane is fitted from reliable nearby points.
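The inlier/outlier split by distance threshold described above can be sketched in a few lines; the function name `split_inliers` and its return convention are hypothetical names for illustration.

```python
import numpy as np

def split_inliers(point, neighbors, dist_thresh):
    """Mark neighborhood points farther than dist_thresh from the query
    point as outliers; the remaining inliers feed the local plane fit."""
    dists = np.linalg.norm(neighbors - point, axis=1)
    return neighbors[dists <= dist_thresh], neighbors[dists > dist_thresh]
```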
For example, as shown in fig. 4c, when the edge points of the target point cloud are the outer contour boundary points shown in fig. 4c, the maximum included angle of adjacent projection vectors may be acquired as the difference feature. First, the neighborhood information is obtained through the voxel neighbor search algorithm, and plane fitting with outlier removal is performed for each target point of the target point cloud according to the neighborhood information, where the target point is the feature point currently being processed. Whether each neighborhood feature point of the target point belongs to the inliers of the local fitting plane is judged according to the distance threshold K: points at a distance less than or equal to K are inliers, and points at a distance greater than K are outliers. A uvn coordinate system is established with the target point as the origin, the outliers are removed, the local fitting plane is fitted according to the inliers, and each inlier is projected onto the plane. The maximum included angle of adjacent projection vectors is recorded as the difference feature of the current feature point. As shown in fig. 5, the included angle θ in fig. 5 is this maximum included angle.
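A sketch of the maximum adjacent projection angle computation is below. It is a simplified illustration: the plane is fitted by SVD over already-selected inliers (the distance-threshold step is not repeated here), and the function name and basis construction are assumptions rather than the patent's exact procedure.

```python
import numpy as np

def max_adjacent_angle(point, inliers):
    """Fit a local plane to the inliers by SVD, project the vectors from
    the target point onto the plane, sort them by polar angle, and return
    the largest gap between adjacent projection vectors."""
    diffs = inliers - point
    # The right singular vector with the smallest singular value is the
    # plane normal; the first two span the fitting plane (u, v axes).
    _, _, vt = np.linalg.svd(diffs - diffs.mean(axis=0))
    u, v = vt[0], vt[1]
    angles = np.sort(np.arctan2(diffs @ v, diffs @ u))
    # Include the wrap-around gap between the last and first vectors.
    gaps = np.diff(np.concatenate([angles, [angles[0] + 2 * np.pi]]))
    return gaps.max()
```

For an interior point whose neighbors surround it, the maximum gap stays small; for an outer contour boundary point, the neighbors lie to one side and the gap approaches or exceeds π, which is why thresholding this angle isolates boundary points.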
It can be understood that the two difference features are suitable for different types of edge extraction: when curved boundaries and folded edges with large curvature need to be detected, the edge points of the target point cloud are distinguished through the displacement of the feature points after smoothing; when the outer contour boundary needs to be detected, the edge points of the target point cloud are judged through the maximum included angle of adjacent projection vectors. The edge extraction method provided by the embodiments of the application is therefore applicable to multiple edge types, which further improves the universality of edge extraction.
S4: and calculating a discrimination threshold according to the difference characteristics.
After the difference features of the feature points are obtained, a set of displacement or maximum included angle values is available, and the discrimination threshold used to discriminate the feature points is calculated from this set. In this way, each feature point of the target point cloud can be discriminated according to its difference feature, and the feature points with larger differences are marked as edge points.
To facilitate the discrimination of edge points, the discrimination threshold may be calculated based on statistical principles: in some embodiments, the mean and standard deviation of the difference features are obtained and the discrimination threshold is calculated from them. The standard deviation reflects the degree of dispersion of the difference features within the target point cloud, and combined with the mean of the difference features it yields the threshold for judging edge points.
Thus, in some embodiments, when the discrimination threshold is calculated from the mean and standard deviation, a threshold factor of the discrimination threshold is obtained, and a product of the threshold factor and the standard deviation is calculated. And calculating the sum of the product and the average value to serve as a discrimination threshold. The threshold factor is a preset value and can be set in a self-defined manner.
Illustratively, the discrimination threshold may be calculated by the following formula:
Thresh_D = Mean_D + m·σ_D
where Thresh_D is the discrimination threshold, Mean_D is the average value, m is a preset threshold factor, and σ_D is the standard deviation. When the difference feature is the displacement, Mean_D and σ_D are the mean and standard deviation of the displacements; when the difference feature is the maximum included angle of adjacent projection vectors, Mean_D and σ_D are the mean and standard deviation of the maximum included angles.
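The threshold formula maps directly to a one-line computation; the function name and the use of the population standard deviation are assumptions for illustration.

```python
import numpy as np

def discrimination_threshold(features, m=1.0):
    """Thresh_D = Mean_D + m * sigma_D over all difference features."""
    return features.mean() + m * features.std()
```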
S5: and marking the feature points with the difference features larger than the discrimination threshold as edge points.
After the discrimination threshold is calculated, the edge points of the target point cloud can be determined according to it. Each feature point in the target point cloud is examined, and if its difference feature is greater than the discrimination threshold, indicating that the current feature point differs substantially from the other feature points, the feature point is marked as an edge point. Because every feature point is judged through the calculated discrimination threshold, the point cloud data does not need to be converted into another form; this avoids the loss caused by data conversion and improves the precision of edge extraction.
In some embodiments, when the feature point with the difference feature larger than the discrimination threshold is marked as an edge point, traversing the feature point of the target point cloud to obtain the difference feature of the feature point, namely the displacement after the feature point is smoothed, and/or the maximum included angle of the adjacent projection vectors in the feature point. If the difference characteristic is larger than the discrimination threshold, marking the characteristic point as an edge point; if the difference feature is smaller than or equal to the discrimination threshold, marking the feature point as a non-edge point until all feature points in the target point cloud are traversed, so as to screen out all edge points.
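Steps S4 and S5 together amount to a threshold-and-mask pass over the difference features. This minimal sketch (function name assumed) returns the indices of the feature points marked as edge points.

```python
import numpy as np

def extract_edge_points(features, m=1.0):
    """Compute the discrimination threshold mean + m*std over the
    difference features, then mark every feature point whose difference
    feature exceeds it as an edge point."""
    thresh = features.mean() + m * features.std()
    return np.flatnonzero(features > thresh)
```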
Based on the above-mentioned edge extraction method based on the point cloud, some embodiments of the present application further provide an edge extraction device based on the point cloud, as shown in fig. 6, where the device includes a search module 100, a data processing module 200, and an edge point extraction module 300. Wherein:
the search module 100 is configured to obtain a target point cloud comprising feature points of different coordinates; detecting neighborhood information of the feature points through a voxel neighbor searching algorithm;
the data processing module 200 is configured to obtain a difference feature of the feature points according to the neighborhood information, where the difference feature includes a displacement of the feature points after the feature points are smoothed, and/or a maximum included angle of adjacent projection vectors in the feature points; calculating a discrimination threshold according to the difference characteristics;
the edge point extraction module 300 is configured to mark feature points for which the difference feature is greater than the discrimination threshold as edge points.
In some embodiments, the data processing module 200 is further configured to generate a mode selection option for edge extraction before performing the obtaining of the difference feature of the feature point according to the neighborhood information, the mode selection option including a smooth mode option and a neighborhood vector angle mode option. The data processing module may extract corresponding difference features according to the operation event of the mode selection option. When the smoothing mode option is in the selected state, the data processing module 200 acquires the displacement after the smoothing process as a difference feature; when the neighborhood vector angle mode option is in the selected state, the data processing module 200 acquires the maximum included angle of the adjacent projection vectors in the feature points as the difference feature.
It can be seen from the above technical solutions that the point-cloud-based edge extraction method and device provided by some embodiments of the application obtain a target point cloud containing feature points with different coordinates, and detect the neighborhood information of the feature points through a voxel neighbor search algorithm. The difference features of the feature points are obtained from the neighborhood information; they include the displacement of a feature point after smoothing and/or the maximum included angle of adjacent projection vectors at the feature point. A discrimination threshold is calculated from the difference features, and feature points whose difference feature exceeds the threshold are marked as edge points. The method acquires the neighborhood information of each feature point with a voxel neighbor search algorithm, improving the efficiency of neighborhood lookup; edge points are judged either by smoothing displacement or by adjacent projection angles, operating directly on the original point cloud data, which improves both the accuracy and the overall effect of edge extraction.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. The edge extraction method based on the point cloud is characterized by comprising the following steps of:
acquiring a target point cloud, wherein the target point cloud comprises characteristic points with different coordinates;
detecting neighborhood information of the feature points through a voxel neighbor searching algorithm;
obtaining difference features of the feature points according to the neighborhood information, wherein the difference features comprise displacement of the feature points after smoothing treatment and/or the maximum included angle of adjacent projection vectors in the feature points;
calculating a discrimination threshold according to the difference characteristics;
and marking the feature points with the difference features larger than the discrimination threshold as edge points.
2. The point cloud based edge extraction method according to claim 1, wherein detecting neighborhood information of the feature point by a voxel neighbor search algorithm comprises:
establishing a voxel grid based on the target point cloud, and recording index information of the voxel grid;
calculating grid indexes of the feature points according to the index information;
querying a neighborhood grid of the feature points through the grid index;
and acquiring the characteristic point information of the neighborhood grid.
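The steps of claim 2 can be sketched with a hash-map voxel grid and a 27-voxel (3×3×3) neighborhood; both the grid layout and the neighborhood extent are illustrative choices, not specified by the claim:

```python
import math
from collections import defaultdict

def build_voxel_grid(points, voxel_size):
    """Hash each point index into an integer (ix, iy, iz) voxel key."""
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(points):
        key = (math.floor(x / voxel_size),
               math.floor(y / voxel_size),
               math.floor(z / voxel_size))
        grid[key].append(i)
    return grid

def neighborhood_points(grid, point, voxel_size):
    """Collect indices of points in the query point's voxel and the
    26 adjacent voxels."""
    cx, cy, cz = (math.floor(c / voxel_size) for c in point)
    neighbors = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                neighbors.extend(grid.get((cx + dx, cy + dy, cz + dz), []))
    return neighbors
```

The grid is built once per point cloud; each per-point query then touches at most 27 hash buckets instead of scanning every point.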
3. The point cloud based edge extraction method according to claim 1, wherein obtaining the difference feature of the feature point according to the neighborhood information comprises:
performing smoothing processing on the target point cloud according to the neighborhood information to generate a smoothed point cloud;
detecting first coordinates of target feature points in the smooth point cloud and second coordinates of target feature points in the target point cloud;
and calculating a coordinate difference value between the first coordinate and the second coordinate.
4. A point cloud based edge extraction method according to claim 3, wherein smoothing is performed on the target point cloud based on a bilateral filtering algorithm.
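Claims 3-4 measure each point's displacement under smoothing. A simplified sketch using plain neighborhood-mean smoothing in place of the bilateral filter of claim 4; the `neighborhoods` index lists are assumed to come from the voxel neighbor search:

```python
import math

def smoothing_displacements(points, neighborhoods):
    """Displacement of each point after neighborhood-mean smoothing.
    Points near an edge have lopsided neighborhoods, so smoothing moves
    them farther and they yield larger displacements."""
    out = []
    for i, p in enumerate(points):
        idx = neighborhoods[i]  # neighbor indices of point i (including i)
        smoothed = tuple(sum(points[j][k] for j in idx) / len(idx)
                         for k in range(3))
        out.append(math.dist(p, smoothed))
    return out
```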
5. The point cloud based edge extraction method according to claim 1, wherein obtaining the difference feature of the feature point according to the neighborhood information comprises:
performing plane fitting on the target point cloud according to the neighborhood information, and removing the outliers from the neighborhood feature points through a preset distance threshold to generate a local fitting plane;
projecting the characteristic points to the local fitting plane to generate projection vectors of the characteristic points;
calculating the included angle between two adjacent projection vectors;
the maximum of the angles is recorded.
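The angle computation of claim 5 can be sketched with NumPy; plane fitting by SVD is one common least-squares choice (the claim does not prescribe the fitting method), and the outlier rejection of claim 6 is omitted for brevity. An interior point's neighbors surround it, so the largest gap between consecutive projection vectors stays small; an edge point leaves a large gap:

```python
import numpy as np

def max_projection_angle(center, neighbors):
    """Fit a plane to the neighborhood (least squares via SVD), project
    the neighbor directions onto it, sort them by polar angle, and return
    the largest gap between consecutive projection vectors (radians)."""
    pts = np.asarray(neighbors, dtype=float)
    centroid = pts.mean(axis=0)
    # Plane normal = right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    u, v = vt[0], vt[1]  # orthonormal in-plane basis
    d = pts - np.asarray(center, dtype=float)
    angles = np.sort(np.arctan2(d @ v, d @ u))
    # Include the wrap-around gap between the last and first vector.
    gaps = np.diff(np.concatenate([angles, angles[:1] + 2 * np.pi]))
    return float(gaps.max())
```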
6. The method for extracting edge based on point cloud as claimed in claim 5, wherein the step of eliminating the outlier from the neighborhood feature points by a preset distance threshold comprises:
inquiring the neighborhood characteristic points of the characteristic points according to the neighborhood information;
detecting the target distance between the neighborhood feature point and the feature point;
if the target distance is greater than the distance threshold, marking the neighborhood feature point as an outlier, and eliminating the outlier;
and if the target distance is less than or equal to the distance threshold, marking the neighborhood feature points as inlier points, and fitting the local fitting plane according to the inlier points.
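The distance-threshold screening of claim 6 is a straightforward partition; a minimal sketch:

```python
import math

def split_inliers(feature_point, neighborhood_points, dist_threshold):
    """Partition a point's neighborhood into inliers (kept for the local
    plane fit) and outliers (rejected), by distance to the feature point."""
    inliers, outliers = [], []
    for q in neighborhood_points:
        if math.dist(feature_point, q) > dist_threshold:
            outliers.append(q)
        else:
            inliers.append(q)
    return inliers, outliers
```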
7. The point cloud based edge extraction method of claim 1, wherein calculating a discrimination threshold from the difference features comprises:
acquiring the average value and standard deviation of the difference characteristics;
and calculating a discrimination threshold according to the average value and the standard deviation.
8. The point cloud based edge extraction method of claim 7, wherein calculating a discrimination threshold from the mean and the standard deviation comprises:
acquiring a threshold factor of the discrimination threshold;
calculating the product of the threshold factor and the standard deviation;
and calculating the sum of the product and the average value as the discrimination threshold.
9. The point cloud based edge extraction method according to claim 1, wherein marking feature points where the difference feature is greater than the discrimination threshold as edge points includes:
traversing the characteristic points of the target point cloud to obtain the difference characteristics of the characteristic points;
if the difference characteristic is larger than the judging threshold value, marking the characteristic point as the edge point;
and if the difference characteristic is smaller than or equal to the judging threshold value, marking the characteristic point as a non-edge point.
10. An edge extraction device based on point cloud, which is characterized by comprising:
the searching module is configured to acquire a target point cloud, wherein the target point cloud comprises characteristic points with different coordinates; detecting neighborhood information of the feature points through a voxel neighbor searching algorithm;
the data processing module is configured to acquire difference features of the feature points according to the neighborhood information, wherein the difference features comprise displacement of the feature points after the feature points are subjected to smoothing processing, and/or the maximum included angle of adjacent projection vectors in the feature points; calculating a discrimination threshold according to the difference characteristics;
and the edge point extraction module is configured to mark the feature points with the difference features larger than the discrimination threshold as edge points.
CN202310923324.1A 2023-07-26 2023-07-26 Edge extraction method and device based on point cloud Pending CN116883443A (en)

Publications (1)

Publication Number Publication Date
CN116883443A true CN116883443A (en) 2023-10-13

Family

ID=88260297



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination