CN112419164A - Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method - Google Patents

Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method

Info

Publication number
CN112419164A
CN112419164A (application number CN201910783574.3A)
Authority
CN
China
Prior art keywords
point
neighborhood
points
curvature
cloud model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910783574.3A
Other languages
Chinese (zh)
Other versions
CN112419164B (en)
Inventor
梅嘉琳 (Mei Jialin)
姚亮 (Yao Liang)
李奇 (Li Qi)
王汉 (Wang Han)
李慧芳 (Li Huifang)
苏智勇 (Su Zhiyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology
Priority to CN201910783574.3A
Publication of CN112419164A
Application granted
Publication of CN112419164B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/70 - Denoising; Smoothing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a weighted guided point cloud model denoising method based on curvature and neighborhood reconstruction. The method comprises the following steps: first, the curvature information of each point in the point cloud model is calculated, and the feature points of the model are extracted according to a set threshold; second, according to the extracted feature points, the neighborhood points are reconstructed on the basis of the neighborhood acquired by the K-nearest-neighbor method, so that the reconstructed neighborhood lies on a single surface; then, according to the reconstructed neighborhood, the three-dimensional position information of each point is used as the guiding signal, and the curvature information is added to the position guiding signal as a weighting signal, so that a linear transformation is computed for each point in the point cloud model; finally, each point is linearly transformed according to the calculated linear transformation coefficients to realize point cloud model denoising. The method preserves the feature information of the point cloud model while denoising it, and is robust to noise of different levels.

Description

Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method
Technical Field
The invention relates to the technical field of computer graphics and three-dimensional point cloud denoising, in particular to a weighted guide point cloud model denoising method based on curvature and neighborhood reconstruction.
Background
In recent years, due to the emergence and popularization of applications such as virtual reality and augmented reality, a point cloud model processing technology has become one of the hot spots in the field of computer graphics research. The denoised point cloud model can be used as the basis of application such as animation, rendering, three-dimensional reconstruction and the like. With the increasing popularity of various scanning devices, especially consumer-grade depth sensors (such as Kinect), the design of robust point cloud denoising methods becomes more and more important, and the main technical challenge of point cloud denoising is how to effectively maintain the features of the model while removing noise.
Currently, common point-set filtering methods include Local Optimal Projection (LOP), Robust Implicit Moving Least Squares (RIMLS), weighted LOP (WLOP), edge-aware resampling (EAR), and continuous LOP (CLOP), each with its own advantages. However, these point-set filtering methods either fail to preserve sharp features or are not robust enough when removing noise. Among them, LOP, WLOP and CLOP are LOP-based methods that remove noise and outliers well but, due to their inherently isotropic nature, cannot retain sharp features. EAR is an extension of LOP that preserves geometric features but may smear out fine-scale geometric detail because it requires a fairly large neighborhood size. RIMLS also preserves features, but its strong dependence on the initial normal estimate generally makes it more sensitive to outliers and noise than the LOP-based approaches. These problems severely limit the robustness and effectiveness of such methods in point cloud denoising.
Disclosure of Invention
The invention aims to provide a curvature and neighborhood reconstruction-based weighted guided point cloud model denoising method that achieves high denoising accuracy, preserves model features, and is robust to noise of different levels.
The technical solution for realizing the purpose of the invention is as follows: a curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method comprises the following steps:
step 1, calculating curvature information of each point in a point cloud model, and extracting characteristic points in the model according to a set threshold value;
step 2, according to the extracted feature points, reconstructing the neighborhood points on the basis of the neighborhood acquired by the K-nearest-neighbor method, so that the reconstructed neighborhood lies on a single surface;
step 3, according to the reconstructed neighborhood, using the three-dimensional position information of each point as a guide signal, and simultaneously adding curvature information as a weighting signal into the position guide signal, thereby performing linear transformation on each point in the point cloud model;
step 4, performing linear transformation on each point according to the linear transformation coefficients calculated in step 3, thereby denoising the point cloud model.
Further, the curvature information of each point in the point cloud model is calculated in step 1, and the feature points in the model are extracted according to a set threshold, specifically as follows:
step 1.1, for each point p_i, calculating the curvature value σ(N_i) of its corresponding neighborhood N_i as
σ(N_i) = λ_0 / (λ_0 + λ_1 + λ_2) (1)
wherein λ_0, λ_1, λ_2 are the eigenvalues of the covariance matrix of N_i, with λ_0 < λ_1 < λ_2, which reflect the distribution of N_i along three orthogonal singular vectors;
step 1.2, setting a threshold t, wherein points whose curvature value is larger than t are feature points and points whose curvature value is smaller than t are non-feature points.
Further, step 2 reconstructs the neighborhood points on the basis of the neighborhood acquired by the K-nearest-neighbor method according to the extracted feature points, so that the reconstructed neighborhood lies on a single surface, specifically as follows:
step 2.1, assigning each feature point p_i an initial neighborhood N using the K-nearest-neighbor method;
step 2.2, for each neighborhood point p_ij in the initial neighborhood N, obtaining the neighborhood N_ij of p_ij using the K-nearest-neighbor method, and initializing the candidate neighborhood N_i^{p_ij} of the feature point p_i at p_ij to contain only the two points p_ij and p_i;
step 2.3, scanning each point in N_ij and determining whether it can join the candidate neighborhood N_i^{p_ij} according to the judgment criterion of formula (2), which is constructed from the curvature value of the current neighborhood and the positional relation of its points, wherein σ(N_i^{p_ij}) denotes the curvature value of p_ij under the candidate neighborhood N_i^{p_ij}, K denotes the number of points in N_i^{p_ij}, α and β are user-defined control coefficients, and p_ijk (k = 1, 2, …, K) denotes the k-th point in N_i^{p_ij};
step 2.4, if adding the point decreases the value of formula (2), the point joins the candidate neighborhood N_i^{p_ij}, and the 5 points nearest to it are also added to the candidate neighborhood N_i^{p_ij};
step 2.5, processing the remaining points in N in the same way to obtain all candidate neighborhoods of the feature point p_i, computing the value of formula (2) for each candidate neighborhood, and taking the neighborhood with the minimum value as the reconstructed neighborhood N' of the feature point p_i.
Further, according to the reconstructed neighborhood, the three-dimensional position information of each point is used as a guide signal, and the curvature information is added to the position guide signal as a weighting signal, so that each point in the point cloud model is linearly transformed, specifically as follows:
The cost function E of the weighted guided filtering algorithm is:
E = Σ_{p_ij ∈ N(p_i)} [ ‖a_i·p_ij + b_i - p_ij‖² + (ε / γ(i))·a_i² ] (3)
γ(i) = (σ - t)^{s(i)} + χ (4)
s(i) = -sgn(σ - t) × μ × σ (5)
wherein N(p_i) denotes the neighborhood of the current point p_i, p_ij is a point in that neighborhood, a_i and b_i are the linear transformation coefficients to be solved, and ε is a parameter controlling the filtering effect; σ is the curvature value of the point calculated before neighborhood reconstruction, and t is the threshold for judging feature points; χ is a positive number preventing the weight γ(i) from being 0; μ is a magnification factor determined dynamically.
Further, in step 4, each point is linearly transformed according to the linear transformation coefficients calculated in step 3 to realize point cloud model denoising, specifically as follows:
step 4.1, the linear transformation coefficients a_i and b_i obtained in step 3 are
a_i = [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i ] / [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i + ε/γ(i) ] (6)
b_i = (1 - a_i)·p̄_i (7)
wherein:
p̄_i = (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij (8)
wherein |N(p_i)| denotes the number of points contained in the neighborhood of p_i, p_ij is a point in the neighborhood N(p_i), p̄_i is the centroid of the neighborhood, and ε is a parameter controlling the filtering effect;
step 4.2, according to the obtained linear transformation coefficients a_i and b_i, each feature point is linearly transformed to obtain the denoised position of the point, and the denoised point cloud model is obtained after all points are updated.
Compared with the prior art, the invention has the following remarkable advantages: (1) the input is simple: only the position information of the points of the point cloud model needs to be provided, and no initial normal estimation is relied upon; (2) curvature information is introduced as a weighting signal, and the feature regions and flat regions of the model are processed separately, so that the sharp features of the model are better preserved; (3) by reconstructing the neighborhood of each feature point, every point obtains a relatively smooth and consistent neighborhood, so the method is robust to noise of different levels.
Drawings
FIG. 1 is a schematic flow chart of the weighted guide point cloud model denoising method based on curvature and neighborhood reconstruction.
FIG. 2 is a flow chart of feature point neighborhood reconstruction in the present invention.
Fig. 3 is a block diagram of the weighted guided filtering method of the present invention.
Fig. 4 is a diagram of denoising effect in the embodiment of the present invention, in which (a) is a schematic diagram of a point cloud model with noise as an input, and (b) is a schematic diagram of a point cloud model after denoising as an output.
Fig. 5 is a diagram of denoising effect in the embodiment of the present invention, in which (a) is a schematic diagram of a point cloud model with noise as an input, and (b) is a schematic diagram of a point cloud model after denoising as an output.
Detailed Description
The invention relates to a curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method, which comprises the following steps:
step 1, calculating curvature information of each point in a point cloud model, and extracting characteristic points in the model according to a set threshold value;
step 2, according to the extracted feature points, reconstructing the neighborhood points on the basis of the neighborhood acquired by the K-nearest-neighbor method, so that the reconstructed neighborhood lies on a single surface;
step 3, according to the reconstructed neighborhood, using the three-dimensional position information of each point as a guide signal, and simultaneously adding curvature information as a weighting signal into the position guide signal, thereby performing linear transformation on each point in the point cloud model;
step 4, performing linear transformation on each point according to the linear transformation coefficients calculated in step 3, thereby denoising the point cloud model.
As a specific example, the curvature information of each point in the point cloud model is calculated in step 1, and the feature points in the model are extracted according to a set threshold, specifically as follows:
step 1.1, for each point p_i, calculating the curvature value σ(N_i) of its corresponding neighborhood N_i as
σ(N_i) = λ_0 / (λ_0 + λ_1 + λ_2) (1)
wherein λ_0, λ_1, λ_2 are the eigenvalues of the covariance matrix of N_i, with λ_0 < λ_1 < λ_2, which reflect the distribution of N_i along three orthogonal singular vectors;
step 1.2, setting a threshold t, wherein points whose curvature value is larger than t are feature points and points whose curvature value is smaller than t are non-feature points.
As a specific example, step 2 reconstructs the neighborhood points on the basis of the neighborhood acquired by the K-nearest-neighbor method according to the extracted feature points, so that the reconstructed neighborhood lies on a single surface, specifically as follows:
step 2.1, assigning each feature point p_i an initial neighborhood N using the K-nearest-neighbor method;
step 2.2, for each neighborhood point p_ij in the initial neighborhood N, obtaining the neighborhood N_ij of p_ij using the K-nearest-neighbor method, and initializing the candidate neighborhood N_i^{p_ij} of the feature point p_i at p_ij to contain only the two points p_ij and p_i;
step 2.3, scanning each point in N_ij and determining whether it can join the candidate neighborhood N_i^{p_ij} according to the judgment criterion of formula (2), which is constructed from the curvature value of the current neighborhood and the positional relation of its points, wherein σ(N_i^{p_ij}) denotes the curvature value of p_ij under the candidate neighborhood N_i^{p_ij}, K denotes the number of points in N_i^{p_ij}, α and β are user-defined control coefficients, and p_ijk (k = 1, 2, …, K) denotes the k-th point in N_i^{p_ij};
step 2.4, if adding the point decreases the value of formula (2), the point joins the candidate neighborhood N_i^{p_ij}, and the 5 points nearest to it are also added to the candidate neighborhood N_i^{p_ij};
step 2.5, processing the remaining points in N in the same way to obtain all candidate neighborhoods of the feature point p_i, computing the value of formula (2) for each candidate neighborhood, and taking the neighborhood with the minimum value as the reconstructed neighborhood N' of the feature point p_i.
As a specific example, according to the reconstructed neighborhood described in step 3, the three-dimensional position information of each point is used as a guide signal, and the curvature information is added to the position guide signal as a weighting signal, so that each point in the point cloud model is linearly transformed, specifically as follows:
The cost function E of the weighted guided filtering algorithm is:
E = Σ_{p_ij ∈ N(p_i)} [ ‖a_i·p_ij + b_i - p_ij‖² + (ε / γ(i))·a_i² ] (3)
γ(i) = (σ - t)^{s(i)} + χ (4)
s(i) = -sgn(σ - t) × μ × σ (5)
wherein N(p_i) denotes the neighborhood of the current point p_i, p_ij is a point in that neighborhood, a_i and b_i are the linear transformation coefficients to be solved, and ε is a parameter controlling the filtering effect; σ is the curvature value of the point calculated before neighborhood reconstruction, and t is the threshold for judging feature points; χ is a positive number preventing the weight γ(i) from being 0; μ is a magnification factor determined dynamically.
As a specific example, in step 4, each point is linearly transformed according to the linear transformation coefficients calculated in step 3 to realize point cloud model denoising, specifically as follows:
step 4.1, the linear transformation coefficients a_i and b_i obtained in step 3 are
a_i = [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i ] / [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i + ε/γ(i) ] (6)
b_i = (1 - a_i)·p̄_i (7)
wherein:
p̄_i = (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij (8)
wherein |N(p_i)| denotes the number of points contained in the neighborhood of p_i, p_ij is a point in the neighborhood N(p_i), p̄_i is the centroid of the neighborhood, and ε is a parameter controlling the filtering effect;
step 4.2, according to the obtained linear transformation coefficients a_i and b_i, each feature point is linearly transformed to obtain the denoised position of the point, and the denoised point cloud model is obtained after all points are updated.
The invention is described in further detail below with reference to the figures and the embodiments.
Examples
With reference to fig. 1, the weighted guide point cloud model denoising method based on curvature and neighborhood reconstruction of the present invention includes the following steps:
Step 1, calculating the curvature information of each point in the point cloud model, and extracting the feature points of the model according to a set threshold, specifically as follows:
step 1.1, for each point p_i, calculating the curvature value σ(N_i) of its corresponding neighborhood N_i as:
σ(N_i) = λ_0 / (λ_0 + λ_1 + λ_2)
wherein λ_0, λ_1, λ_2 are the eigenvalues of the covariance matrix of N_i, with λ_0 < λ_1 < λ_2, which reflect the distribution of N_i along three orthogonal singular vectors;
step 1.2, setting a threshold t, wherein points whose curvature value is larger than t are feature points and points whose curvature value is smaller than t are non-feature points.
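For illustration only (this code is not part of the patent), the following Python sketch computes the surface-variation curvature σ(N_i) = λ_0/(λ_0 + λ_1 + λ_2) from the eigenvalues of the neighborhood covariance matrix and thresholds it to mark feature points; the neighborhood size k and the threshold t are assumed example values.

```python
import numpy as np
from scipy.spatial import cKDTree

def curvature_and_features(points, k=30, t=0.05):
    """Per-point surface-variation curvature and feature mask (sigma > t).

    points : (n, 3) array of point positions
    k      : neighborhood size for the K-nearest-neighbor search (assumed value)
    t      : curvature threshold for feature points (assumed value)
    """
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)           # indices of the k nearest neighbors
    sigma = np.empty(len(points))
    for i, nbr in enumerate(idx):
        nhood = points[nbr]
        cov = np.cov(nhood.T)                  # 3x3 covariance matrix of N_i
        lam = np.sort(np.linalg.eigvalsh(cov)) # lambda_0 <= lambda_1 <= lambda_2
        sigma[i] = lam[0] / max(lam.sum(), 1e-12)
    return sigma, sigma > t

if __name__ == "__main__":
    pts = np.random.rand(1000, 3)              # toy point cloud
    sigma, is_feature = curvature_and_features(pts)
    print(sigma.shape, int(is_feature.sum()))
```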
Step 2, according to the extracted feature points, reconstructing the neighborhood points on the basis of the neighborhood acquired by the K-nearest-neighbor method, so that the reconstructed neighborhood lies on a single surface; with reference to Fig. 2, the specific steps are as follows:
step 2.1, assigning each feature point p_i an initial neighborhood N using the K-nearest-neighbor method;
step 2.2, for each neighborhood point p_ij in the initial neighborhood N, obtaining the neighborhood N_ij of p_ij using the K-nearest-neighbor method, and initializing the candidate neighborhood N_i^{p_ij} of the feature point p_i at p_ij to contain only the two points p_ij and p_i;
step 2.3, scanning each point in N_ij and determining whether it can join the candidate neighborhood N_i^{p_ij} according to the judgment criterion of formula (2), which is constructed from the curvature value of the current neighborhood and the positional relation of its points, wherein σ(N_i^{p_ij}) denotes the curvature value of p_ij under the candidate neighborhood N_i^{p_ij}, K denotes the number of points in N_i^{p_ij}, α and β are user-defined control coefficients, and p_ijk (k = 1, 2, …, K) denotes the k-th point in N_i^{p_ij};
step 2.4, if adding the point decreases the value of formula (2), the point joins the candidate neighborhood N_i^{p_ij}, and the 5 points nearest to it are also added to the candidate neighborhood N_i^{p_ij};
step 2.5, processing the remaining points in N in the same way to obtain all candidate neighborhoods of the feature point p_i, computing the value of formula (2) for each candidate neighborhood, and taking the neighborhood with the minimum value as the reconstructed neighborhood N' of the feature point p_i.
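The patent gives formula (2) only as an image, so the sketch below is a hypothetical illustration of the candidate-neighborhood growth of step 2: the function criterion() substitutes an assumed stand-in for formula (2) that combines the candidate neighborhood's curvature with the mean distance from p_i to its points, weighted by α and β; the neighborhood size k and the coefficient values are likewise assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_variation(nhood):
    """Curvature (surface variation) of a set of points, as in step 1."""
    lam = np.sort(np.linalg.eigvalsh(np.cov(nhood.T)))
    return lam[0] / max(lam.sum(), 1e-12)

def criterion(points, cand, i, alpha, beta):
    """ASSUMED stand-in for formula (2): the patent only describes its inputs
    (neighborhood curvature, point positions, coefficients alpha and beta), so this
    combines the candidate neighborhood's curvature with the mean distance from
    p_i to its points, purely for illustration."""
    nhood = points[list(cand)]
    mean_dist = np.linalg.norm(nhood - points[i], axis=1).mean()
    return alpha * surface_variation(nhood) + beta * mean_dist

def reconstruct_neighborhood(points, i, k=30, alpha=1.0, beta=1.0):
    """Greedy candidate-neighborhood growth around feature point p_i (step 2 sketch)."""
    tree = cKDTree(points)
    _, N = tree.query(points[i], k=k)                 # initial neighborhood of p_i
    best_cand, best_val = None, np.inf
    for j in N[1:]:                                   # each neighborhood point p_ij
        _, Nij = tree.query(points[j], k=k)           # neighborhood of p_ij
        cand = {int(i), int(j)}                       # start from p_i and p_ij only
        val = criterion(points, cand, i, alpha, beta)
        for q in Nij:
            q = int(q)
            if q in cand:
                continue
            if criterion(points, cand | {q}, i, alpha, beta) < val:
                _, near = tree.query(points[q], k=6)  # the point joins together with
                cand |= {q, *map(int, near[1:])}      # its 5 nearest points
                val = criterion(points, cand, i, alpha, beta)
        if val < best_val:                            # candidate with minimum value wins
            best_cand, best_val = cand, val
    return np.array(sorted(best_cand))

if __name__ == "__main__":
    pts = np.random.rand(500, 3)
    print(reconstruct_neighborhood(pts, i=0))
```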
Step 3, according to the reconstructed neighborhood, using the three-dimensional position information of each point as the guiding signal, and adding the curvature information to the position guiding signal as a weighting signal, so as to perform a linear transformation on each point in the point cloud model; with reference to Fig. 3, the specific steps are as follows:
in the traditional guide filtering algorithm, because a unified linear model and the same regularization parameter are used for each part of a point cloud model, a characteristic region can be smoothed, and a weighted guide point cloud denoising method with curvature information fused is adopted. An effective guided filtering weight model needs to solve two problems: firstly, a point cloud model processing method by which the weight value is used can accurately identify a characteristic region of a model; secondly, in the feature region, a smaller smoothing multiple should be superposed, namely the final regular term should be smaller, and the weight at the denominator should be larger; in the flat region of the model, a slightly larger smoothing factor should be superimposed, i.e. the final regularization term should be slightly larger, then the weight at the denominator should be smaller. In the field of point cloud processing, most of the characteristic points of the point cloud can be extracted by calculating the curvature information of each point.
In addition, an ideal weight model requires the weight to be small in flat regions of the model and large in feature regions. Because this behaviour is similar to that of an exponential function, the curvature information of a point can be used as the base, and the curvature information is amplified or suppressed through an exponential weight model. To obtain the feature regions of the model, a feature-point curvature threshold t is set: when the curvature value of a point is greater than t, the point is regarded as a feature point, otherwise it is a non-feature point. So that the curvature information is applied more reasonably, a constraint factor s(i) is introduced into the weight γ(i) as the exponent of the weight model; it sets a constraint boundary for the weight, so the weighting behaviour required in different regions can be decided accurately: in feature regions the weight is sensitive to feature information and amplifies the information at feature points, while in flat regions it is insensitive to feature information and suppresses its growth, thereby achieving the goal of feature preservation.
Based on the above analysis, the weight is defined as:
γ(i) = (σ - t)^{s(i)} + χ
s(i) = -sgn(σ - t) × μ × σ
wherein σ is the curvature value of the point calculated before neighborhood reconstruction, t is the threshold for judging feature points, μ is a magnification factor determined dynamically, and χ is a constant term that prevents γ(i), which appears in the denominator, from being zero.
Obviously, when a point is a feature point, the sign of s(i) is negative, and the larger the curvature, that is, the more significant the feature information, the larger the weight, which shows that the weight is sensitive to feature information; when the point is not a feature point, the sign of s(i) is positive and the weight is very small, which shows that the weight is insensitive to feature information.
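A minimal sketch of the weight model above, for illustration only: the values of t, μ and χ are assumptions (the patent states only that μ is determined dynamically), and |σ - t| is used as the base here so that the power stays real-valued for non-feature points.

```python
import numpy as np

def weight(sigma, t=0.05, mu=10.0, chi=1e-3):
    """Curvature-driven weight gamma(i) = (sigma - t)^s(i) + chi with
    s(i) = -sgn(sigma - t) * mu * sigma.  t, mu and chi are assumed values;
    the absolute value |sigma - t| keeps the power real (the patent writes sigma - t)."""
    s = -np.sign(sigma - t) * mu * sigma
    return np.abs(sigma - t) ** s + chi

# feature points (sigma > t) get a negative exponent and hence a large weight;
# flat-region points (sigma < t) get a positive exponent and a small weight
print(weight(np.array([0.01, 0.04, 0.08, 0.20])))
```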
Finally, the weight is combined into the guided filtering framework and the cost function is rewritten as:
E = Σ_{p_ij ∈ N(p_i)} [ ‖a_i·p_ij + b_i - p_ij‖² + (ε / γ(i))·a_i² ]
wherein N(p_i) denotes the neighborhood of the current point p_i, p_ij is a point in that neighborhood, a_i and b_i are the linear transformation coefficients to be solved, and ε is a parameter controlling the filtering effect.
Step 4, performing the linear transformation on each point according to the linear transformation coefficients calculated in step 3 to realize point cloud model denoising, specifically as follows:
step 4.1, the linear transformation coefficients obtained in step 3 are:
a_i = [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i ] / [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i + ε/γ(i) ]
b_i = (1 - a_i)·p̄_i
wherein:
p̄_i = (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij
wherein |N(p_i)| denotes the number of points contained in the neighborhood of p_i, p_ij is a point in the neighborhood N(p_i), p̄_i is the centroid of the neighborhood, and ε is a parameter controlling the filtering effect;
step 4.2, according to the obtained linear transformation coefficients a_i and b_i, each feature point is linearly transformed to obtain the denoised position of the point, and the denoised point cloud model is obtained after all points are updated.
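As a sketch of the point update in step 4, the following hypothetical code applies one pass of weighted guided filtering over the point cloud. The closed form used for a_i and b_i (neighborhood variance over variance plus ε/γ(i), and b_i = (1 - a_i)·p̄_i) is the standard guided-filter solution of the cost function above and is an assumption about the exact content of the patent's formulas; the neighborhood size k and ε are example values.

```python
import numpy as np
from scipy.spatial import cKDTree

def weighted_guided_denoise(points, gamma, k=30, eps=1e-3):
    """One pass of weighted guided filtering over a point cloud (step 4 sketch).

    points : (n, 3) noisy positions (also used as the guidance signal)
    gamma  : (n,) per-point weights gamma(i) from the curvature-based weight model
    eps    : regularization parameter controlling the filtering strength
    """
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    denoised = np.empty_like(points)
    for i, nbr in enumerate(idx):
        nhood = points[nbr]                               # N(p_i)
        mean = nhood.mean(axis=0)                         # neighborhood centroid
        var = (nhood * nhood).sum(axis=1).mean() - mean @ mean
        var = max(var, 0.0)                               # guard against round-off
        a = var / (var + eps / gamma[i])                  # assumed form of a_i
        b = (1.0 - a) * mean                              # assumed form of b_i
        denoised[i] = a * points[i] + b                   # linear transform of p_i
    return denoised

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noisy = rng.random((2000, 3)) + 0.01 * rng.standard_normal((2000, 3))
    gamma = np.full(len(noisy), 1.0)                      # constant weight for the demo
    print(weighted_guided_denoise(noisy, gamma).shape)
```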
Fig. 4 shows the denoising effect in an embodiment of the present invention, where (a) is the input noisy point cloud model and (b) is the output denoised point cloud model. Fig. 5 likewise shows the denoising effect in an embodiment, with (a) the input noisy point cloud model and (b) the output denoised point cloud model. As can be seen from Figs. 4 and 5, denoising a point cloud model with the curvature and neighborhood reconstruction-based method removes the noise while maintaining the sharp features of the model, and the method can be widely applied to various models.

Claims (5)

1. A curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method is characterized by comprising the following steps:
step 1, calculating curvature information of each point in a point cloud model, and extracting characteristic points in the model according to a set threshold value;
step 2, according to the extracted feature points, reconstructing the neighborhood points on the basis of the neighborhood acquired by the K-nearest-neighbor method, so that the reconstructed neighborhood lies on a single surface;
step 3, according to the reconstructed neighborhood, using the three-dimensional position information of each point as a guide signal, and simultaneously adding curvature information as a weighting signal into the position guide signal, thereby performing linear transformation on each point in the point cloud model;
step 4, performing linear transformation on each point according to the linear transformation coefficients calculated in step 3, thereby denoising the point cloud model.
2. The curvature and neighborhood reconstruction-based weighted guided point cloud model denoising method according to claim 1, wherein the curvature information of each point in the point cloud model is calculated in step 1, and the feature points in the model are extracted according to a set threshold, specifically as follows:
step 1.1, for each point p_i, calculating the curvature value σ(N_i) of its corresponding neighborhood N_i as
σ(N_i) = λ_0 / (λ_0 + λ_1 + λ_2) (1)
wherein λ_0, λ_1, λ_2 are the eigenvalues of the covariance matrix of N_i, with λ_0 < λ_1 < λ_2, which reflect the distribution of N_i along three orthogonal singular vectors;
step 1.2, setting a threshold t, wherein points whose curvature value is larger than t are feature points and points whose curvature value is smaller than t are non-feature points.
3. The curvature and neighborhood reconstruction-based weighted guided point cloud model denoising method of claim 1, wherein step 2 reconstructs the neighborhood points on the basis of the neighborhood obtained by the K-nearest-neighbor method according to the extracted feature points, so that the reconstructed neighborhood lies on a single surface, specifically as follows:
step 2.1, assigning each feature point p_i an initial neighborhood N using the K-nearest-neighbor method;
step 2.2, for each neighborhood point p_ij in the initial neighborhood N, obtaining the neighborhood N_ij of p_ij using the K-nearest-neighbor method, and initializing the candidate neighborhood N_i^{p_ij} of the feature point p_i at p_ij to contain only the two points p_ij and p_i;
step 2.3, scanning each point in N_ij and determining whether it can join the candidate neighborhood N_i^{p_ij} according to the judgment criterion of formula (2), which is constructed from the curvature value of the current neighborhood and the positional relation of its points, wherein σ(N_i^{p_ij}) denotes the curvature value of p_ij under the candidate neighborhood N_i^{p_ij}, K denotes the number of points in N_i^{p_ij}, α and β are user-defined control coefficients, and p_ijk (k = 1, 2, …, K) denotes the k-th point in N_i^{p_ij};
step 2.4, if adding the point decreases the value of formula (2), the point joins the candidate neighborhood N_i^{p_ij}, and the 5 points nearest to it are also added to the candidate neighborhood N_i^{p_ij};
step 2.5, processing the remaining points in N in the same way to obtain all candidate neighborhoods of the feature point p_i, computing the value of formula (2) for each candidate neighborhood, and taking the neighborhood with the minimum value as the reconstructed neighborhood N' of the feature point p_i.
4. The curvature and neighborhood reconstruction-based weighted guided point cloud model denoising method of claim 1, 2 or 3, wherein step 3 uses the three-dimensional position information of each point as the guiding signal according to the reconstructed neighborhood, and adds the curvature information to the position guiding signal as a weighting signal, thereby performing a linear transformation on each point in the point cloud model, specifically as follows:
the cost function E of the weighted guided filtering algorithm is:
E = Σ_{p_ij ∈ N(p_i)} [ ‖a_i·p_ij + b_i - p_ij‖² + (ε / γ(i))·a_i² ] (3)
γ(i) = (σ - t)^{s(i)} + χ (4)
s(i) = -sgn(σ - t) × μ × σ (5)
wherein N(p_i) denotes the neighborhood of the current point p_i, p_ij is a point in that neighborhood, a_i and b_i are the linear transformation coefficients to be solved, and ε is a parameter controlling the filtering effect; σ is the curvature value of the point calculated before neighborhood reconstruction, and t is the threshold for judging feature points; χ is a positive number preventing the weight γ(i) from being 0; μ is a magnification factor determined dynamically.
5. The curvature and neighborhood reconstruction-based weighted guided point cloud model denoising method according to claim 4, wherein in step 4 each point is linearly transformed according to the linear transformation coefficients calculated in step 3 to realize point cloud model denoising, specifically as follows:
step 4.1, the linear transformation coefficients a_i and b_i obtained in step 3 are
a_i = [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i ] / [ (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij·p_ij - p̄_i·p̄_i + ε/γ(i) ] (6)
b_i = (1 - a_i)·p̄_i (7)
wherein:
p̄_i = (1/|N(p_i)|)·Σ_{p_ij ∈ N(p_i)} p_ij (8)
wherein |N(p_i)| denotes the number of points contained in the neighborhood of p_i, p_ij is a point in the neighborhood N(p_i), p̄_i is the centroid of the neighborhood, and ε is a parameter controlling the filtering effect;
step 4.2, according to the obtained linear transformation coefficients a_i and b_i, each feature point is linearly transformed to obtain the denoised position of the point, and the denoised point cloud model is obtained after all points are updated.
CN201910783574.3A 2019-08-23 2019-08-23 Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method Active CN112419164B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910783574.3A CN112419164B (en) 2019-08-23 2019-08-23 Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910783574.3A CN112419164B (en) 2019-08-23 2019-08-23 Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method

Publications (2)

Publication Number Publication Date
CN112419164A (en) 2021-02-26
CN112419164B (en) 2022-08-19

Family

ID=74780350

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910783574.3A Active CN112419164B (en) 2019-08-23 2019-08-23 Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method

Country Status (1)

Country Link
CN (1) CN112419164B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160125226A1 (en) * 2013-09-17 2016-05-05 Shenzhen Institutes Of Advanced Technology Chinese Academy Of Sciences Method and system for automatically optimizing quality of point cloud data
CN106709883A (en) * 2016-12-20 2017-05-24 华南理工大学 Point cloud denoising method based on joint bilateral filtering and sharp feature skeleton extraction


Also Published As

Publication number Publication date
CN112419164B (en) 2022-08-19

Similar Documents

Publication Publication Date Title
Pu et al. A fractional-order variational framework for retinex: fractional-order partial differential equation-based formulation for multi-scale nonlocal contrast enhancement with texture preserving
CN108205803B (en) Image processing method, and training method and device of neural network model
Guo et al. LIME: Low-light image enhancement via illumination map estimation
Kandhway et al. An optimal adaptive thresholding based sub-histogram equalization for brightness preserving image contrast enhancement
Song et al. Fast image super-resolution via local adaptive gradient field sharpening transform
CN106296675A (en) A kind of dividing method of the uneven image of strong noise gray scale
CN108010002B (en) Structured point cloud denoising method based on adaptive implicit moving least square
Zhao et al. A novel Neutrosophic image segmentation based on improved fuzzy C-means algorithm (NIS-IFCM)
CN104200434B (en) Non-local mean image denoising method based on noise variance estimation
CN109345536B (en) Image super-pixel segmentation method and device
CN115908984A (en) Training method and device of image clustering model
CN109584249B (en) Three-dimensional volume data segmentation method based on closed form solution
CN112598588B (en) Curvature weighted guide-based transformer substation three-dimensional point cloud model denoising method
CN113034387B (en) Image denoising method, device, equipment and medium
Zhou et al. An improved algorithm using weighted guided coefficient and union self‐adaptive image enhancement for single image haze removal
CN109658357A (en) A kind of denoising method towards remote sensing satellite image
Wang et al. Multifeature contrast enhancement algorithm for digital media images based on the diffusion equation
CN112419164B (en) Curvature and neighborhood reconstruction-based weighted guide point cloud model denoising method
Wang et al. Region-based adaptive anisotropic diffusion for image enhancement and denoising
Keren et al. Denoising color images using regularization and “correlation terms”
CN108109115B (en) Method, device and equipment for enhancing character image and storage medium
Li Roof-edge preserving image smoothing based on MRFs
Jia et al. Weighted guided image filtering with entropy evaluation weighting
Prasath et al. Image restoration with fuzzy coefficient driven anisotropic diffusion
CN110246224B (en) Surface denoising method and system of grid model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Su Zhiyong

Inventor after: Mei Jialin

Inventor after: Yao Liang

Inventor after: Li Qi

Inventor after: Wang Han

Inventor after: Li Huifang

Inventor before: Mei Jialin

Inventor before: Yao Liang

Inventor before: Li Qi

Inventor before: Wang Han

Inventor before: Li Huifang

Inventor before: Su Zhiyong

GR01 Patent grant