CN106652048B - Three-dimensional model interest point extraction method based on 3D-SUSAN operator - Google Patents


Info

Publication number
CN106652048B
CN106652048B CN201611260181.7A CN201611260181A
Authority
CN
China
Prior art keywords
vertex
curvature
value
interest
representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611260181.7A
Other languages
Chinese (zh)
Other versions
CN106652048A (en)
Inventor
张桦
王彧
周文晖
阳宁凯
吴以凡
戴国骏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN201611260181.7A priority Critical patent/CN106652048B/en
Publication of CN106652048A publication Critical patent/CN106652048A/en
Application granted granted Critical
Publication of CN106652048B publication Critical patent/CN106652048B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a three-dimensional model interest point extraction method based on a 3D-SUSAN corner operator. The method first reads the three-dimensional model vertices, simplifies the model vertices and the model mesh, and calculates the curvature of each vertex. The curvature of each vertex is then smoothed by Gaussian filtering to remove noise. A center point is determined and its 36 surrounding neighbor vertices are taken; if the curvature difference between the center vertex and its neighbor vertices is less than the curvature similarity threshold, the method proceeds to the next step, and if the USAN value of the center vertex is less than the geometric threshold, the center vertex is an interest point. Finally, the interest points extracted by the 3D-SUSAN corner operator are evaluated in terms of the miss rate, the redundancy rate and the error rate. The method not only extracts three-dimensional model interest points but also retains the SUSAN operator's advantages of strong noise resistance, low computational cost, high efficiency and stable performance.

Description

Three-dimensional model interest point extraction method based on 3D-SUSAN operator
Technical Field
The invention relates to extraction of interest points of a three-dimensional model, in particular to a three-dimensional model interest point extraction method based on a 3D-SUSAN corner operator.
Background
In recent years, 3D applications have attracted much attention. In the development of three-dimensional models, interest point extraction is essential to many 3D application technologies, such as mesh simplification, mesh segmentation, viewpoint selection, and three-dimensional model matching and retrieval. Matching three-dimensional models by interest points requires local feature points of the model, and the same approach also applies to three-dimensional image recognition and matching.
In the field of interest point extraction, tremendous advances have been made both in theory and in real 3D applications. These advances are reflected in model analysis, model transmission, and model rendering. Interest point extraction techniques typically use mathematical tools to extract features, such as curvature from differential geometry, where a larger curvature indicates a more salient point.
Three-dimensional model interest point extraction still faces several challenges and difficulties. First, there is so far no precise definition of an interest point; subjective studies suggest that interest points are points that differ significantly from other points. Second, because the topology of a 3D mesh is arbitrary, a model vertex may have an arbitrary number of neighbor vertices. Finally, a vertex carries no information other than its position, so other feature information of the vertex has to be computed.
Two-dimensional corner detection operators can be grouped into three types: corner detection based on gray-level images, corner detection based on binary images, and corner detection based on contour curves. Common corner operators include the Moravec, Harris and SUSAN corner detection operators. The SUSAN corner detection operator works on the local area associated with each image point: if the pixel values within a window region are the same as or similar to the pixel value at the window center, that region is called the USAN region. If the USAN region is smaller than a threshold, the center point of the region is regarded as a corner point.
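By way of illustration only (not part of the patent text), the following is a minimal Python sketch of the classic 2D SUSAN corner test just described. The window radius, the brightness similarity threshold t and the choice of half the maximum USAN area as the geometric threshold g are illustrative assumptions, not values prescribed by the original operator.

    import numpy as np

    def susan_corners_2d(img, radius=3, t=27, g=None):
        """Toy 2D SUSAN corner detector: for each pixel, count the pixels in a
        circular window whose intensity is similar to the center (the USAN area),
        and flag the pixel as a corner when that area falls below the geometric
        threshold g (here half of the maximum USAN area)."""
        h, w = img.shape
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        mask = (xs ** 2 + ys ** 2) <= radius ** 2            # circular window
        if g is None:
            g = mask.sum() / 2.0                             # geometric threshold
        corners = np.zeros((h, w), dtype=bool)
        for y in range(radius, h - radius):
            for x in range(radius, w - radius):
                window = img[y - radius:y + radius + 1, x - radius:x + radius + 1]
                usan = np.sum(np.abs(window[mask].astype(float) - float(img[y, x])) < t)
                corners[y, x] = usan < g                     # small USAN area => corner
        return corners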
References
[1] Helin Dutagaci, Chun Pan Cheung, Afzal Godil, "Evaluation of 3D interest point detection techniques via human-generated ground truth", Springer-Verlag, 2012.
[2] C. H. Lee, A. Varshney, and D. W. Jacobs, "Mesh Saliency," ACM SIGGRAPH, vol. 174, pp. 659-666, 2005.
[3] A Benchmark for 3D Interest Points Marked by Human Subjects, [Online]. Available: http://www.itl.nist.gov/iad/vug/sharp/benchmark/3DInterestPoint.
[4] Castellani, U., Cristani, M., Fantoni, S., Murino, V.: Sparse points matching by combining 3D mesh saliency with statistical descriptors. Comput. Graph. Forum 27(2), 643–652 (2008).
Disclosure of Invention
The invention aims to provide a three-dimensional model interest point extraction method based on a 3D-SUSAN corner operator, in view of the importance of three-dimensional model interest points for mesh simplification, mesh segmentation, viewpoint selection and 3D model matching retrieval. The method inherits the SUSAN operator's characteristics of strong noise resistance, low computational cost, high efficiency and stable performance.
In order to achieve this purpose, the invention is realized by the following technical scheme, which specifically comprises the following steps:
Step (1): Read the vertex position coordinates of the three-dimensional model to be processed using a mesh reading algorithm, and simplify the vertices of the model using a mesh simplification operator.
Step (2): Calculate the curvature of each vertex using a curvature operator.
Step (3): Apply Gaussian filtering to the curvature values to remove noise and obtain denoised curvature values.
Step (4): Define a vertex cluster: according to the distances between neighbor vertices and the current vertex, select the 36 nearest neighbor vertices as the vertex cluster whose curvature values are compared with that of the current vertex.
Step (5): Using the vertex cluster defined in step (4), if the curvature difference between the current vertex and the other neighbor vertices in the cluster is smaller than the defined similarity threshold, take the current vertex as a candidate point. Then compare the USAN value of the candidate point with the geometric threshold; if the USAN value is smaller than the geometric threshold, return the candidate point as an interest point.
Step (6): Define the evaluation operators FPE, FNE and WME [1], and evaluate the interest points obtained in step (5) in terms of the algorithm's miss rate, redundancy rate and error rate.
The curvature of each vertex in step (2) is calculated with a curvature operator; the curvature formula is derived as follows:
Let the three-dimensional curve be given parametrically by x = x(t), y = y(t), z = z(t);
1) differentiate once to obtain x′(t), y′(t) and z′(t);
2) differentiate a second time to obtain x″(t), y″(t) and z″(t);
3) collect the three first derivatives into a three-dimensional vector: r′ = (x′(t), y′(t), z′(t));
4) collect the three second derivatives into a three-dimensional vector: r″ = (x″(t), y″(t), z″(t));
5) the curvature is then
k = |r′ × r″| / |r′|³
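As a worked illustration of this formula (and not of the patent's mesh curvature operator), the following Python sketch evaluates k = |r′ × r″| / |r′|³ for a sampled parametric curve; the finite-difference derivatives and the helix test curve are assumptions made purely for demonstration.

    import numpy as np

    def curve_curvature(points, t):
        """Curvature k(t) = |r' x r''| / |r'|^3 of a sampled 3D curve r(t),
        with derivatives estimated by finite differences along the samples."""
        r1 = np.gradient(points, t, axis=0)      # first derivative r'(t)
        r2 = np.gradient(r1, t, axis=0)          # second derivative r''(t)
        cross = np.cross(r1, r2)
        return np.linalg.norm(cross, axis=1) / np.linalg.norm(r1, axis=1) ** 3

    # Example: the helix x = cos t, y = sin t, z = t has constant curvature 1/2.
    t = np.linspace(0, 4 * np.pi, 400)
    helix = np.column_stack([np.cos(t), np.sin(t), t])
    print(curve_curvature(helix, t)[200])        # approximately 0.5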
In step (3), the vertex curvatures from step (2) are processed with a Gaussian function to obtain a Gaussian-weighted average of the mean curvature within a σ region. The Gaussian weighting formula is:
G(C(v), σ) = [ Σ_x C(x)·exp(−‖x − v‖²/(2σ²)) ] / [ Σ_x exp(−‖x − v‖²/(2σ²)) ]
where G is the Gaussian-weighted average of the mean curvature within the σ region, C(x) is the curvature value of vertex x, and x ranges over the neighbor vertices of the currently computed vertex v within the σ region.
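A minimal sketch of this Gaussian smoothing step, assuming the vertex positions and per-vertex curvature values are available as NumPy arrays; taking all vertices within a 2σ ball of v as the neighborhood follows the mesh-saliency convention of reference [2] and is an assumption here, not a detail stated by the patent.

    import numpy as np

    def gaussian_smooth_curvature(vertices, curvatures, sigma):
        """Gaussian-weighted average of curvature over each vertex's sigma region:
        G(v) = sum_x C(x) exp(-|x-v|^2 / (2 sigma^2)) / sum_x exp(-|x-v|^2 / (2 sigma^2))."""
        smoothed = np.zeros(len(vertices), dtype=float)
        for i, v in enumerate(vertices):
            d2 = np.sum((vertices - v) ** 2, axis=1)
            nbr = d2 <= (2.0 * sigma) ** 2                # vertices inside the 2*sigma ball
            w = np.exp(-d2[nbr] / (2.0 * sigma ** 2))     # Gaussian weights
            smoothed[i] = np.dot(w, curvatures[nbr]) / np.sum(w)
        return smoothed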
Vertex cluster selection in step (4) means taking the currently computed vertex as the center and selecting its 36 nearest neighbor vertices as the vertices to be compared with the current vertex.
The distance between two vertices of the three-dimensional model is the Euclidean distance:
d = √( (x₁ − x₂)² + (y₁ − y₂)² + (z₁ − z₂)² )
where (x₁, y₁, z₁) and (x₂, y₂, z₂) are the coordinates of the two vertices.
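A sketch of the vertex-cluster construction described above: for each vertex, its 36 nearest neighbors by Euclidean distance are selected. The brute-force distance computation is an assumption made for clarity; on large meshes a spatial index such as a k-d tree would normally be used instead.

    import numpy as np

    def vertex_clusters(vertices, k=36):
        """For each vertex, return the indices of its k nearest neighbor vertices
        (excluding the vertex itself), measured by Euclidean distance."""
        clusters = []
        for v in vertices:
            d = np.linalg.norm(vertices - v, axis=1)   # distances to all vertices
            order = np.argsort(d)
            clusters.append(order[1:k + 1])            # skip index 0 (the vertex itself)
        return np.array(clusters)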
The specific implementation process of the step (5) is as follows:
5-1. First, to improve the adaptivity of the threshold, the curvature similarity threshold is computed dynamically for the vertex cluster to which each central vertex belongs. The average curvature difference within the cluster is
a = (1/N) Σ_v |c(v0) − c(v)|
where the central vertex v0 is the vertex currently being computed in the vertex cluster, a is the average of the curvature differences, c(v0) is the curvature of the central vertex, and c(v) is the curvature of the other neighbor vertices within the cluster. The curvature similarity threshold t is then computed from a, the curvature values c of the vertices in the cluster, and N, the number of cluster points, taken as 36 here.
If the absolute curvature difference between the current vertex and its neighbor vertices is smaller than the curvature similarity threshold, the current vertex is taken as a candidate point;
5-2. Compute the USAN value of each candidate point: for each neighbor vertex v in the cluster,
if |c(v0) − c(v)| < t, then d(v0) = d(v0) + 1
where t denotes the curvature similarity threshold, c(v0) the curvature of the central vertex, c(v) the curvature of the other neighbor vertices within the vertex cluster, and d(v0) the USAN value of the candidate point, initialized to 0;
Following the SUSAN operator applied to corner extraction in two-dimensional images, where the geometric threshold is typically taken as 1/2 of the maximum USAN area, the 3D-SUSAN operator likewise takes 1/2 of the maximum USAN value as the geometric threshold:
g(v0) = d_max / 2
where g(v0) denotes the geometric threshold of v0, d(v0) the USAN value of v0, and d_max the maximum USAN value over the cluster.
If the USAN value of the candidate point is smaller than the geometric threshold, the candidate point is regarded as a corner point, with corner response
R(v0) = g(v0) − d(v0) if d(v0) < g(v0), and R(v0) = 0 otherwise.
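Combining steps 5-1 and 5-2, a minimal sketch of the 3D-SUSAN test for a single central vertex. Because the exact dynamic-threshold formula appears only as an equation image in the original, the sketch simply uses the average curvature difference a as the similarity threshold t, which is an assumption; the geometric threshold follows the 1/2 rule stated in the text.

    import numpy as np

    def susan_corner_response(c0, cluster_curvatures):
        """3D-SUSAN test for a central vertex with curvature c0, given the curvatures
        of its 36-vertex cluster; returns (is_corner, response)."""
        diffs = np.abs(cluster_curvatures - c0)
        a = diffs.mean()                      # average curvature difference (step 5-1)
        t = a                                 # assumed curvature similarity threshold
        usan = np.sum(diffs < t)              # USAN value d(v0) (step 5-2)
        g = len(cluster_curvatures) / 2.0     # geometric threshold: half the cluster size
        is_corner = usan < g
        response = g - usan if is_corner else 0.0
        return is_corner, response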
5-3. Perform non-maximum suppression on the obtained corner points to finally obtain the interest points.
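A sketch of the non-maximum suppression in step 5-3, keeping a corner only when its response is not exceeded within its own vertex cluster; reusing the 36-vertex clusters as the suppression neighborhood is an assumption, since the patent does not spell out the neighborhood here.

    import numpy as np

    def non_maximum_suppression(responses, clusters):
        """Keep vertex i as an interest point only if its corner response is positive
        and not exceeded by any response within its vertex cluster."""
        keep = []
        for i, nbrs in enumerate(clusters):
            if responses[i] > 0 and responses[i] >= np.max(responses[nbrs]):
                keep.append(i)
        return keep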
The evaluation algorithm in step (6) consists of two steps: first, establish the ground-truth (real-scene) interest points of the model; second, use the established real scene to evaluate the interest point extraction algorithm.
(1) Establish the evaluation standard for interest point identification of the model's real scene. The standard is taken from the website "A Benchmark for 3D Interest Points Marked by Human Subjects" [3], which includes the three-dimensional models and a manually marked interest point coordinate set for each model.
(2) Use the established real scene to evaluate the interest points extracted by the algorithm, using FPE (false positive errors), FNE (false negative errors) and WME (weighted miss errors).
G_r = { p ∈ M | d(g, p) ≤ r }
where G_r denotes the interest point set of the real scene of the model, M denotes the interest points of the model extracted by the algorithm, d denotes the distance between two vertices, r denotes the radius of the region, g denotes an interest point of the three-dimensional model, and p denotes the neighbor vertices around interest point g;
FNE(r) = 1 − N_C / N_G
where N_C is the number of correctly extracted points and N_G is the total number of interest points in the real scene.
FPE(r) = 1 − N_C / N_A
where N_A is the number of interest points extracted by the algorithm and N_C is the number of correctly extracted points.
WME(r) = 1 − ( Σ_{i=1..N_G} n_i·δ_i ) / ( Σ_{i=1..N_G} n_i )
δ_i = 1 if ground-truth point g_i is detected within radius r, and δ_i = 0 otherwise
where n_i indicates that n_i volunteers identified point g_i, and δ_i is an intermediate indicator variable. The 3D-SUSAN evaluation results on the real scene at radius r are shown in FIG. 4, the 3D-SUSAN algorithm evaluation result chart.
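A sketch of the FNE/FPE/WME computation defined above, assuming the ground-truth interest points, their volunteer counts n_i and the algorithm's detected points are given as NumPy arrays; the matching rule (a ground-truth point counts as correctly extracted when some detected point lies within radius r of it) follows the definitions above.

    import numpy as np

    def evaluate_interest_points(gt_points, n_votes, detected, r):
        """Compute FNE, FPE and WME for detected interest points against a
        human-marked ground truth, using detection radius r."""
        if len(detected) == 0:
            return 1.0, 0.0, 1.0
        # delta_i = 1 if ground-truth point g_i has a detected point within radius r
        dists = np.linalg.norm(gt_points[:, None, :] - detected[None, :, :], axis=2)
        delta = (dists.min(axis=1) <= r).astype(float)
        n_g = len(gt_points)                   # total ground-truth interest points (N_G)
        n_a = len(detected)                    # points extracted by the algorithm (N_A)
        n_c = int(delta.sum())                 # correctly extracted points (N_C)
        fne = 1.0 - n_c / n_g                  # false negative (miss) error
        fpe = 1.0 - n_c / n_a                  # false positive (redundancy) error
        wme = 1.0 - np.dot(n_votes, delta) / np.sum(n_votes)   # weighted miss error
        return fne, fpe, wme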
The invention has the following beneficial effects:
the invention provides a three-dimensional model interest point extraction method based on a 3D-SUSAN corner point operator, which introduces an SUSAN corner point detection operator for extracting two-dimensional image feature points into a three-dimensional field, provides the 3D-SUSAN corner point detection operator, and effectively extracts interest points of a three-dimensional model.
The 3D-SUSAN corner detection operator can extract the interest points of the three-dimensional model and still retains the advantages of strong anti-noise capability, small calculated amount, high efficiency and stable performance of SUSAN.
Drawings
FIG. 1 is a flow chart of the three-dimensional model interest point extraction based on the 3D-SUSAN algorithm.
FIG. 2 is a vertex cluster diagram.
Fig. 3 is a flow chart of non-maximum suppression.
Fig. 4 is a diagram of the evaluation result of the 3D-SUSAN algorithm.
FIG. 5 shows the interest point identification results.
Detailed Description
The invention will be further explained with reference to the drawings.
As shown in FIG. 1, the three-dimensional model interest point extraction method based on the 3D-SUSAN corner operator specifically includes the following steps:
Step (1): Read the vertex position coordinates of the three-dimensional model to be processed using a mesh reading algorithm, and simplify the vertices of the model using a mesh simplification operator.
Step (2): Calculate the curvature of each vertex using a curvature operator.
Step (3): Apply Gaussian filtering to the curvature values to remove noise and obtain denoised curvature values.
Step (4): Define a vertex cluster: according to the distances between neighbor vertices and the current vertex, select the 36 nearest neighbor vertices as the vertex cluster whose curvature values are compared with that of the current vertex; FIG. 2 shows the vertex cluster diagram.
Step (5): Using the vertex cluster defined in step (4), if the curvature difference between the current vertex and the other neighbor vertices in the cluster is smaller than the defined similarity threshold, take the current vertex as a candidate point. Then compare the USAN value of the candidate point with the geometric threshold; if the USAN value is smaller than the geometric threshold, return the candidate point as an interest point.
Step (6): Define the evaluation operators FPE, FNE and WME [1], and evaluate the interest points obtained in step (5) in terms of the algorithm's miss rate, redundancy rate and error rate.
The curvature of each vertex in step (2) is calculated with a curvature operator; the curvature formula is derived as follows:
Let the three-dimensional curve be given parametrically by x = x(t), y = y(t), z = z(t);
1) differentiate once to obtain x′(t), y′(t) and z′(t);
2) differentiate a second time to obtain x″(t), y″(t) and z″(t);
3) collect the three first derivatives into a three-dimensional vector: r′ = (x′(t), y′(t), z′(t));
4) collect the three second derivatives into a three-dimensional vector: r″ = (x″(t), y″(t), z″(t));
5) the curvature is then
k = |r′ × r″| / |r′|³
In step (3), the vertex curvatures from step (2) are processed with a Gaussian function to obtain a Gaussian-weighted average of the mean curvature within a σ region. The Gaussian weighting formula is:
G(C(v), σ) = [ Σ_x C(x)·exp(−‖x − v‖²/(2σ²)) ] / [ Σ_x exp(−‖x − v‖²/(2σ²)) ]
where G is the Gaussian-weighted average of the mean curvature within the σ region, C(x) is the curvature value of vertex x, and x ranges over the neighbor vertices of the currently computed vertex v within the σ region.
Vertex cluster selection in step (4) means taking the currently computed vertex as the center and selecting its 36 nearest neighbor vertices as the vertices to be compared with the current vertex.
The distance between two vertices of the three-dimensional model is the Euclidean distance:
d = √( (x₁ − x₂)² + (y₁ − y₂)² + (z₁ − z₂)² )
where (x₁, y₁, z₁) and (x₂, y₂, z₂) are the coordinates of the two vertices.
The specific implementation process of the step (5) is as follows:
5-1. First, to improve the adaptivity of the threshold, the curvature similarity threshold is computed dynamically for the vertex cluster to which each central vertex belongs. The average curvature difference within the cluster is
a = (1/N) Σ_v |c(v0) − c(v)|
where the central vertex v0 is the vertex currently being computed in the vertex cluster, a is the average of the curvature differences, c(v0) is the curvature of the central vertex, and c(v) is the curvature of the other neighbor vertices within the cluster. The curvature similarity threshold t is then computed from a, the curvature values c of the vertices in the cluster, and N, the number of cluster points, taken as 36 here.
If the absolute curvature difference between the current vertex and its neighbor vertices is smaller than the curvature similarity threshold, the current vertex is taken as a candidate point;
5-2. Compute the USAN value of each candidate point: for each neighbor vertex v in the cluster,
if |c(v0) − c(v)| < t, then d(v0) = d(v0) + 1
where t denotes the curvature similarity threshold, c(v0) the curvature of the central vertex, c(v) the curvature of the other neighbor vertices within the vertex cluster, and d(v0) the USAN value of the candidate point, initialized to 0.
Following the SUSAN operator applied to corner extraction in two-dimensional images, where the geometric threshold is typically taken as 1/2 of the maximum USAN area, the 3D-SUSAN operator likewise takes 1/2 of the maximum USAN value as the geometric threshold:
g(v0) = d_max / 2
where g(v0) denotes the geometric threshold of v0, d(v0) the USAN value of v0, and d_max the maximum USAN value over the cluster.
If the USAN value of the candidate point is smaller than the geometric threshold, the candidate point is regarded as a corner point, with corner response
R(v0) = g(v0) − d(v0) if d(v0) < g(v0), and R(v0) = 0 otherwise.
5-3. Perform non-maximum suppression on the obtained corner points to finally obtain the interest points; FIG. 3 shows the non-maximum suppression flow chart.
The evaluation algorithm in step (6) consists of two steps: first, establish the ground-truth (real-scene) interest points of the model; second, use the established real scene to evaluate the interest point extraction algorithm.
(1) Establish the evaluation standard for interest point identification of the model's real scene. The standard is taken from the website "A Benchmark for 3D Interest Points Marked by Human Subjects" [3], which includes the three-dimensional models and a manually marked interest point coordinate set for each model.
(2) Use the established real scene to evaluate the interest points extracted by the algorithm, using FPE (false positive errors), FNE (false negative errors) and WME (weighted miss errors).
G_r = { p ∈ M | d(g, p) ≤ r }
where G_r denotes the interest point set of the real scene of the model, M denotes the interest points of the model extracted by the algorithm, d denotes the distance between two vertices, and r denotes the radius of the region.
FNE(r) = 1 − N_C / N_G
where N_C is the number of correctly extracted points and N_G is the total number of interest points in the real scene.
FPE(r) = 1 − N_C / N_A
where N_A is the number of interest points extracted by the algorithm and N_C is the number of correctly extracted points.
WME(r) = 1 − ( Σ_{i=1..N_G} n_i·δ_i ) / ( Σ_{i=1..N_G} n_i )
δ_i = 1 if ground-truth point g_i is detected within radius r, and δ_i = 0 otherwise
where n_i indicates that n_i volunteers identified point g_i, and δ_i is an intermediate indicator variable. The 3D-SUSAN evaluation results on the real scene at radius r are shown in FIG. 4, the 3D-SUSAN algorithm evaluation result chart.
FNE denotes the miss rate, FPE denotes the redundancy rate, and WME denotes the error rate; the lower these rates, the better the algorithm extracts the interest points.
For the interest points of the three-dimensional model extracted by the algorithm, the miss rate, redundancy rate and error rate are evaluated, and the results are compared with the interest points extracted by Mesh saliency [2]. As can be seen from FIG. 5, 3D-SUSAN extracts significantly fewer interest points than Mesh saliency, and its identification results are almost the same as the interest points identified by human subjects. The method therefore improves on reducing the number of three-dimensional model interest points extracted by the algorithm.

Claims (1)

1. The three-dimensional model interest point extraction method based on the 3D-SUSAN operator is characterized by comprising the following steps of:
step (1): calculating the vertex position coordinates of the three-dimensional model to be extracted by using a grid reading algorithm, and simplifying the vertices of the three-dimensional model to be extracted by using a grid simplifying operator;
step (2): calculating a curvature value of each vertex by using a curvature operator;
step (3): applying Gaussian filtering to the curvature values to remove noise and obtain denoised curvature values;
step (4): defining a vertex cluster: according to the distances between neighbor vertices and the current vertex, selecting the 36 nearest neighbor vertices as the vertex cluster whose curvature values are compared with that of the current vertex;
step (5): according to the vertex cluster defined in step (4), if the difference between the curvature values of the current vertex and of the other neighbor vertices in the vertex cluster is smaller than the defined similarity threshold, taking the current vertex as a candidate point; then comparing the USAN value of the candidate point with the geometric threshold, and if the USAN value is smaller than the geometric threshold, returning the candidate point as an interest point;
step (6): defining evaluation operators FPE, FNE and WME, and evaluating the interest points obtained in step (5) in terms of the algorithm's miss rate, redundancy rate and error rate;
calculating the curvature value of each vertex by using a curvature operator in the step (2), wherein the curvature value calculation formula is as follows:
let the three-dimensional curve be given parametrically by x = x(t), y = y(t), z = z(t);
1) differentiating once to obtain x′(t), y′(t) and z′(t);
2) differentiating a second time to obtain x″(t), y″(t) and z″(t);
3) collecting the three first derivatives into a three-dimensional vector: r′ = (x′(t), y′(t), z′(t));
4) collecting the three second derivatives into a three-dimensional vector: r″ = (x″(t), y″(t), z″(t));
5) the curvature value is then
k = |r′ × r″| / |r′|³
in step (3), the vertex curvature values from step (2) are processed with a Gaussian function to obtain a Gaussian-weighted average of the mean curvature values within a σ region, wherein the Gaussian weighting formula is:
G(C(v), σ) = [ Σ_x C(x)·exp(−‖x − v‖²/(2σ²)) ] / [ Σ_x exp(−‖x − v‖²/(2σ²)) ]
where G is the Gaussian-weighted average of the mean curvature values within the σ region, C(x) is the curvature value of vertex x, and x ranges over the neighbor vertices of the currently computed vertex v within the σ region;
defining vertex clusters in the step (4) refers to selecting the first 36 neighbor vertexes with the shortest distance as vertexes compared with the current vertex by taking the current calculation vertex as a center;
the three-dimensional model vertex distance calculation formula is as follows:
d = √( (x₁ − x₂)² + (y₁ − y₂)² + (z₁ − z₂)² )
where (x₁, y₁, z₁) and (x₂, y₂, z₂) are the coordinates of the two vertices;
the specific implementation process of the step (5) is as follows:
5-1. first, to improve the adaptivity of the threshold, the curvature value similarity threshold is computed dynamically for the vertex cluster to which each central vertex belongs; the average curvature value difference within the cluster is
a = (1/N) Σ_v |c(v0) − c(v)|
where the central vertex v0 is the vertex currently being computed in the vertex cluster, a is the average of the curvature value differences, c(v0) is the curvature value of the central vertex, and c(v) is the curvature value of the other neighbor vertices within the cluster; the curvature value similarity threshold t is then computed from a, the curvature values c of the vertices in the cluster, and N, the number of cluster points, taken as 36 here;
if the absolute difference between the curvature values of the current vertex and of a neighbor vertex is smaller than the curvature value similarity threshold, the current vertex is taken as a candidate point;
5-2. calculating the USAN value of each candidate point: for each neighbor vertex v in the cluster,
if |c(v0) − c(v)| < t, then d(v0) = d(v0) + 1
where t denotes the curvature similarity threshold, c(v0) the curvature of the central vertex, c(v) the curvature of the other neighbor vertices within the vertex cluster, and d(v0) the USAN value of the candidate point, initialized to 0;
following the SUSAN operator applied to corner extraction in two-dimensional images, where the geometric threshold equals 1/2 of the maximum USAN area, the 3D-SUSAN operator likewise selects 1/2 of the maximum USAN value as the geometric threshold:
g(v0) = d_max / 2
where g(v0) denotes the geometric threshold of v0, d(v0) the USAN value of v0, and d_max the maximum USAN value over the cluster;
if the USAN value of the candidate point is smaller than the geometric threshold, the candidate point is regarded as a corner point, with corner response
R(v0) = g(v0) − d(v0) if d(v0) < g(v0), and R(v0) = 0 otherwise;
5-3, performing non-maximum suppression on the obtained corner points to finally obtain interest points;
evaluating the interest points obtained in the step (5) in the step (6), which comprises the following specific steps: firstly, establishing a real scene of a model, and secondly, evaluating an interest point extraction method by using the established real scene;
(1) establishing an evaluation standard of interest point identification of a real scene of the model, wherein the reference data comprises three-dimensional models and an artificially extracted interest point coordinate set of each model;
(2) evaluating the interest points obtained in step 5-3 against the established real scene, using FPE, FNE and WME;
G_r = { p1 ∈ M | d1(g1, p1) ≤ r }
where G_r denotes the interest point set of the real scene of the model, M denotes the extracted interest point set, d1 denotes the distance between vertex g1 and vertex p1, and r denotes the radius of the σ region;
FNE(r) = 1 − N_C / N_G
where N_C denotes the number of correctly extracted points and N_G denotes the total number of interest points in the real scene;
FPE(r) = 1 − N_C / N_A
where N_A denotes the number of extracted interest points and N_C denotes the number of correctly extracted points;
WME(r) = 1 − ( Σ_{i=1..N_G} n_i·δ_i ) / ( Σ_{i=1..N_G} n_i )
δ_i = 1 if ground-truth point g_i is detected within radius r, and δ_i = 0 otherwise
where n_i indicates that n_i volunteers identified point g_i, and the evaluation is performed on the real scene at radius r.
CN201611260181.7A 2016-12-30 2016-12-30 Three-dimensional model interest point extraction method based on 3D-SUSAN operator Active CN106652048B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611260181.7A CN106652048B (en) 2016-12-30 2016-12-30 Three-dimensional model interest point extraction method based on 3D-SUSAN operator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611260181.7A CN106652048B (en) 2016-12-30 2016-12-30 Three-dimensional model interest point extraction method based on 3D-SUSAN operator

Publications (2)

Publication Number Publication Date
CN106652048A CN106652048A (en) 2017-05-10
CN106652048B true CN106652048B (en) 2020-07-14

Family

ID=58838837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611260181.7A Active CN106652048B (en) 2016-12-30 2016-12-30 Three-dimensional model interest point extraction method based on 3D-SUSAN operator

Country Status (1)

Country Link
CN (1) CN106652048B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113888456B (en) * 2020-07-01 2024-05-24 长春工业大学 Corner detection method based on contour

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104992168A (en) * 2015-07-28 2015-10-21 中国科学院自动化研究所 Human behavior recognition method based on kernel of graph
US9171011B1 (en) * 2010-12-23 2015-10-27 Google Inc. Building search by contents
CN105243661A (en) * 2015-09-21 2016-01-13 成都融创智谷科技有限公司 Corner detection method based on SUSAN operator

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9171011B1 (en) * 2010-12-23 2015-10-27 Google Inc. Building search by contents
CN104992168A (en) * 2015-07-28 2015-10-21 中国科学院自动化研究所 Human behavior recognition method based on kernel of graph
CN105243661A (en) * 2015-09-21 2016-01-13 成都融创智谷科技有限公司 Corner detection method based on SUSAN operator

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Three-dimensional model feature extraction algorithm based on point-pair distribution; Liu Xiaoyu et al.; Journal of Computer Applications (《计算机应用》); 2006-01-01; vol. 26, no. 1; full text *

Also Published As

Publication number Publication date
CN106652048A (en) 2017-05-10

Similar Documents

Publication Publication Date Title
CN109544677B (en) Indoor scene main structure reconstruction method and system based on depth image key frame
CN109903327B (en) Target size measurement method of sparse point cloud
EP3695384B1 (en) Point cloud meshing method, apparatus, device and computer storage media
CN110222787B (en) Multi-scale target detection method and device, computer equipment and storage medium
CN109961506B (en) Local scene three-dimensional reconstruction method for fusion improved Census diagram
CN109685060B (en) Image processing method and device
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
CN107481274B (en) Robust reconstruction method of three-dimensional crop point cloud
CN109711416B (en) Target identification method and device, computer equipment and storage medium
Jellal et al. LS-ELAS: Line segment based efficient large scale stereo matching
CN108550166B (en) Spatial target image matching method
CN110751680A (en) Image processing method with fast alignment algorithm
CN108062523B (en) Infrared far-small target detection method
CN114359437A (en) Building structure two-dimensional plane map reconstruction method based on point cloud
CN111507921A (en) Tunnel point cloud denoising method based on low-rank recovery
CN116385281A (en) Remote sensing image denoising method based on real noise model and generated countermeasure network
CN108447038B (en) Grid denoising method based on non-local total variation operator
CN106652048B (en) Three-dimensional model interest point extraction method based on 3D-SUSAN operator
CN112884884A (en) Candidate region generation method and system
CN117132737A (en) Three-dimensional building model construction method, system and equipment
CN113111741A (en) Assembly state identification method based on three-dimensional feature points
Xiao et al. A topological approach for segmenting human body shape
CN116310832A (en) Remote sensing image processing method, device, equipment, medium and product
CN114723973A (en) Image feature matching method and device for large-scale change robustness
CN112884817B (en) Dense optical flow calculation method, dense optical flow calculation device, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant