CN111179321B - Point cloud registration method based on template matching - Google Patents


Info

Publication number
CN111179321B
Authority
CN
China
Prior art keywords
point cloud
template
registration
scene
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911397487.0A
Other languages
Chinese (zh)
Other versions
CN111179321A (en)
Inventor
张腾飞
严律
王明松
王杰高
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Estun Robotics Co Ltd
Original Assignee
Nanjing Estun Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Estun Robotics Co Ltd
Priority to CN201911397487.0A
Publication of CN111179321A
Application granted
Publication of CN111179321B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20112 Image segmentation details
    • G06T 2207/20132 Image cropping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a point cloud registration method based on template matching. The transformation between a point cloud template and the target point cloud is computed by a two-dimensional template matching method, the template point cloud data are transformed accordingly, and the transformed result serves both as the coarse registration result and as the initial value for fine registration; ICP or NDT fine registration is then performed, finally registering the source point cloud to the target point cloud. The method shortens the overall point cloud processing time, improves production efficiency, and achieves fast, high-precision point cloud registration. Using template matching to transform the template point cloud and provide the fine-registration initial value overcomes the poor generality and long processing time of conventional coarse registration.

Description

Point cloud registration method based on template matching
Technical Field
The invention relates to a high-precision rigid point cloud registration method, in particular to a point cloud registration method based on template matching.
Background
With continued advances in computer vision and sensor technology, methods that generate object point clouds by laser scanning have developed and matured rapidly. Point clouds can currently be acquired with depth cameras, binocular cameras, 3D laser scanning cameras, and other devices. Processing techniques based on object point clouds are widely used in human-computer interaction, virtual reality, reverse engineering, machine vision, and other fields.
In the field of machine vision, point cloud registration is an important technique for estimating the relative pose of objects in a scene. Estimating object pose by registration means determining the transformation (an R, T matrix) from a source point cloud to a target point cloud in the scene: taking the template point cloud as the source and the object point cloud in the scene as the target, registration yields the pose of the object point cloud relative to the template, which is how object pose is estimated in industrial scenes. The usual procedure is coarse registration followed by fine registration. Common coarse registration methods include Hough-based algorithms and initial alignment based on sample consensus (SAC-IA). Coarse registration first extracts local geometric information of the point cloud data (such as curvature, normal vectors, and neighboring-point density) as descriptive features, searches for correspondences between the source and target point sets through these features, and computes the transformation with an iterative algorithm. Common fine registration methods are the iterative closest point algorithm (ICP) and the normal distributions transform (NDT); fine registration aligns the point sets based on their contour features. Typically, SAC-IA coarse registration first brings the source and target point clouds close enough that matched points lie within a distance threshold and can be taken as corresponding pairs. Starting from these initial correspondences, obtained by the nearest-point principle, the distance from the source point cloud to each point of the target point cloud is computed; incorrect pairs are rejected with a direction-vector threshold, an objective function is formed, and the transformation matrix is computed repeatedly until the convergence condition is met. Coarse registration provides the initial values for fine registration, and its quality directly affects the efficiency and accuracy of the fine registration algorithm.
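To make the nearest-point / reject / re-estimate loop described above concrete, the following is a minimal sketch of a point-to-point ICP iteration in NumPy/SciPy. It is illustrative only: the distance threshold, tolerance, iteration count, and helper names are assumptions, not values taken from this patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points (SVD/Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, max_iter=50, dist_thresh=5.0, tol=1e-6):
    """Point-to-point ICP; returns a 4x4 transform aligning source (N,3) to target (M,3)."""
    tree = cKDTree(target)
    src = source.copy()
    T_total = np.eye(4)
    prev_err = np.inf
    for _ in range(max_iter):
        dists, idx = tree.query(src)                 # nearest-point correspondences
        keep = dists < dist_thresh                   # reject far pairs as wrong correspondences
        R, t = best_rigid_transform(src[keep], target[idx[keep]])
        src = src @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T_total = step @ T_total
        err = dists[keep].mean()
        if abs(prev_err - err) < tol:                # convergence condition
            break
        prev_err = err
    return T_total
```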
Existing point cloud coarse registration methods have the following problems: 1. when the point cloud contains a large amount of data, coarse registration is time-consuming; 2. for some object point clouds, features are difficult to select, and a single type of feature cannot describe the whole point cloud well; 3. coarse registration is strongly affected by how the point cloud is acquired on site, so the registration result is unstable and robustness is poor.
Disclosure of Invention
The invention aims to overcome the above shortcomings of the prior art and provides a point cloud registration method based on template matching that achieves high-precision, high-efficiency registration of point cloud data in a scene.
The basic technical idea of the method is as follows: instead of a conventional coarse registration method, the transformation between the point cloud template and the target point cloud is computed by template matching, the template point cloud data are transformed according to this relationship in place of coarse registration, the transformed result is used as the initial value for fine registration, ICP or NDT fine registration is performed, and the source point cloud is finally registered to the target point cloud.
The invention discloses a point cloud registration method based on template matching, which comprises the following steps:
Step 1: using a binocular camera or a 3D laser scanning camera, place a single workpiece in the camera's field of view and scan it to obtain the workpiece template point cloud; place multiple workpieces in the field of view to obtain the scene object point cloud data.
Step 2: create the workpiece point cloud template matching picture; map the workpiece point cloud acquired in step 1 into a two-dimensional depth map, set the template-related parameters, and save the two-dimensional template point cloud depth map.
Step 3: first crop the point cloud acquired on site, keeping the point cloud of the plane in which the workpieces lie; then rasterize it to reduce the amount of point cloud data; then filter it to remove noise points; remove the plane using a Euclidean segmentation method; and finally segment out the final workpiece point clouds using a region growing method. The final workpiece point clouds are the target point clouds to be registered.
Step 4: map the point cloud data preprocessed in step 3 to depth maps.
Step 5: perform template matching; run edge feature recognition on the template point cloud depth map saved in step 2 and on the point cloud depth maps mapped in step 4, and compute the corresponding center point and rotation angle by the template matching method.
Step 6: according to the template matching result of step 5, convert the center point and rotation angle into matrix form and translate and rotate the template point cloud.
Step 7: transform the template point cloud of step 6 to the final workpiece point cloud of step 3 so that the center points coincide and the rotation angles are equal.
Step 8: register the template point cloud and the scene point cloud with the ICP or NDT fine registration method to complete the registration of the source point cloud to the target point cloud.
The ICP or NDT fine registration method is as follows. The registration of the point clouds is written as formula (1). The template point cloud is first converted into the world coordinate system, formula (2). According to the two-dimensional template matching information, the template point cloud P is then rotated and translated into a plane parallel to the scene point cloud Q, formula (3). The template point cloud P processed in the previous two steps is transformed onto the actual scene point cloud Q and recorded as P'(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}); this point cloud data is taken as the initial value of the point cloud fine registration, formula (4). The coarsely registered point cloud P' is finally registered finely to the scene point cloud Q with the ICP or NDT registration method, formula (5). Summarizing formulas (1)-(5) gives the calculation formula of the point cloud registration algorithm:

Q(x_{1…n}, y_{1…n}, z_{1…n}) = T(R, T) × P(x_{1…n}, y_{1…n}, z_{1…n})    (1)

P_{m-w}(x_{1…n}, y_{1…n}, z_{1…n}) = T_{m-w}(R_{m-w}, T_{m-w}) × P(x_{1…n}, y_{1…n}, z_{1…n})    (2)

P_{m-w-s}(x_{1…n}, y_{1…n}, z_{1…n}) = T_{w:m-s}(R_{m-s}, T_{m-s}) × P_{m-w}(x_{1…n}, y_{1…n}, z_{1…n})    (3)

P_{m-w-s-s1'…sn'}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}) = T_{m-w-s-s1'…sn'}(R_{w-s}, T_{w-s}) × P_{m-w-s}(x_{1…n}, y_{1…n}, z_{1…n})    (4)

P_{m-w-s-s1'-sn}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}) = T_{s1'-sn}(R_{s-s'}, T_{s-s'}) × P_{m-w-s-s1'…sn'}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'})    (5)

Q(x_{1…n}, y_{1…n}, z_{1…n}) = T(R, T) × P(x_{1…n}, y_{1…n}, z_{1…n})
Q(x_{1…n}, y_{1…n}, z_{1…n}) = T_{s1'-sn}(R_{s-s'}, T_{s-s'}) × T_{m-w-s-s1'…sn'}(R_{w-s}, T_{w-s}) × T_{w:m-s}(R_{m-s}, T_{m-s}) × T_{m-w}(R_{m-w}, T_{m-w}) × P(x_{1…n}, y_{1…n}, z_{1…n})

In the above formulas, O_w X_w Y_w Z_w is the world coordinate system of the scene, O_m X_m Y_m Z_m is the template point cloud coordinate system, O_{s1'} X_{s1'} Y_{s1'} Z_{s1'} is the coarse registration coordinate system of the point cloud of object segmentation block 1, O_{sn'} X_{sn'} Y_{sn'} Z_{sn'} is the coarse registration coordinate system of the point cloud of object segmentation block n, O_{s1} X_{s1} Y_{s1} Z_{s1} is the point cloud coordinate system of object segmentation block 1, and O_{sn} X_{sn} Y_{sn} Z_{sn} is the point cloud coordinate system of object segmentation block n. R_{m-w}, T_{m-w} is the transformation matrix converting the template point cloud into the world coordinate system; R_{m-s}, T_{m-s} is the transformation matrix converting the two-dimensional template point cloud into the scene point cloud; R_{w-s}, T_{w-s} is the transformation matrix converting the template point cloud into the scene point cloud in the world coordinate system; R_{s-s'}, T_{s-s'} is the transformation from point cloud coarse registration to fine registration. The template point cloud is denoted P(x_{1…n}, y_{1…n}, z_{1…n}) and the scene point cloud Q(x_{1…n}, y_{1…n}, z_{1…n}).
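Formulas (1)-(5) amount to chaining four rigid transforms; as 4x4 homogeneous matrices the composition is plain matrix multiplication. A minimal NumPy sketch follows; the individual rotation/translation values are identity placeholders, and the variable names merely mirror the subscripts above.

```python
import numpy as np

def T_hom(R, t):
    """Pack rotation R (3x3) and translation t (3,) into a homogeneous matrix T(R, T)."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Identity placeholders standing in for the matrices of formulas (2)-(5):
T_m_w  = T_hom(np.eye(3), np.zeros(3))   # template -> world, formula (2)
T_w_ms = T_hom(np.eye(3), np.zeros(3))   # into the scene-parallel plane, formula (3)
T_w_s  = T_hom(np.eye(3), np.zeros(3))   # 2D template match result, formula (4)
T_s_s  = T_hom(np.eye(3), np.zeros(3))   # ICP/NDT fine-registration correction, formula (5)

# Overall registration, matching the summarized formula:
T_total = T_s_s @ T_w_s @ T_w_ms @ T_m_w

P = np.random.rand(100, 3)                       # template point cloud P, one point per row
P_h = np.hstack([P, np.ones((len(P), 1))])       # homogeneous coordinates
Q = (T_total @ P_h.T).T[:, :3]                   # points expressed in the scene frame Q
```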
The method of the invention achieves fast, high-precision registration of point clouds, shortens the overall point cloud processing time, and improves production efficiency. Using template matching to transform the template point cloud and provide the initial value for fine registration avoids the poor generality and long processing time introduced by conventional coarse registration. The method makes full use of the strengths of two-dimensional template matching, combines it with three-dimensional point cloud registration, meets the accuracy requirements, and offers a new solution for point cloud registration.
Drawings
FIG. 1 is a flow chart of a point cloud registration method of the present invention.
Fig. 2 is a flow chart of point cloud template creation in the point cloud registration method of the present invention.
Fig. 3 is a flow chart of point cloud preprocessing of the point cloud registration method of the present invention.
Fig. 4 is a flow chart of matching a point cloud template in the point cloud registration method of the present invention.
Fig. 5 is a schematic diagram of point cloud registration and conversion relations in the point cloud registration method of the present invention.
Detailed Description
The process according to the invention is described in further detail below with reference to examples and figures.
Fig. 1 shows the overall flow of the point cloud registration method based on template matching. First the point cloud template is made: point cloud data are collected and preprocessed, and the point cloud template data are saved. The main registration procedure then runs as follows: first point cloud preprocessing, which produces object block point clouds with little noise and clear features; then two-dimensional template matching, where the template is the data made in the first step and the test pictures are the depth maps of the segmented blocks; the object to be identified is matched and the relevant parameters are recorded; the template point cloud is transformed according to these parameters; and finally, with the transformed point cloud as the initial value, the point clouds are finely registered with the ICP or NDT method.
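As a rough illustration of the final stage of this flow (transformed template point cloud as the initial value, then fine registration), the following Open3D sketch uses point-to-point ICP; NDT would be an alternative. The file names, the correspondence distance, and the initial matrix are assumptions, not values from the patent.

```python
import copy
import numpy as np
import open3d as o3d

template = o3d.io.read_point_cloud("template.pcd")    # source point cloud P (assumed file name)
scene = o3d.io.read_point_cloud("scene_block.pcd")    # segmented target point cloud Q (assumed file name)

T_init = np.eye(4)   # coarse transform produced by the 2D template match (placeholder here)

result = o3d.pipelines.registration.registration_icp(
    template, scene,
    max_correspondence_distance=2.0,   # assumed search radius, in the units of the clouds
    init=T_init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
    criteria=o3d.pipelines.registration.ICPConvergenceCriteria(max_iteration=50))

aligned = copy.deepcopy(template).transform(result.transformation)
print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)
```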
Fig. 2 shows the flow of point cloud template creation for the registration method. The template should be a point cloud template image of a single object that represents the object's features well, and the point cloud should be mapped from a reasonable angle, with the object kept directly below the camera as far as possible. To make the template point cloud, first collect the point cloud of a single object; then preprocess it and segment away the background; map the processed template point cloud to a depth map; select the region of interest of the template with a rectangular box; then set the template matching parameters, mainly the accuracy level, matching threshold, maximum number of matches, center point coordinates, and similar parameters, to generate the template model. Collect a point cloud from the site and map its depth map for testing: if the object is found correctly, save the point cloud template; if not, modify the template point cloud or the parameters until the object is matched.
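A minimal sketch of the depth-map mapping used when making the template could look as follows: the point cloud is projected top-down onto an XY pixel grid and the Z values are normalized into an 8-bit image that the 2D matcher can consume. The pixel pitch, the normalization, and the helper name are assumptions, not values prescribed by the patent.

```python
import numpy as np
import cv2

def cloud_to_depth_map(points, pixel_size=1.0):
    """points: (N, 3) array viewed roughly top-down; returns an 8-bit depth image and its XY origin."""
    xy_min = points[:, :2].min(axis=0)
    size = np.ceil((points[:, :2].max(axis=0) - xy_min) / pixel_size).astype(int) + 1
    depth = np.zeros((size[1], size[0]), dtype=np.float32)     # rows follow Y, columns follow X
    u = ((points[:, 0] - xy_min[0]) / pixel_size).astype(int)
    v = ((points[:, 1] - xy_min[1]) / pixel_size).astype(int)
    order = np.argsort(points[:, 2])    # keep one Z per pixel (largest; flip the sort if Z grows away from the camera)
    depth[v[order], u[order]] = points[order, 2]
    valid = depth > 0                   # assumes positive Z values; 0 marks empty pixels
    if valid.any():
        z = depth[valid]
        span = max(float(z.max() - z.min()), 1e-6)
        depth[valid] = (z - z.min()) / span * 254 + 1          # scale to 1..255
    return depth.astype(np.uint8), xy_min

# Usage: build the template image once and save it together with the matching parameters
# (accuracy level, score threshold, maximum match count, center point, ...).
# template_img, origin = cloud_to_depth_map(template_points, pixel_size=1.0)
# cv2.imwrite("template_depth.png", template_img)
```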
Fig. 3 shows the point cloud preprocessing flow of the registration method. Preprocessing is an indispensable step of the registration algorithm: it helps template creation, reduces the amount of computation during registration, shortens the registration time, and improves efficiency. First the object point cloud is collected on site and cropped to a box, using pass-through filtering along the X, Y, and Z axes to form the cropping box. The point cloud is then rasterized with a voxel-grid method using a reasonable voxel radius. Next the point cloud is filtered, mainly with median filtering and statistical filtering, to remove outliers and make the point cloud edges smoother. Finally the point cloud is segmented, mainly by random sample consensus background segmentation and by clustering of single objects: background segmentation removes the redundant points other than the object point clouds, while cluster segmentation extracts more complete object point cloud data from the site. When the preprocessing flow is finished, the segmented point cloud data are mapped to depth maps and the subsequent template matching work continues.
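The preprocessing chain described above (crop, voxel rasterization, filtering, background plane removal, object clustering) could be sketched with Open3D roughly as follows. All numeric parameters and the file name are assumptions, and DBSCAN clustering stands in here for the clustering/region-growing segmentation.

```python
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("scene.pcd")    # raw field point cloud (assumed file name)

# 1. Pass-through style crop of the working volume (bounds are placeholders).
box = o3d.geometry.AxisAlignedBoundingBox(min_bound=(-300, -300, 0), max_bound=(300, 300, 800))
pcd = pcd.crop(box)

# 2. Voxel-grid (rasterization) downsampling to cut the data volume.
pcd = pcd.voxel_down_sample(voxel_size=2.0)

# 3. Statistical outlier removal to drop noise points.
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# 4. Background plane segmentation with RANSAC; keep the points that are off the plane.
_, plane_idx = pcd.segment_plane(distance_threshold=2.0, ransac_n=3, num_iterations=500)
objects = pcd.select_by_index(plane_idx, invert=True)

# 5. Cluster the remaining points into per-object blocks for later depth-map mapping.
labels = np.array(objects.cluster_dbscan(eps=5.0, min_points=30))
blocks = [objects.select_by_index(np.where(labels == k)[0].tolist())
          for k in range(labels.max() + 1)]
```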
Fig. 4 shows the template matching flow of the registration method. Template matching is the key step of the registration algorithm and provides the initial value for the subsequent point cloud fine registration. First the template picture is loaded; this is the depth map of the point cloud mapping that passed the test. Then the site processing picture is loaded; this is the depth map of each object segmentation block produced by preprocessing the site point cloud. The configuration file of parameters set for the template is read, the template matching parameters are set, and template matching is performed. Finally the offset and rotation angle are computed; these are the rotation angle and offset of the feature points between the site point cloud and the template point cloud.
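A hedged OpenCV sketch of this matching step is shown below: the template depth map is swept over a set of rotation angles, normalized cross-correlation is evaluated at each angle, and the best-scoring center point and angle are returned. The angle step and score threshold are assumptions; an edge map (e.g. from a Canny pass) could be fed in instead of the raw depth images to mimic the edge-feature matching described above.

```python
import numpy as np
import cv2

def match_template_rotated(scene_img, template_img, angle_step=1.0, score_thresh=0.6):
    """Sweep rotation angles, correlate, and return (center, angle, score) of the best match."""
    h, w = template_img.shape[:2]
    best = (None, None, -1.0)
    for angle in np.arange(0.0, 360.0, angle_step):
        M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
        rotated = cv2.warpAffine(template_img, M, (w, h))
        res = cv2.matchTemplate(scene_img, rotated, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(res)
        if score > best[2]:
            center = (top_left[0] + w / 2.0, top_left[1] + h / 2.0)
            best = (center, angle, score)
    return best if best[2] >= score_thresh else (None, None, best[2])

# Usage with the depth maps produced earlier (or their Canny edge maps):
# center, angle, score = match_template_rotated(scene_depth, template_depth)
```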
Fig. 5 is a schematic diagram of the point cloud registration and transformation relations of the registration method. The method replaces point cloud coarse registration with a two-dimensional template matching method, providing a more reliable initial value for fine registration and improving registration accuracy. In the figure, O_w X_w Y_w Z_w is the world coordinate system of the scene, O_m X_m Y_m Z_m is the template point cloud coordinate system, O_{s1'} X_{s1'} Y_{s1'} Z_{s1'} is the coarse registration coordinate system of the point cloud of object segmentation block 1, O_{sn'} X_{sn'} Y_{sn'} Z_{sn'} is the coarse registration coordinate system of the point cloud of object segmentation block n, O_{s1} X_{s1} Y_{s1} Z_{s1} is the point cloud coordinate system of object segmentation block 1, and O_{sn} X_{sn} Y_{sn} Z_{sn} is the point cloud coordinate system of object segmentation block n. R_{m-w}, T_{m-w} is the transformation matrix converting the template point cloud into the world coordinate system; R_{m-s}, T_{m-s} is the transformation matrix converting the two-dimensional template point cloud into the scene point cloud; R_{w-s}, T_{w-s} is the transformation matrix converting the template point cloud into the scene point cloud in the world coordinate system; R_{s-s'}, T_{s-s'} is the transformation from point cloud coarse registration to fine registration. The template point cloud is denoted P(x_{1…n}, y_{1…n}, z_{1…n}) and the scene point cloud Q(x_{1…n}, y_{1…n}, z_{1…n}). The registration process computes the transformation between the template point cloud P and the scene point cloud Q, expressed as:
Q(x_{1…n}, y_{1…n}, z_{1…n}) = T(R, T) × P(x_{1…n}, y_{1…n}, z_{1…n})    (1)

The transformation is built up in four steps. First step: convert the template point cloud into the world coordinate system,

P_{m-w}(x_{1…n}, y_{1…n}, z_{1…n}) = T_{m-w}(R_{m-w}, T_{m-w}) × P(x_{1…n}, y_{1…n}, z_{1…n})    (2)

Second step: according to the two-dimensional template matching information, rotate and translate the template point cloud P into the plane parallel to the scene point cloud Q,

P_{m-w-s}(x_{1…n}, y_{1…n}, z_{1…n}) = T_{w:m-s}(R_{m-s}, T_{m-s}) × P_{m-w}(x_{1…n}, y_{1…n}, z_{1…n})    (3)

Third step: transform the template point cloud P processed in the previous two steps onto the actual scene point cloud Q, denoted P'(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}); this point cloud data is taken as the initial value of the point cloud fine registration,

P_{m-w-s-s1'…sn'}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}) = T_{m-w-s-s1'…sn'}(R_{w-s}, T_{w-s}) × P_{m-w-s}(x_{1…n}, y_{1…n}, z_{1…n})    (4)

Fourth step: take the coarsely registered point cloud P' and finely register it to the scene point cloud Q with the ICP or NDT registration method,

P_{m-w-s-s1'-sn}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}) = T_{s1'-sn}(R_{s-s'}, T_{s-s'}) × P_{m-w-s-s1'…sn'}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'})    (5)

Summarizing formulas (1)-(5) gives the calculation formula of the point cloud registration algorithm:

Q(x_{1…n}, y_{1…n}, z_{1…n}) = T(R, T) × P(x_{1…n}, y_{1…n}, z_{1…n})
Q(x_{1…n}, y_{1…n}, z_{1…n}) = T_{s1'-sn}(R_{s-s'}, T_{s-s'}) × T_{m-w-s-s1'…sn'}(R_{w-s}, T_{w-s}) × T_{w:m-s}(R_{m-s}, T_{m-s}) × T_{m-w}(R_{m-w}, T_{m-w}) × P(x_{1…n}, y_{1…n}, z_{1…n}).
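A hedged sketch of turning the 2D template-matching output (matched center and in-plane rotation angle) into the coarse-registration matrix of formulas (3)-(4) and applying it before the fine registration of formula (5) is given below. The pixel size and the assumption that the rotation is purely about the Z axis are illustrative simplifications, and the function name is hypothetical.

```python
import numpy as np

def coarse_transform_from_match(center_px, angle_deg, template_center_px, pixel_size=1.0):
    """Rotation about Z by the matched angle plus an XY translation from the matched pixel offset."""
    theta = np.deg2rad(angle_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])
    T[0, 3] = (center_px[0] - template_center_px[0]) * pixel_size
    T[1, 3] = (center_px[1] - template_center_px[1]) * pixel_size
    return T

# Usage: apply the coarse transform to the template cloud, then refine with ICP/NDT (formula (5)):
# T_coarse = coarse_transform_from_match(center, angle, template_center, pixel_size=1.0)
# template.transform(T_coarse)          # e.g. an open3d.geometry.PointCloud
# result = o3d.pipelines.registration.registration_icp(
#     template, scene, max_correspondence_distance=2.0, init=np.eye(4),
#     estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
```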

Claims (1)

1. A point cloud registration method based on template matching, comprising the following steps:
step 1, acquiring point cloud data: acquiring workpiece template point cloud data and scene object point cloud data under the view of a camera;
step 2, making the point cloud template matching picture: generating the template matching picture of the workpiece template point cloud to be registered by a depth map mapping method;
step 3, cropping, rasterizing, filtering, background segmentation and single object segmentation of the object point cloud acquired on site;
step 4, mapping each segmented object point cloud block to a point cloud depth map;
step 5, performing template matching, wherein the template is the template picture generated in step 2 and the site object test pictures are the depth maps of the point cloud blocks mapped in step 4; template matching is realized through feature recognition and the corresponding matching information is kept;
step 6, calculating the rotation and translation matrix corresponding to the center point and rotation angle according to the template matching result, and converting the template point cloud data into a template point cloud coplanar with the scene point cloud;
step 7, through steps 1-6, the point cloud transformation based on two-dimensional template matching is completed; the transformation result is translated and rotated to the scene point cloud, realizing the coarse registration of the point clouds, namely alignment of the center point and rotation angle;
step 8, taking the point cloud transformation result obtained in step 7 as the initial value of the point cloud fine registration, performing ICP or NDT fine registration, and finally registering the template point cloud to the scene point cloud;
the ICP or NDT fine registration method comprises the following steps:
step 8.1, the registration of the point clouds to be computed is
Q(x_{1…n}, y_{1…n}, z_{1…n}) = T(R, T) × P(x_{1…n}, y_{1…n}, z_{1…n});
step 8.2, converting the template point cloud into the world coordinate system according to
P_{m-w}(x_{1…n}, y_{1…n}, z_{1…n}) = T_{m-w}(R_{m-w}, T_{m-w}) × P(x_{1…n}, y_{1…n}, z_{1…n});
step 8.3, according to the two-dimensional template matching information, rotating and translating the template point cloud P into the plane parallel to the scene point cloud Q,
P_{m-w-s}(x_{1…n}, y_{1…n}, z_{1…n}) = T_{w:m-s}(R_{m-s}, T_{m-s}) × P_{m-w}(x_{1…n}, y_{1…n}, z_{1…n});
step 8.4, transforming the template point cloud P processed in steps 8.2 and 8.3 onto the actual scene point cloud Q, denoted P'(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}), and recording this point cloud data as the initial value of the point cloud fine registration,
P_{m-w-s-s1'…sn'}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}) = T_{m-w-s-s1'…sn'}(R_{w-s}, T_{w-s}) × P_{m-w-s}(x_{1…n}, y_{1…n}, z_{1…n});
step 8.5, taking the coarsely registered point cloud P' and finely registering it to the scene point cloud Q with the ICP or NDT registration method,
P_{m-w-s-s1'-sn}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'}) = T_{s1'-sn}(R_{s-s'}, T_{s-s'}) × P_{m-w-s-s1'…sn'}(x_{1'…n'}, y_{1'…n'}, z_{1'…n'});
step 8.6, overall point cloud registration:
Q(x_{1…n}, y_{1…n}, z_{1…n}) = T_{s1'-sn}(R_{s-s'}, T_{s-s'}) × T_{m-w-s-s1'…sn'}(R_{w-s}, T_{w-s}) × T_{w:m-s}(R_{m-s}, T_{m-s}) × T_{m-w}(R_{m-w}, T_{m-w}) × P(x_{1…n}, y_{1…n}, z_{1…n})
in the above formulas, O_w X_w Y_w Z_w is the world coordinate system of the scene, O_m X_m Y_m Z_m is the template point cloud coordinate system, O_{s1'} X_{s1'} Y_{s1'} Z_{s1'} is the coarse registration coordinate system of the point cloud of object segmentation block 1, O_{sn'} X_{sn'} Y_{sn'} Z_{sn'} is the coarse registration coordinate system of the point cloud of object segmentation block n, O_{s1} X_{s1} Y_{s1} Z_{s1} is the point cloud coordinate system of object segmentation block 1, and O_{sn} X_{sn} Y_{sn} Z_{sn} is the point cloud coordinate system of object segmentation block n; R_{m-w}, T_{m-w} is the transformation matrix converting the template point cloud into the world coordinate system; R_{m-s}, T_{m-s} is the transformation matrix converting the two-dimensional template point cloud into the scene point cloud; R_{w-s}, T_{w-s} is the transformation matrix converting the template point cloud into the scene point cloud in the world coordinate system; R_{s-s'}, T_{s-s'} is the transformation from point cloud coarse registration to fine registration; the template point cloud is denoted P(x_{1…n}, y_{1…n}, z_{1…n}) and the scene point cloud Q(x_{1…n}, y_{1…n}, z_{1…n}).
CN201911397487.0A 2019-12-30 2019-12-30 Point cloud registration method based on template matching Active CN111179321B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911397487.0A CN111179321B (en) 2019-12-30 2019-12-30 Point cloud registration method based on template matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911397487.0A CN111179321B (en) 2019-12-30 2019-12-30 Point cloud registration method based on template matching

Publications (2)

Publication Number Publication Date
CN111179321A CN111179321A (en) 2020-05-19
CN111179321B (en) 2023-11-14

Family

ID=70650513

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911397487.0A Active CN111179321B (en) 2019-12-30 2019-12-30 Point cloud registration method based on template matching

Country Status (1)

Country Link
CN (1) CN111179321B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111791239B (en) * 2020-08-19 2022-08-19 苏州国岭技研智能科技有限公司 Method for realizing accurate grabbing by combining three-dimensional visual recognition
CN112446907B (en) * 2020-11-19 2022-09-06 武汉中海庭数据技术有限公司 Method and device for registering single-line point cloud and multi-line point cloud
CN112465908B (en) * 2020-11-30 2023-09-22 深圳市优必选科技股份有限公司 Object positioning method, device, terminal equipment and storage medium
CN114049355B (en) * 2022-01-14 2022-04-19 杭州灵西机器人智能科技有限公司 Method, system and device for identifying and labeling scattered workpieces
CN114897974B (en) * 2022-07-15 2022-09-27 江西省智能产业技术创新研究院 Target object space positioning method, system, storage medium and computer equipment
CN115588051B (en) * 2022-09-29 2023-06-13 中国矿业大学(北京) Automatic calibration method for laser radar and camera space position in ore processing link

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012203894A (en) * 2011-03-28 2012-10-22 Kumamoto Univ Three-dimensional pattern matching method
CN108830902A (en) * 2018-04-19 2018-11-16 江南大学 A kind of workpiece identification at random and localization method based on points cloud processing
CN110246127A (en) * 2019-06-17 2019-09-17 南京工程学院 Workpiece identification and localization method and system, sorting system based on depth camera


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张桂杨; 苑壮; 陶刚. 基于NDT和ICP融合的点云配准方法 [Point cloud registration method based on fused NDT and ICP]. 北京测绘 (Beijing Surveying and Mapping), 2019(12), full text. *

Also Published As

Publication number Publication date
CN111179321A (en) 2020-05-19

Similar Documents

Publication Publication Date Title
CN111179321B (en) Point cloud registration method based on template matching
CN112348864B (en) Three-dimensional point cloud automatic registration method for laser contour features of fusion line
CN110335234B (en) Three-dimensional change detection method based on antique LiDAR point cloud
CN109544612B (en) Point cloud registration method based on feature point geometric surface description
CN109767463B (en) Automatic registration method for three-dimensional point cloud
CN109272523B (en) Random stacking piston pose estimation method based on improved CVFH (continuously variable frequency) and CRH (Crh) characteristics
JP4785880B2 (en) System and method for 3D object recognition
CN109493372B (en) Rapid global optimization registration method for product point cloud data with large data volume and few characteristics
CN107358629B (en) Indoor mapping and positioning method based on target identification
CN108830888B (en) Coarse matching method based on improved multi-scale covariance matrix characteristic descriptor
CN111028280B (en) # -shaped structured light camera system and method for performing scaled three-dimensional reconstruction of target
CN114202566A (en) Glue path guiding and positioning method based on shape coarse registration and ICP point cloud fine registration
CN113516695B (en) Point cloud registration strategy in laser profiler flatness measurement
CN107220928A (en) A kind of tooth CT image pixel datas are converted to the method for 3D printing data
CN112132876B (en) Initial pose estimation method in 2D-3D image registration
CN109965979A (en) A kind of steady Use of Neuronavigation automatic registration method without index point
CN111523547A (en) 3D semantic segmentation method and terminal
CN111820545A (en) Method for automatically generating sole glue spraying track by combining offline and online scanning
CN114463396B (en) Point cloud registration method utilizing plane shape and topological graph voting
CN114241018A (en) Tooth point cloud registration method and system and readable storage medium
CN114170284A (en) Multi-view point cloud registration method based on active landmark point projection assistance
CN115601430A (en) Texture-free high-reflection object pose estimation method and system based on key point mapping
CN116883590A (en) Three-dimensional face point cloud optimization method, medium and system
CN111259788A (en) Method and device for detecting head and neck inflection point and computer equipment
CN117541614B (en) Space non-cooperative target close-range relative pose tracking method based on improved ICP algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant