CN111179321A - Point cloud registration method based on template matching - Google Patents

Point cloud registration method based on template matching

Info

Publication number
CN111179321A
Authority
CN
China
Prior art keywords
point cloud
template
registration
scene
template matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911397487.0A
Other languages
Chinese (zh)
Other versions
CN111179321B (en)
Inventor
张腾飞
严律
王明松
王杰高
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Estun Robotics Co Ltd
Original Assignee
Nanjing Estun Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Estun Robotics Co Ltd filed Critical Nanjing Estun Robotics Co Ltd
Priority to CN201911397487.0A priority Critical patent/CN111179321B/en
Publication of CN111179321A publication Critical patent/CN111179321A/en
Application granted granted Critical
Publication of CN111179321B publication Critical patent/CN111179321B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20112: Image segmentation details
    • G06T2207/20132: Image cropping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a point cloud registration method based on template matching. A transformation between the point cloud template and the target point cloud is first computed by a two-dimensional template matching method; the template point cloud data are then transformed according to this relation, and the result serves both as the coarse registration result and as the initial value for fine registration. ICP or NDT fine registration is then performed, finally registering the source point cloud to the target point cloud. The method shortens the overall point cloud processing time, improves production efficiency, and achieves fast, high-precision registration. Supplying the fine-registration initial value from a template-matching transformation of the template point cloud avoids the poor generality and long processing time of traditional coarse registration.

Description

Point cloud registration method based on template matching
Technical Field
The invention relates to a high-precision rigid point cloud registration method, in particular to a point cloud registration method based on template matching.
Background
With continuing advances in computer vision and sensor technology, methods for generating object point clouds by laser scanning have developed and matured rapidly. Point clouds can currently be acquired with depth cameras, binocular cameras, 3D laser scanning cameras, and similar devices. Object point cloud processing has been widely applied in human-computer interaction, virtual reality, reverse engineering, machine vision, and other fields.
In machine vision, point cloud registration is an important technique for estimating the relative pose of an object in a scene. Estimating an object pose by registration means determining the transformation (an R, T matrix pair) from a source point cloud to a target point cloud in the scene: with the template point cloud taken as the source and the object point cloud in the scene as the target, registration yields the pose of the object point cloud relative to the template, realizing pose estimation of objects in industrial scenes. The usual approach is coarse registration followed by fine registration. Common coarse registration methods include Hough-based algorithms and the sample consensus initial alignment (SAC-IA) algorithm. Coarse registration first extracts local geometric information of the point cloud data (such as curvature, normal vectors, and neighboring-point density) as descriptive features, finds correspondences between the source point set and the target point set through these features, and computes the transformation with an iterative algorithm. Common fine registration methods are the iterative closest point (ICP) algorithm and the normal distributions transform (NDT) algorithm. Fine registration is a point-set-to-point-set registration based on contour features: after SAC-IA coarse registration of the source and target point clouds, point pairs whose distance falls within a threshold are taken as matching pairs; the distance from the source point cloud to each point of the target point cloud is then computed from the initial nearest-distance correspondences; wrong correspondences are rejected with a direction-vector threshold, an objective function is set up, and the transformation matrix is computed until a convergence condition is met. Coarse registration provides the initial value for fine registration, and its quality directly affects the efficiency and accuracy of the fine registration algorithm.
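As a rough illustration of the point-to-point fine-registration loop described above (a minimal NumPy sketch, not the exact procedure of any cited algorithm; the distance threshold and function name are assumptions of this illustration):

```python
import numpy as np

def icp_iteration(source, target, dist_threshold=5.0):
    """One point-to-point ICP iteration (illustrative sketch).

    source, target: (N, 3) and (M, 3) arrays of 3D points.
    Returns a 4x4 homogeneous transform moving `source` toward `target`.
    """
    # 1. Nearest-neighbour correspondences (brute force for clarity).
    d2 = ((source[:, None, :] - target[None, :, :]) ** 2).sum(-1)
    nn = d2.argmin(axis=1)
    dist = np.sqrt(d2[np.arange(len(source)), nn])

    # 2. Reject pairs whose distance exceeds the threshold.
    keep = dist < dist_threshold
    src, dst = source[keep], target[nn[keep]]

    # 3. Closed-form rigid transform (Kabsch / SVD) between matched sets.
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(0) - R @ src.mean(0)

    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```

In practice this step is repeated, re-applying the estimated transform to the source cloud, until the change between iterations falls below a convergence tolerance.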
Existing point cloud coarse registration methods have the following problems. First, when the point cloud is large, coarse registration is time-consuming. Second, for some object point clouds it is difficult to select features, and a single kind of feature information cannot describe the whole point cloud well. Third, coarse registration is strongly affected by the quality of the point cloud acquired on site; the registration result is unstable and the robustness is poor.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art by providing a point cloud registration method based on template matching that achieves high-precision, high-efficiency registration of point cloud data in a scene.
The basic technical idea of the method is as follows: instead of a traditional coarse registration method, the transformation between the point cloud template and the target point cloud is computed by template matching, and transforming the template point cloud data according to this relation takes the place of coarse registration; the transformed result is used as the initial value for fine registration, ICP or NDT fine registration is then performed, and the source point cloud is finally registered to the target point cloud.
The invention relates to a point cloud registration method based on template matching, which comprises the following steps:
Step 1. Using a binocular camera or a 3D laser scanning camera, place a single workpiece in the camera's field of view and scan it to obtain the workpiece template point cloud; place a plurality of workpieces in the field of view to obtain the scene object point cloud data.
Step 2. Make the workpiece point cloud template matching picture: map the workpiece template point cloud collected in Step 1 into a two-dimensional depth map, set the template parameters, and save the two-dimensional template point cloud depth map.
Step 3. Preprocess the point cloud acquired on site: first crop it, keeping the point cloud of the plane where the workpiece lies; then rasterize it to reduce the data volume; then filter it to remove noise points; then remove the plane with a Euclidean segmentation method; and finally segment the final workpiece point cloud with a region-growing method. This final workpiece point cloud is the target point cloud to be registered.
Step 4. Map the point cloud data preprocessed in Step 3 into a depth map.
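A minimal sketch of one way such a depth-map mapping could be implemented, as an orthographic projection onto the XY plane; the pixel resolution and image size below are illustrative assumptions, not values specified by the method:

```python
import numpy as np

def cloud_to_depth_map(points, resolution=1.0, img_shape=(480, 640)):
    """Project an (N, 3) point cloud orthographically onto a 2D depth image.

    resolution: size of one pixel in point-cloud units (assumed value).
    Pixel intensity encodes the z coordinate (closest point wins).
    """
    h, w = img_shape
    depth = np.full((h, w), np.inf, dtype=np.float32)

    # Shift x/y so the cloud's minimum corner maps to pixel (0, 0).
    xy = (points[:, :2] - points[:, :2].min(axis=0)) / resolution
    cols = np.clip(xy[:, 0].astype(int), 0, w - 1)
    rows = np.clip(xy[:, 1].astype(int), 0, h - 1)

    # Keep the smallest z per pixel (point nearest the camera plane).
    np.minimum.at(depth, (rows, cols), points[:, 2])

    depth[np.isinf(depth)] = 0.0  # empty pixels become background
    return depth
```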
Step 5. Template matching: perform edge feature identification on the template point cloud depth map saved in Step 2 and on the depth map produced in Step 4, and compute the corresponding center point and rotation angle with a template matching method.
Step 6. From the template matching result of Step 5, convert the center point and rotation angle into matrix form, and translate and rotate the template point cloud accordingly.
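A sketch of how the matched center offset and rotation angle might be converted into a rigid transform and applied to the template point cloud; restricting the rotation to the z axis and the names `dx`, `dy`, `angle_deg` are assumptions of this illustration:

```python
import numpy as np

def match_to_transform(dx, dy, angle_deg):
    """Build a 4x4 transform from a 2D template-match result.

    dx, dy: translation of the matched center point (point-cloud units).
    angle_deg: in-plane rotation angle about the z axis.
    """
    a = np.deg2rad(angle_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:2, 3] = [dx, dy]
    return T

def apply_transform(points, T):
    """Apply a 4x4 homogeneous transform to an (N, 3) point cloud."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ T.T)[:, :3]
```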
Step 7. Transform the template point cloud from Step 6 onto the final workpiece point cloud from Step 3, so that the center points coincide and the rotation angles agree.
Step 8. Register the template point cloud and the scene point cloud with an ICP (iterative closest point) or NDT (normal distributions transform) fine registration method, completing the registration of the source point cloud and the target point cloud.
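As one concrete possibility, not prescribed by the invention, the fine registration of Step 8 could be run with Open3D's point-to-point ICP, seeded by the coarse transform obtained from template matching; the correspondence distance threshold below is an assumed value:

```python
import open3d as o3d

def fine_register(template_pts, scene_pts, coarse_T, max_corr_dist=2.0):
    """ICP fine registration seeded with the template-matching transform.

    template_pts, scene_pts: (N, 3) NumPy arrays; coarse_T: 4x4 initial pose.
    """
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(template_pts))
    dst = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(scene_pts))

    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_corr_dist, coarse_T,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    # Final pose of the template relative to the scene object, plus match quality.
    return result.transformation, result.fitness
```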
The ICP or NDT fine registration proceeds as follows. The registration of the point clouds is expressed by formula (1); the template point cloud is converted into the world coordinate system by formula (2); according to the two-dimensional template matching information, the template point cloud P is rotated and translated to a plane parallel to the scene point cloud Q by formula (3); the template point cloud processed by the previous two steps is transformed onto the actual scene point cloud Q, denoted $P'(x_{1'\dots n'}, y_{1'\dots n'}, z_{1'\dots n'})$, and recorded as the initial value of point cloud fine registration by formula (4); the coarsely registered point cloud $P'$ is then finely registered to the scene point cloud Q with an ICP or NDT registration method by formula (5). Combining formulas (1), (2), (3), (4) and (5) gives the point cloud registration formula:

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T(R,T)\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{1}$$

$$P_{m\text{-}w}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{m\text{-}w}(R_{m\text{-}w},T_{m\text{-}w})\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{2}$$

$$P_{m\text{-}w\text{-}s}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{w:m\text{-}s}(R_{m\text{-}s},T_{m\text{-}s})\times P_{m\text{-}w}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{3}$$

$$P_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) = T_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(R_{w\text{-}s'},T_{w\text{-}s'})\times P_{m\text{-}w\text{-}s}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{4}$$

$$P_{m\text{-}w\text{-}s\text{-}s1\dots sn}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) = T_{s1'\text{-}sn}(R_{s'\text{-}s},T_{s'\text{-}s})\times P_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) \tag{5}$$

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T(R,T)\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n})$$

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{s1'\text{-}sn}(R_{s'\text{-}s},T_{s'\text{-}s})\times T_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(R_{w\text{-}s'},T_{w\text{-}s'})\times T_{w:m\text{-}s}(R_{m\text{-}s},T_{m\text{-}s})\times T_{m\text{-}w}(R_{m\text{-}w},T_{m\text{-}w})\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n})$$

In these formulas, $O_wX_wY_wZ_w$ is the world coordinate system of the scene, $O_mX_mY_mZ_m$ is the template point cloud coordinate system, $O_{s1'}X_{s1'}Y_{s1'}Z_{s1'}$ is the coarse-registration coordinate system of object segmentation block 1, $O_{sn'}X_{sn'}Y_{sn'}Z_{sn'}$ is the coarse-registration coordinate system of object segmentation block n, $O_{s1}X_{s1}Y_{s1}Z_{s1}$ is the point cloud coordinate system of object segmentation block 1, and $O_{sn}X_{sn}Y_{sn}Z_{sn}$ is the point cloud coordinate system of object segmentation block n. $R_{m\text{-}w}$, $T_{m\text{-}w}$ is the transformation from the template point cloud to the world coordinate system; $R_{m\text{-}s}$, $T_{m\text{-}s}$ is the transformation from the two-dimensional template point cloud to the scene point cloud; $R_{w\text{-}s'}$, $T_{w\text{-}s'}$ is the transformation from the template point cloud to the scene point cloud in the world coordinate system; $R_{s'\text{-}s}$, $T_{s'\text{-}s}$ is the transformation from the coarse registration to the fine registration. The template point cloud is denoted $P(x_{1\dots n},y_{1\dots n},z_{1\dots n})$ and the scene point cloud $Q(x_{1\dots n},y_{1\dots n},z_{1\dots n})$.
The method achieves fast, high-precision point cloud registration, shortens the overall point cloud processing time, and improves production efficiency. Providing the fine-registration initial value by transforming the template point cloud according to the template match avoids the poor generality and long processing time introduced by traditional coarse registration. The method makes full use of the advantages of two-dimensional template matching, combines it with three-dimensional point cloud registration, meets the accuracy requirements, and offers a new solution for point cloud registration.
Drawings
FIG. 1 is a flow chart of a point cloud registration method of the present invention.
FIG. 2 is a flow chart of point cloud template making in the point cloud registration method of the present invention.
FIG. 3 is a flow chart of point cloud preprocessing of the point cloud registration method of the present invention.
FIG. 4 is a point cloud template matching flow chart of the point cloud registration method of the present invention.
FIG. 5 is a schematic diagram of a point cloud registration and transformation relationship in the point cloud registration method of the present invention.
Detailed Description
The process of the present invention will be described in further detail below with reference to examples and the accompanying drawings.
As shown in fig. 1, the flowchart of the overall scheme of the point cloud registration method based on template matching is as follows. First the point cloud template is made: point cloud data are collected and preprocessed, and the point cloud template data are saved. The main registration procedure is then run. First, the point cloud is preprocessed to produce object block point clouds with little noise and clear features. Two-dimensional template matching is then performed: the template is the data made in the first step, the test pictures are the depth maps of the segmented blocks, the object to be identified is matched, and the relevant parameters are recorded. The template point cloud is transformed according to these parameters, and finally the transformed point cloud is used as the initial value for fine registration of the point cloud with an ICP or NDT method.
Fig. 2 is the flow chart of point cloud template making in the registration method. The template must be a point cloud template picture of a single object that expresses the object's features well, and the mapping angle of the point cloud should be reasonable, keeping the object directly below the camera as far as possible. To make the template point cloud, first collect the point cloud of a single object, then preprocess it and segment away the background, and map the processed template point cloud to a depth map. A square area is used to frame the template point cloud and select the template's region of interest. The template matching parameters are then set, mainly the precision level, matching threshold, maximum number of matches, and center point coordinates, and the template model is generated. A point cloud collected on site is mapped to a depth map for testing: if the object is found correctly, the point cloud template is saved; if not, the template point cloud or the parameters are modified until the object is matched.
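A minimal sketch, under assumed parameter names and values, of how such a template record could be assembled once the template depth map exists; the square region of interest is cut around the chosen center point and is assumed to lie fully inside the image:

```python
def make_template(depth_map, center_rc, half_size=64,
                  match_threshold=0.8, max_matches=10, precision_level=2):
    """Cut a square ROI around the object center and bundle the match parameters."""
    r, c = center_rc
    roi = depth_map[r - half_size:r + half_size, c - half_size:c + half_size].copy()
    return {
        "patch": roi,                      # template image used for matching
        "center": (half_size, half_size),  # center point inside the patch
        "match_threshold": match_threshold,
        "max_matches": max_matches,
        "precision_level": precision_level,
    }
```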
As shown in fig. 3, the flow chart of point cloud preprocessing in the registration method. Preprocessing is an indispensable step of the registration algorithm: it helps template making, reduces the amount of point cloud computation during registration, shortens the registration time, and improves efficiency. The object point cloud is first collected on site. Box cropping of the point cloud is then performed, using pass-through filtering along the X, Y and Z axes to form a box for cropping. The point cloud is then rasterized, setting a reasonable raster radius with a voxel rasterization method. Point cloud filtering follows, mainly median filtering and statistical filtering, to remove outliers and make the point cloud edges smoother. Finally point cloud segmentation is carried out, mainly background segmentation based on random sample consensus and clustering segmentation of single objects: background segmentation removes the redundant point cloud other than the object point cloud, and clustering segmentation splits off relatively complete object point cloud data acquired on site. After preprocessing, the segmented point cloud data are mapped to depth maps and the subsequent template matching work continues.
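A condensed sketch of this preprocessing chain using the Open3D library; the crop bounds, voxel size, filter settings, and clustering parameters are illustrative assumptions, and the median filtering step is omitted here in favor of statistical outlier removal only:

```python
import numpy as np
import open3d as o3d

def preprocess(pcd, min_bound, max_bound):
    """Crop, voxel downsample, outlier filter, plane removal, clustering.

    pcd: o3d.geometry.PointCloud; min_bound/max_bound: 3-element crop limits.
    Returns a list of single-object point cloud blocks.
    """
    # 1. Box cropping along X, Y, Z (the pass-through step).
    box = o3d.geometry.AxisAlignedBoundingBox(min_bound, max_bound)
    pcd = pcd.crop(box)

    # 2. Voxel rasterization to thin the cloud.
    pcd = pcd.voxel_down_sample(voxel_size=2.0)

    # 3. Statistical outlier removal to suppress noise points.
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

    # 4. RANSAC plane fit and removal of the background plane.
    _, inliers = pcd.segment_plane(distance_threshold=1.0,
                                   ransac_n=3, num_iterations=1000)
    objects = pcd.select_by_index(inliers, invert=True)

    # 5. Euclidean-style clustering into single-object blocks.
    labels = np.asarray(objects.cluster_dbscan(eps=5.0, min_points=30))
    n_clusters = labels.max() + 1 if labels.size else 0
    return [objects.select_by_index(np.where(labels == k)[0].tolist())
            for k in range(n_clusters)]
```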
Fig. 4 is the flow chart of point cloud template matching in the registration method. Template matching is a key step of the registration algorithm and provides the initial value for the subsequent fine registration. First the template picture is loaded; it is the point cloud depth map that passed the test. The field picture is then loaded; it is the depth map of an object segmentation block generated after preprocessing of the field point cloud. The configuration file of parameters set for the template is read, the matching parameters are set, and template matching is carried out. Finally the offset and rotation angle are computed, i.e. the values of the feature-point rotation angle and offset between the field point cloud and the template point cloud.
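One way such a 2D match could be realized is with OpenCV's normalized cross-correlation, scanning a set of candidate rotation angles of the template; the angle step and acceptance threshold are assumed values, and both images are expected as 8-bit or float32 depth maps:

```python
import cv2
import numpy as np

def match_template(scene_img, template_img, angle_step=5, score_min=0.7):
    """Return the best (row, col, angle, score) of template_img inside scene_img."""
    th, tw = template_img.shape[:2]
    best = None
    for angle in np.arange(0, 360, angle_step):
        # Rotate the template about its own center.
        M = cv2.getRotationMatrix2D((tw / 2, th / 2), float(angle), 1.0)
        rotated = cv2.warpAffine(template_img, M, (tw, th))

        # Normalized cross-correlation against the scene depth image.
        response = cv2.matchTemplate(scene_img, rotated, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(response)
        if best is None or score > best[3]:
            center = (top_left[0] + tw // 2, top_left[1] + th // 2)
            best = (center[1], center[0], float(angle), score)  # row, col, angle, score

    return best if best and best[3] >= score_min else None
```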
Fig. 5 is a schematic diagram of the point cloud registration and transformation relationships of the registration method. The method replaces point cloud coarse registration with a two-dimensional template matching method, provides a more reliable initial value for point cloud fine registration, and improves registration accuracy. In the figure, $O_wX_wY_wZ_w$ is the world coordinate system of the scene, $O_mX_mY_mZ_m$ is the template point cloud coordinate system, $O_{s1'}X_{s1'}Y_{s1'}Z_{s1'}$ is the coarse-registration coordinate system of object segmentation block 1, $O_{sn'}X_{sn'}Y_{sn'}Z_{sn'}$ is the coarse-registration coordinate system of object segmentation block n, $O_{s1}X_{s1}Y_{s1}Z_{s1}$ is the point cloud coordinate system of object segmentation block 1, and $O_{sn}X_{sn}Y_{sn}Z_{sn}$ is the point cloud coordinate system of object segmentation block n. $R_{m\text{-}w}$, $T_{m\text{-}w}$ is the transformation from the template point cloud to the world coordinate system; $R_{m\text{-}s}$, $T_{m\text{-}s}$ is the transformation from the two-dimensional template point cloud to the scene point cloud; $R_{w\text{-}s'}$, $T_{w\text{-}s'}$ is the transformation from the template point cloud to the scene point cloud in the world coordinate system; $R_{s'\text{-}s}$, $T_{s'\text{-}s}$ is the transformation from the coarse registration to the fine registration. The template point cloud is denoted $P(x_{1\dots n},y_{1\dots n},z_{1\dots n})$ and the scene point cloud $Q(x_{1\dots n},y_{1\dots n},z_{1\dots n})$. The registration process is to find the transformation between the template point cloud P and the scene point cloud Q:

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T(R,T)\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{1}$$
The transformation proceeds in steps. First step: convert the template point cloud into the world coordinate system,

$$P_{m\text{-}w}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{m\text{-}w}(R_{m\text{-}w},T_{m\text{-}w})\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{2}$$

Second step: according to the two-dimensional template matching information, rotate and translate the template point cloud P to a plane parallel to the scene point cloud Q,

$$P_{m\text{-}w\text{-}s}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{w:m\text{-}s}(R_{m\text{-}s},T_{m\text{-}s})\times P_{m\text{-}w}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{3}$$

Third step: transform the template point cloud P processed in the previous two steps onto the actual scene point cloud Q, denote it $P'(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'})$, and record it as the initial value of point cloud fine registration,

$$P_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) = T_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(R_{w\text{-}s'},T_{w\text{-}s'})\times P_{m\text{-}w\text{-}s}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) \tag{4}$$

Fourth step: finely register the coarsely registered point cloud $P'$ to the scene point cloud Q with an ICP or NDT registration method,

$$P_{m\text{-}w\text{-}s\text{-}s1\dots sn}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) = T_{s1'\text{-}sn}(R_{s'\text{-}s},T_{s'\text{-}s})\times P_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) \tag{5}$$

Combining formulas (1), (2), (3), (4) and (5) gives the point cloud registration formula:

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T(R,T)\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n})$$

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{s1'\text{-}sn}(R_{s'\text{-}s},T_{s'\text{-}s})\times T_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(R_{w\text{-}s'},T_{w\text{-}s'})\times T_{w:m\text{-}s}(R_{m\text{-}s},T_{m\text{-}s})\times T_{m\text{-}w}(R_{m\text{-}w},T_{m\text{-}w})\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n})$$
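Read as matrix algebra, the summary formula is simply the composition of the four stage transforms applied to the template points; a small sketch under that reading, assuming each stage is already available as a 4×4 homogeneous matrix:

```python
import numpy as np

def compose_registration(T_m_w, T_m_s, T_w_s1, T_fine, template_pts):
    """Chain the four stages: template-to-world, 2D-match rotation/translation,
    placement onto the scene object, and the ICP/NDT fine-registration result."""
    T_total = T_fine @ T_w_s1 @ T_m_s @ T_m_w           # right-to-left, as in the formula
    homo = np.hstack([template_pts, np.ones((len(template_pts), 1))])
    return (homo @ T_total.T)[:, :3]                     # registered template points
```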

Claims (2)

1. A point cloud registration method based on template matching, comprising the following steps:
step 1, collecting point cloud data, and collecting workpiece template point cloud data and scene object point cloud data under the view field of a camera;
step 2, making a point cloud template matching picture, namely generating a picture of a template matching algorithm by a method of mapping a depth map on the point cloud of the workpiece template to be registered;
step 3, cutting, rasterizing, filtering, background segmentation and single target object segmentation operations are carried out on the point cloud of the object acquired on site;
step 4, mapping each segmented object point cloud block to a point cloud depth map;
step 5, template matching is carried out, wherein the template is the template picture generated in the step 2, the field object test picture is the depth map of each point cloud block mapped in the step 4, the template matching is realized through feature recognition, and corresponding matching information is reserved;
step 6, according to the result of template matching, calculating the rotation and translation matrices corresponding to the center point and the rotation angle, and converting the template point cloud data into a template point cloud coplanar with the scene point cloud;
step 7, completing the point cloud transformation based on two-dimensional template matching through steps 1-6, translating and rotating the transformation result onto the scene point cloud, and realizing coarse registration of the point cloud, i.e. the center points are aligned and the rotation angles agree;
and 8, taking the point cloud transformation result obtained in the step 7 as an initial value of point cloud fine registration, and performing ICP or NDT fine registration to finally realize registration of the template point cloud and the scene point cloud.
2. The template matching-based point cloud registration method of claim 1, wherein the ICP or NDT fine registration of step 8 comprises the following steps:
Step 8.1. Express the registration of the point clouds as

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T(R,T)\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n});$$

Step 8.2. Convert the template point cloud into the world coordinate system according to

$$P_{m\text{-}w}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{m\text{-}w}(R_{m\text{-}w},T_{m\text{-}w})\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n});$$

Step 8.3. According to the two-dimensional template matching information, rotate and translate the template point cloud P to a plane parallel to the scene point cloud Q,

$$P_{m\text{-}w\text{-}s}(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{w:m\text{-}s}(R_{m\text{-}s},T_{m\text{-}s})\times P_{m\text{-}w}(x_{1\dots n},y_{1\dots n},z_{1\dots n});$$

Step 8.4. Transform the template point cloud P processed in steps 8.2 and 8.3 onto the actual scene point cloud Q, denote it $P'(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'})$, and record it as the initial value of point cloud fine registration,

$$P_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) = T_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(R_{w\text{-}s'},T_{w\text{-}s'})\times P_{m\text{-}w\text{-}s}(x_{1\dots n},y_{1\dots n},z_{1\dots n});$$

Step 8.5. Finely register the coarsely registered point cloud $P'$ to the scene point cloud Q with an ICP or NDT registration method,

$$P_{m\text{-}w\text{-}s\text{-}s1\dots sn}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'}) = T_{s1'\text{-}sn}(R_{s'\text{-}s},T_{s'\text{-}s})\times P_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(x_{1'\dots n'},y_{1'\dots n'},z_{1'\dots n'});$$

Step 8.6. The overall point cloud registration is

$$Q(x_{1\dots n},y_{1\dots n},z_{1\dots n}) = T_{s1'\text{-}sn}(R_{s'\text{-}s},T_{s'\text{-}s})\times T_{m\text{-}w\text{-}s\text{-}s1'\dots sn'}(R_{w\text{-}s'},T_{w\text{-}s'})\times T_{w:m\text{-}s}(R_{m\text{-}s},T_{m\text{-}s})\times T_{m\text{-}w}(R_{m\text{-}w},T_{m\text{-}w})\times P(x_{1\dots n},y_{1\dots n},z_{1\dots n}),$$

where $O_wX_wY_wZ_w$ is the world coordinate system of the scene, $O_mX_mY_mZ_m$ is the template point cloud coordinate system, $O_{s1'}X_{s1'}Y_{s1'}Z_{s1'}$ is the coarse-registration coordinate system of object segmentation block 1, $O_{sn'}X_{sn'}Y_{sn'}Z_{sn'}$ is the coarse-registration coordinate system of object segmentation block n, $O_{s1}X_{s1}Y_{s1}Z_{s1}$ is the point cloud coordinate system of object segmentation block 1, and $O_{sn}X_{sn}Y_{sn}Z_{sn}$ is the point cloud coordinate system of object segmentation block n; $R_{m\text{-}w}$, $T_{m\text{-}w}$ is the transformation from the template point cloud to the world coordinate system; $R_{m\text{-}s}$, $T_{m\text{-}s}$ is the transformation from the two-dimensional template point cloud to the scene point cloud; $R_{w\text{-}s'}$, $T_{w\text{-}s'}$ is the transformation from the template point cloud to the scene point cloud in the world coordinate system; $R_{s'\text{-}s}$, $T_{s'\text{-}s}$ is the transformation from the coarse registration to the fine registration; the template point cloud is denoted $P(x_{1\dots n},y_{1\dots n},z_{1\dots n})$ and the scene point cloud $Q(x_{1\dots n},y_{1\dots n},z_{1\dots n})$.
CN201911397487.0A 2019-12-30 2019-12-30 Point cloud registration method based on template matching Active CN111179321B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911397487.0A CN111179321B (en) 2019-12-30 2019-12-30 Point cloud registration method based on template matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911397487.0A CN111179321B (en) 2019-12-30 2019-12-30 Point cloud registration method based on template matching

Publications (2)

Publication Number Publication Date
CN111179321A true CN111179321A (en) 2020-05-19
CN111179321B CN111179321B (en) 2023-11-14

Family

ID=70650513

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911397487.0A Active CN111179321B (en) 2019-12-30 2019-12-30 Point cloud registration method based on template matching

Country Status (1)

Country Link
CN (1) CN111179321B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111791239A (en) * 2020-08-19 2020-10-20 苏州国岭技研智能科技有限公司 Method for realizing accurate grabbing by combining three-dimensional visual recognition
CN112446907A (en) * 2020-11-19 2021-03-05 武汉中海庭数据技术有限公司 Method and device for registering single-line point cloud and multi-line point cloud
CN112465908A (en) * 2020-11-30 2021-03-09 深圳市优必选科技股份有限公司 Object positioning method and device, terminal equipment and storage medium
CN114049355A (en) * 2022-01-14 2022-02-15 杭州灵西机器人智能科技有限公司 Method, system and device for identifying and labeling scattered workpieces
CN114897974A (en) * 2022-07-15 2022-08-12 江西省智能产业技术创新研究院 Target object space positioning method, system, storage medium and computer equipment
CN115588051A (en) * 2022-09-29 2023-01-10 中国矿业大学(北京) Automatic calibration method for space positions of laser radar and camera in ore processing link

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012203894A (en) * 2011-03-28 2012-10-22 Kumamoto Univ Three-dimensional pattern matching method
CN108830902A (en) * 2018-04-19 2018-11-16 江南大学 A kind of workpiece identification at random and localization method based on points cloud processing
CN110246127A (en) * 2019-06-17 2019-09-17 南京工程学院 Workpiece identification and localization method and system, sorting system based on depth camera

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012203894A (en) * 2011-03-28 2012-10-22 Kumamoto Univ Three-dimensional pattern matching method
CN108830902A (en) * 2018-04-19 2018-11-16 江南大学 A kind of workpiece identification at random and localization method based on points cloud processing
CN110246127A (en) * 2019-06-17 2019-09-17 南京工程学院 Workpiece identification and localization method and system, sorting system based on depth camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张桂杨; 苑壮; 陶刚: "基于NDT和ICP融合的点云配准方法" (Point cloud registration method based on NDT and ICP fusion) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111791239A (en) * 2020-08-19 2020-10-20 苏州国岭技研智能科技有限公司 Method for realizing accurate grabbing by combining three-dimensional visual recognition
CN112446907A (en) * 2020-11-19 2021-03-05 武汉中海庭数据技术有限公司 Method and device for registering single-line point cloud and multi-line point cloud
CN112465908A (en) * 2020-11-30 2021-03-09 深圳市优必选科技股份有限公司 Object positioning method and device, terminal equipment and storage medium
CN112465908B (en) * 2020-11-30 2023-09-22 深圳市优必选科技股份有限公司 Object positioning method, device, terminal equipment and storage medium
CN114049355A (en) * 2022-01-14 2022-02-15 杭州灵西机器人智能科技有限公司 Method, system and device for identifying and labeling scattered workpieces
CN114049355B (en) * 2022-01-14 2022-04-19 杭州灵西机器人智能科技有限公司 Method, system and device for identifying and labeling scattered workpieces
CN114897974A (en) * 2022-07-15 2022-08-12 江西省智能产业技术创新研究院 Target object space positioning method, system, storage medium and computer equipment
CN114897974B (en) * 2022-07-15 2022-09-27 江西省智能产业技术创新研究院 Target object space positioning method, system, storage medium and computer equipment
CN115588051A (en) * 2022-09-29 2023-01-10 中国矿业大学(北京) Automatic calibration method for space positions of laser radar and camera in ore processing link

Also Published As

Publication number Publication date
CN111179321B (en) 2023-11-14

Similar Documents

Publication Publication Date Title
CN111179321B (en) Point cloud registration method based on template matching
CN109272523B (en) Random stacking piston pose estimation method based on improved CVFH (continuously variable frequency) and CRH (Crh) characteristics
CN106709947B (en) Three-dimensional human body rapid modeling system based on RGBD camera
CN107123164B (en) Three-dimensional reconstruction method and system for keeping sharp features
CN111222516B (en) Method for extracting point cloud key outline features of printed circuit board
CN112348864B (en) Three-dimensional point cloud automatic registration method for laser contour features of fusion line
CN110335234B (en) Three-dimensional change detection method based on antique LiDAR point cloud
CN113178009B (en) Indoor three-dimensional reconstruction method utilizing point cloud segmentation and grid repair
CN109493372B (en) Rapid global optimization registration method for product point cloud data with large data volume and few characteristics
CN108830888B (en) Coarse matching method based on improved multi-scale covariance matrix characteristic descriptor
CN107862735B (en) RGBD three-dimensional scene reconstruction method based on structural information
JP2014081347A (en) Method for recognition and pose determination of 3d object in 3d scene
CN102411779B (en) Object model matching posture measuring method based on image
CN112164145B (en) Method for rapidly extracting indoor three-dimensional line segment structure based on point cloud data
CN115290097B (en) BIM-based real-time accurate map construction method, terminal and storage medium
CN113628263A (en) Point cloud registration method based on local curvature and neighbor characteristics thereof
CN114202566A (en) Glue path guiding and positioning method based on shape coarse registration and ICP point cloud fine registration
CN114463396B (en) Point cloud registration method utilizing plane shape and topological graph voting
CN113728360A (en) Method and apparatus for pose, size and shape measurement of objects in 3D scene
Tong et al. 3D point cloud initial registration using surface curvature and SURF matching
CN107784656B (en) Part point cloud segmentation method based on geometric elements
CN117237428A (en) Data registration method, device and medium for three-dimensional point cloud
Shen et al. A 3D modeling method of indoor objects using Kinect sensor
CN109670557B (en) Automatic highway point cloud registration method based on rod-shaped structures
Labatut et al. Hierarchical shape-based surface reconstruction for dense multi-view stereo

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant