KR20170100718A - Apparatus and method for registration of point cloud - Google Patents

Apparatus and method for registration of point cloud Download PDF

Info

Publication number
KR20170100718A
Authority
KR
South Korea
Prior art keywords
point
points
point group
input data
matching
Prior art date
Application number
KR1020160022770A
Other languages
Korean (ko)
Inventor
정순철
장인수
남승우
최윤석
김진서
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 filed Critical 한국전자통신연구원
Priority to KR1020160022770A priority Critical patent/KR20170100718A/en
Publication of KR20170100718A publication Critical patent/KR20170100718A/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/35Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Computer Graphics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Algebra (AREA)
  • Mathematical Optimization (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

An input data set is generated based on the edge sharing rate between an arbitrary reference point group and another point group among the 3D-scanned point groups, an arbitrary number of point pairs is selected from the input data set as a subset, a transformation matrix is generated by processing a singular value decomposition for each point pair of the subset, and the transformation matrix is applied to the other point group to transform its points into the coordinate system of the reference point group. The input data set is generated by matching, for each point of the reference point group, an arbitrary number of points starting from the one with the highest edge sharing rate.

Description

APPARATUS AND METHOD FOR REGISTRATION OF POINT CLOUD

The present invention relates to a point cloud matching apparatus and method for registering sets of three-dimensional points obtained from a 3D scanner, that is, point clouds.

One way of reconstructing a real object as a 3D model on a computer is to generate a surface composed of triangles whose vertices are the points of a point cloud acquired by a 3D scanner.

However, most scanners cannot capture an object from all 360° of directions in a single pass, so the object must be scanned several times to cover every direction. Because the point group obtained at each scan lies in its own coordinate system, restoring the 3D model requires a matching (registration) process that transforms the set of point groups into a single coordinate system. For this matching to be possible, some points must overlap between the two point groups being matched. The matching problem is therefore, given two point groups, to obtain the matrices (i.e., a rotation matrix and a translation matrix) that transform the other point group into the coordinate system of the reference point group so that the matching points coincide spatially.

On the other hand, there are various algorithms for matching point groups, and the appropriate one differs according to how the two point groups to be matched are arranged (for example, whether they are positioned arbitrarily or are already roughly aligned). In general, if the two point groups are already roughly aligned, the ICP (Iterative Closest Point) algorithm performs well. When the arrangement of the two point groups is unknown, algorithms of the RANSAC (RANdom SAmple Consensus) family are widely used.

RANSAC is a simple yet robust approach that estimates the parameters of a mathematical model from an input data set containing outliers by repeating the following procedure.

According to RANSAC, a subset consisting of hypothetical inliers is first obtained by arbitrarily selecting data from the input data set.

Then, the quality of the subset is evaluated. RANSAC does not require an initial solution and, thanks to its well-understood theoretical behavior, is useful in various fields.

However, RANSAC has the disadvantage that its execution time becomes very long even when the input data set grows only slightly.

Specifically, the RANSAC algorithm for solving the point matching problem is as follows. Assume that the size of each point group (i.e., the number of points) is N, and that the probability that a point is matched with a point of the other point group is a. That is, a*N points of each point group are matched (i.e., overlap) one-to-one with points of the other point group. Every pair of two points arbitrarily selected, one from each point group, is a potential inlier, and there are N*N such pairs. These pairs of points form the input data set, and only a*N of them are actually inliers. The following steps are then repeated until a specific termination condition is satisfied: a subset is obtained by arbitrarily selecting three point pairs (three pairs are selected because at least three correspondences are needed to solve for the transformation matrix, as sketched below), and the quality of the acquired subset is calculated.
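As a concrete, purely illustrative picture of the scale involved, the Python sketch below builds the conventional N × N input data set and draws one three-pair subset; the variable names and random data are our assumptions, not part of the patent:

```python
import itertools
import random

import numpy as np

N = 100
P = np.random.rand(N, 3)   # reference point group (illustrative random data)
Q = np.random.rand(N, 3)   # point group to be matched

# Conventional RANSAC treats every (p, q) index pair as a candidate
# correspondence, so the input data set has N * N = 10,000 elements,
# of which only about a*N are true inliers.
input_data_set = list(itertools.product(range(N), range(N)))

# One iteration draws three pairs, the minimum needed to fix a rigid transform.
subset = random.sample(input_data_set, 3)
print(len(input_data_set), subset)
```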

To calculate this quality, a transformation matrix is first obtained such that the two points of each selected pair coincide spatially. The remaining points of the other point group are then also converted into the coordinate system of the reference point group through this transformation matrix, and spatially coincident points are sought between the reference point group and the converted point group. The best solution is the transformation matrix that yields the largest number of spatially coincident point pairs between the reference point group and the converted point group.

Theoretically, the number of iterations needed to find a suitable model with the probability of p in the above algorithm is given by Equation 1 below.

[Equation 1]

k = log(1 - p) / log(1 - w^3), where w = a*N / N^2 = a/N is the fraction of inliers in the input data set and 3 is the number of point pairs in the subset.

For example, when the size of each point group is 100 (i.e., N = 100) and the overlap ratio is 30% (a = 0.3, i.e., 30 points overlap), the number of iterations needed to find the transformation matrix with 99% probability (i.e., p = 0.99) is approximately 170 million. Moreover, each iteration takes O(N log N) time because of the nearest-point search described above. Therefore, there is a need for a point cloud matching apparatus and method that can perform RANSAC efficiently when solving the point group matching problem.

In this regard, Korean Patent Laid-Open No. 10-2011-0095003 (entitled "Scan Data Matching System and Scan Data Matching Method Utilizing Same") discloses a system comprising a parameter calculation module that calculates a matching parameter by comparing distance data extracted from the distance data for n points (n ≥ 1, a natural number) collected by a scanning device in a first scan period ("first scan data") with distance data extracted from the data collected in a second scan period ("second scan data"), a data conversion module that converts the second scan data using the matching parameter calculated by the parameter calculation module, and a data matching module that matches the converted second scan data with the first scan data.

An embodiment of the present invention provides a point cloud matching apparatus and method that perform matching efficiently by reducing the input data set during the matching of point groups obtained from a 3D scanner.

It should be understood, however, that the technical scope of the present invention is not limited to the above-described technical problems, and other technical problems may exist.

According to an aspect of the present invention, a point cloud matching apparatus for 3D scanning data comprises: an input data set generation unit for generating an input data set based on the edge sharing rate between an arbitrary reference point group and another point group among the 3D-scanned point groups; a subset generation unit for generating a subset by selecting an arbitrary number of point pairs from the input data set; a mathematical model generation unit for generating a transformation matrix by processing a singular value decomposition for each point pair of the subset; and a point group matching unit for applying the transformation matrix to the other point group and converting its points into the coordinate system of the reference point group, wherein the input data set generation unit generates the input data set by matching, for each point of the reference point group, an arbitrary number of points starting from the one with the highest edge sharing rate.

According to any one of the above aspects of the present invention, the point group matching problem can be processed faster than before, without degrading quality, by performing the matching on a reduced input data set. A larger point group (i.e., a more accurate 3D model) can therefore be processed.

FIG. 1 is a block diagram showing the configuration of a point cloud matching apparatus according to an embodiment of the present invention.
FIG. 2 is a view showing an example of an object to be 3D-scanned according to an embodiment of the present invention.
FIG. 3 is a view showing a point group resulting from the 3D scan of FIG. 2.
FIG. 4 is a view showing an example of a preprocessing result for a point group according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating a point cloud matching method according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that they can readily be carried out by those skilled in the art. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Parts not related to the description are omitted from the drawings in order to explain the present invention clearly.

Throughout the specification, when an element is described as "including" a component, this does not exclude other components unless specifically stated to the contrary, and does not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.

In this specification, the term "part" means a unit realized by hardware, by software, or by a combination of both; one unit may be realized using two or more pieces of hardware, and two or more units may be realized by one piece of hardware.

Hereinafter, the point cloud matching apparatus and matching method according to an embodiment of the present invention will be described in detail with reference to the drawings.

FIG. 1 is a block diagram showing the configuration of a point cloud matching apparatus according to an embodiment of the present invention.

Referring to FIG. 1, the point cloud matching apparatus 100 includes a point group preprocessing unit 110, a point cloud database 120, an input data set generation unit 130, a subset generation unit 140, a mathematical model generation unit 150, a mathematical model evaluation unit 160, and a point group matching unit 170.

As shown in FIG. 1, the point cloud matching apparatus 100 interoperates with the 3D scanner 200 to acquire and process the data (i.e., point clouds) generated by the 3D scanner 200. In FIG. 1, the point cloud matching apparatus 100 and the 3D scanner 200 are shown as separate devices; however, in another embodiment of the present invention the point cloud matching apparatus 100 may be integrated with the 3D scanner 200 into a single device, or may be included as a component within the scanner.

The 3D scanner 200 scans an object and generates data for a plurality of points, that is, a point cloud. In general, a single scan cannot cover the entire outer shape of the object, so a plurality of scanned point groups may be generated by repeating the scanning process several times.

The point cloud is a set of point data obtained through scanning, and each point has its own three-dimensional position represented by at least (x, y, z) coordinates. Depending on the scanning method of the scanner, each point may have additional attributes such as color, sensor position, and orientation. The coordinate system of each point may be the sensor-centered coordinate system, in which case the position of the sensor is set as the origin and the direction toward the front of the sensor can be assigned to an arbitrary axis (e.g., the z axis).
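Purely as an illustration of the data layout described here (the patent does not prescribe one), a point cloud might be held as follows:

```python
import numpy as np

# One possible in-memory layout, assumed for illustration: an (N, 3) array of
# x, y, z coordinates in the sensor-centred frame (sensor at the origin, z axis
# pointing out of the sensor front), with optional per-point attributes such as
# RGB colour kept in parallel arrays.
num_points = 1000
points = np.zeros((num_points, 3), dtype=np.float64)   # x, y, z per point
colors = np.zeros((num_points, 3), dtype=np.uint8)     # optional RGB per point
```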

FIG. 2 is a view showing an example of an object to be 3D-scanned according to an embodiment of the present invention, and FIG. 3 is a view showing a point group resulting from the 3D scan of FIG. 2.

For example, the 3D scanner 200 may scan a real object as shown in FIG. 2 and transmit the result to the point group preprocessing unit 110. FIG. 3 shows the result of scanning the object of FIG. 2. That is, the 3D scanner 200 can transmit the raw, unpreprocessed point group shown in FIG. 3, together with other data, to the point group preprocessing unit 110.

In addition, to improve matching accuracy, the 3D scanner 200 may attach a reflection marker to the scan target and generate a point cloud for matching calculation in addition to the general point cloud. FIG. 2 shows a reflection marker P11 attached to the object P10 for more precise matching. Thus, the scanning method of the 3D scanner 200 may separately acquire a point cloud for matching calculation using the reflection marker information.

Referring back to FIG. 1, the point group preprocessing unit 110 receives the point group scanned by the 3D scanner 200, performs predetermined preprocessing, and stores the result in the point cloud database 120. In doing so, the point group preprocessing unit 110 removes unnecessary points from the entire scanned point group or generates a point cloud for matching calculation that consists only of major feature points.

Specifically, the point group preprocessing unit 110 removes outliers of the scan, that is, points that do not lie on the surface of the object but in empty space.

FIG. 4 is a view showing an example of a preprocessing result for a point group according to an embodiment of the present invention; FIGS. 4(a) and 4(b) each show an example of a point group from which such outliers have been removed.

The point group preprocessing unit 110 may remove outlier points as follows. First, if the average distance from a point to a certain number of its neighboring points is equal to or greater than a certain value, the point is judged not to lie on the surface of the object and is removed. Then, neighboring points are connected to each other to construct clustered components, and all components except the largest one are removed. In this way, the points that do not lie on the surface of the scan target can be eliminated. If the size of the resulting point group (i.e., the number of points) is still judged to exceed a specific size (for example, several tens of thousands to millions of points), the point group preprocessing unit 110 may apply a feature detection technique such as Fast Point Feature Histograms (FPFH) to separately generate a point cloud for matching calculation that consists only of feature points. The point group preprocessing unit 110 then stores the generated point group and the point cloud for matching calculation in the point cloud database 120.
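A minimal sketch of the first removal step, assuming a KD-tree neighbour search; the neighbour count and distance threshold are illustrative values, not taken from the patent:

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_outliers(points, k=8, max_mean_dist=0.05):
    """Drop points whose mean distance to their k nearest neighbours is at or
    above a threshold, i.e. points unlikely to lie on the scanned surface."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)   # first neighbour is the point itself
    mean_dist = dists[:, 1:].mean(axis=1)
    return points[mean_dist < max_mean_dist]

# The second step described above (keeping only the largest cluster of mutually
# close points) can be built on cKDTree.query_pairs plus
# scipy.sparse.csgraph.connected_components.
```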

Referring again to FIG. 1, the point cloud database 120 stores the scanned or matched point groups and provides the corresponding data in response to queries from the other components (i.e., the input data set generation unit 130 and the point group matching unit 170).

The input data set generation unit 130 generates the input data set used in the RANSAC operation of the point cloud matching process.

Specifically, the input data set generation unit 130 acquires the point group data to be matched (general point groups or point clouds for matching calculation) from the point cloud database 120, and generates an input data set from the two point groups to be matched (i.e., the reference point group and the other point group).

For reference, in conventional RANSAC every pair of two points, one arbitrarily selected from each point group, belongs to the input data set, so the size of the input data set is N² (where N is the size of a point group; for simplicity, the two point groups are assumed to have the same size N). The main reason RANSAC takes so long to solve the matching problem is precisely that the input data set has size N² rather than N. In one embodiment of the present invention, therefore, the size of the input data set is greatly reduced without degrading quality, which increases the efficiency of RANSAC.

Specifically, the input data set generation unit 130 can generate the input data set in the following manner.

First, given a point p belonging to the reference point group P and a point q belonging to the other point group Q, s(p, q) is defined as the edge sharing rate of p and q. Here, the edge sharing rate s(p, q) is the number of edges of the same length shared between E(P, p), the set of edges connected to p within P, and E(Q, q), the set of edges connected to q within Q, divided by N. If the two points p and q are actually matching (overlapping) points, their edge sharing rate is the largest among the edge sharing rates between p and the points of Q. In other words, a large edge sharing rate means a high probability that the two points p and q match each other.

Unlike conventional RANSAC, which considers every point of Q a potential match for a point p of P, an embodiment of the present invention, based on the above observation, regards only the K points of Q with the highest edge sharing rate as potential matches for p, as in the sketch below. In this way, the size of the input data set is greatly reduced, to K * N.
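The reduction can be sketched in Python as follows. The use of edge lengths to compare E(P, p) and E(Q, q), the tolerance, and all names are assumptions on our part, since the patent text does not pin down the edge definition:

```python
import numpy as np

def reduced_input_data_set(P, Q, K=10, tol=1e-3):
    """For each point p_i of the reference group P, keep only the K points of Q
    whose edge sharing rate with p_i is highest, giving K * N candidate pairs.
    The edge sharing rate is read here as the number of edge lengths around p_i
    that also occur (within tol) around q_j, divided by N."""
    N = len(P)
    dP = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)  # edge lengths within P
    dQ = np.linalg.norm(Q[:, None, :] - Q[None, :, :], axis=2)  # edge lengths within Q
    pairs = []
    for i in range(N):
        shared = np.array([
            np.sum(np.abs(dP[i][:, None] - dQ[j][None, :]).min(axis=1) < tol)
            for j in range(N)
        ])
        s = shared / N                      # edge sharing rate s(p_i, q_j) for all j
        for j in np.argsort(-s)[:K]:        # K points of Q with the highest rate
            pairs.append((i, int(j)))
    return pairs                            # reduced input data set of size K * N
```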

For example, when the size of each point group is 100 (N = 100) and the overlap ratio is 30% (a = 0.3, i.e., 30 points overlap), conventional RANSAC needs approximately 170 million iterations to find a suitable model (i.e., a transformation matrix) with 99% probability (p = 0.99). In one embodiment of the present invention, by contrast, the number of iterations needed to find a suitable model with probability p is given by Equation 2 below.

[Equation 2]

k = log(1 - p) / log(1 - (a/K)^3)

Using Equation 2 with K = 10, a suitable model (i.e., a transformation matrix) can be found in about 170,000 RANSAC iterations. That is, with the input data set generation method according to an embodiment of the present invention, the number of iterations is reduced by a factor of about 1,000 or more compared with conventional RANSAC processing, as the calculation below illustrates.
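For concreteness, the two iteration counts can be checked with the standard RANSAC bound; the small script below is illustrative only:

```python
import math

def ransac_iterations(p, inlier_ratio, sample_size=3):
    """Standard RANSAC bound: iterations needed to draw at least one
    all-inlier sample of the given size with probability p."""
    return math.log(1 - p) / math.log(1 - inlier_ratio ** sample_size)

N, a, K, p = 100, 0.3, 10, 0.99
print(ransac_iterations(p, a / N))   # conventional input set (Equation 1): ~1.7e8
print(ransac_iterations(p, a / K))   # reduced input set (Equation 2):      ~1.7e5
```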

The subset generation unit 140 generates an arbitrary subset of the input data set.

The subset generation unit 140 generates a subset by arbitrarily selecting three point pairs from the given input data set. For reference, three pairs are selected because at least three correspondences are needed to solve for the transformation matrix.

The mathematical model generation unit 150 calculates and generates a mathematical model from the subset.

The mathematical model generation unit 150 receives the subset, and constructs a transformation matrix, which is a mathematical model of the matching problem.

At this time, the transformation matrix is composed of the rotation matrix of Equation 3 and the translation matrix of Equation 4.

[Equation 3: the rotation matrix R]

[Equation 4: the translation matrix t]

When the subset is given as {(p1, q1), (p2, q2), (p3, q3)}, the transformation matrix that maps each q_i onto p_i is obtained by singular value decomposition (SVD), as in the sketch below.
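A minimal sketch of this SVD step, assuming the standard Kabsch/Umeyama procedure (the patent does not give its exact formulation):

```python
import numpy as np

def rigid_transform_from_pairs(P_pts, Q_pts):
    """Estimate the rotation R and translation t mapping the points Q_pts onto
    the corresponding points P_pts in a least-squares sense via SVD.
    P_pts, Q_pts: (m, 3) arrays of corresponding points, m >= 3."""
    p_c, q_c = P_pts.mean(axis=0), Q_pts.mean(axis=0)
    H = (Q_pts - q_c).T @ (P_pts - p_c)            # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T        # rotation matrix (Equation 3)
    t = p_c - R @ q_c                              # translation vector (Equation 4)
    return R, t

# Usage with a subset {(p1, q1), (p2, q2), (p3, q3)} drawn from the input data set:
# R, t = rigid_transform_from_pairs(P[[i1, i2, i3]], Q[[j1, j2, j3]])
```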

The mathematical model evaluation unit 160 evaluates the quality of the mathematical model (transformation matrix) generated by the mathematical model generation unit 150.

At this time, the mathematical model evaluation unit 160 may process the following operations to evaluate the quality of the mathematical model.

First, the transformation matrix is applied to all points of the point group Q (the group other than the reference point group), moving them to new positions. Let the moved points be denoted q'.

Next, for each point p_i of the reference point group P, the closest transformed point q_i' is found; the two points are considered matched if the distance between them is smaller than a specific value (for example, substantially zero). The number of matched pairs whose distance is at or below a predetermined threshold is then counted, as sketched below.
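A minimal sketch of this counting step, assuming a KD-tree nearest-neighbour search; the threshold value and function name are illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def count_matching_pairs(P, Q, R, t, threshold=1e-3):
    """Apply the candidate transform to every point of Q and count how many
    transformed points land within `threshold` of their nearest point of the
    reference group P."""
    Q_moved = Q @ R.T + t          # q' = R q + t for every point q of Q
    dists, _ = cKDTree(P).query(Q_moved, k=1)
    return int(np.sum(dists <= threshold))
```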

Based on the counted result, it is determined whether the current state of the mathematical model meets a predetermined iteration termination condition. The termination condition may be, for example, that the number of point pairs whose distance is at or below the threshold is equal to or greater than a predetermined number.

If the current state of the mathematical model does not satisfy the iteration termination condition, the process of generating the input data set through the input data set generation unit 130 is repeated. On the other hand, if the current state of the mathematical model satisfies the iteration termination condition, the quality satisfaction result for the transformation matrix is transmitted to the point group matching unit 170.

The point group matching unit 170 converts the point cloud into the coordinate system of the predetermined reference point group using the generated transformation matrix (the mathematical model).

At this time, the point group matching unit 170 may convert the point cloud using the best-quality mathematical model among the transformation matrices generated over one or more rounds of mathematical model creation, and store the result in the point cloud database 120.

Meanwhile, the point cloud matching apparatus 100 according to an embodiment of the present invention described above may be implemented in a form including a memory (not shown) and a processor (not shown).

That is, the memory (not shown) stores a program implementing the series of operations and algorithms described above for generating a reduced input data set from the 3D scan data (i.e., point clouds) and performing point cloud matching by applying the RANSAC technique to it. The program stored in the memory (not shown) may be a single program implementing all of the operations processed by the respective components of the point cloud matching apparatus 100, or a set of interlinked programs. The processor (not shown) executes the program stored in the memory (not shown); as it does so, the operations and algorithms of the respective components of the point cloud matching apparatus 100 described above are performed. For reference, the components of the point cloud matching apparatus 100 may be implemented in software or in hardware such as an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit), and perform predetermined roles. However, 'components' are not limited to software or hardware; each component may be configured to reside on an addressable storage medium and to run on one or more processors. Thus, by way of example, a component may include software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The components and the functions provided within them may be combined into a smaller number of components or further separated into additional components.

Hereinafter, a point cloud matching method for 3D scanning data using the point cloud matching apparatus 100 according to an embodiment of the present invention will be described in detail with reference to FIG. 5.

FIG. 5 is a flowchart illustrating a point cloud matching method according to an embodiment of the present invention.

First, the 3D scanner acquires point cloud data by scanning the real object (S510).

The obtained point groups may include a general point group produced by scanning without additional processing, and a point group for matching calculation that further includes the information of a reflection marker for precise matching.

Next, predetermined pre-processing is performed on the obtained point group (S520).

At this time, preprocessing to eliminate unnecessary points from the entire point group and preprocessing to generate a point cloud for matching computation consisting only of the main feature points can be performed. More specifically, outliers are removed from the scanned point cloud; in addition, if the size of the point cloud after this removal is judged to be larger than a specific size, the point cloud for matching calculation can be generated by applying a predetermined feature-point detection algorithm.

Next, the preprocessed point groups are stored in the point cloud database (S530), and an input data set is generated for the stored point groups (S540).

Specifically, after the preprocessed point groups are acquired, one of them is set as the reference point group and another point group to be matched against it is selected, and the input data set is generated from them. The edge sharing rate between each point of the reference point group and every point of the other point group is calculated; for each point of the reference point group, a certain number of points of the other group are selected in descending order of edge sharing rate, and the resulting point pairs form the input data set.

Then, an arbitrary subset is generated from the generated input data set (S550).

At this time, an arbitrary number of point pairs are selected from the input data set to form the subset.

Then, predetermined mathematical modeling is applied to the generated subset to produce a mathematical model (i.e., a transformation matrix) (S560).

At this time, the transformation matrix, composed of the rotation matrix and the translation matrix, can be generated by processing a singular value decomposition for each point pair of the subset.

Next, the quality of the generated mathematical model is evaluated according to predetermined criteria (S570).

Specifically, the generated transformation matrix is provisionally applied to convert all points of the point group other than the reference point group to new positions (i.e., into the coordinate system of the reference point group). After the conversion, the number of point pairs in which the distance between a point of the reference point group and the closest transformed point is equal to or less than a preset threshold is counted. If the number of such pairs is equal to or greater than a preset reference number, the quality of the mathematical model is judged satisfactory.

If the quality of the mathematical model is not satisfactory as a result of the evaluation in step S570, the process returns to step S540 and repeats from the generation of the input data set. In this case, a smaller number of points can be selected, starting from the highest edge sharing rate, when regenerating the input data set; that is, the selection criterion for the edge sharing rate can be raised.

On the other hand, if the quality of the mathematical model is satisfactory as a result of the evaluation in step S570, the transformation matrix is applied to match the 3D scanned point groups (S580).

At this time, the entire point group is converted into the coordinate system of the reference point group by applying the transformation matrix.

For reference, the steps up to S570 may be performed repeatedly so that the highest-quality mathematical model is selected, and step S580 may then be performed by applying that best-quality model, as the overall sketch below illustrates.
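Putting the pieces together, the sketch below strings steps S540 to S580 into one loop, reusing the helper sketches given earlier (reduced_input_data_set, rigid_transform_from_pairs, count_matching_pairs); the iteration budget and inlier target are assumptions, not values from the patent:

```python
import random

def register(P, Q, K=10, max_iters=200_000, min_inliers=25):
    """RANSAC registration over the reduced input data set; P and Q are
    assumed to be (N, 3) numpy arrays."""
    input_data_set = reduced_input_data_set(P, Q, K=K)      # S540: reduced input set
    best = (-1, None, None)
    for _ in range(max_iters):
        sample = random.sample(input_data_set, 3)           # S550: random subset
        R, t = rigid_transform_from_pairs(                  # S560: SVD model
            P[[i for i, _ in sample]], Q[[j for _, j in sample]])
        score = count_matching_pairs(P, Q, R, t)            # S570: quality
        if score > best[0]:
            best = (score, R, t)
        if score >= min_inliers:                            # termination condition
            break
    _, R, t = best
    return Q @ R.T + t, (R, t)                              # S580: registered Q
```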

The point cloud matching method for 3D scanned point clouds in the point cloud matching apparatus 100 according to the embodiment of the present invention described above may also be implemented in the form of a computer program stored in a medium and executed by a computer, or of a recording medium containing computer-executable instructions. Computer-readable media can be any available media that can be accessed by a computer and include volatile and nonvolatile media and removable and non-removable media. Computer-readable media may also include both computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Communication media typically include computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transport mechanism, and include any information delivery media.

While the methods and systems of the present invention have been described in connection with specific embodiments, some or all of those elements or operations may be implemented using a computer system having a general purpose hardware architecture.

It will be understood by those skilled in the art that the foregoing description of the present invention is for illustrative purposes only and that various changes and modifications may be made without departing from the spirit or essential characteristics of the present invention. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive. For example, each component described as a single entity may be implemented in a distributed manner, and components described as distributed may be implemented in combined form.

The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as being included within the scope of the present invention.

100: point cloud matching apparatus
200: 3D Scanner

Claims (1)

A point cloud matching apparatus for 3D scanning data, comprising:
an input data set generation unit for generating an input data set based on an edge sharing rate between an arbitrary reference point group and another point group among the 3D scanned point groups;
a subset generation unit for generating a subset by selecting an arbitrary number of point pairs from the input data set;
a mathematical model generation unit for processing a singular value decomposition for each point pair of the subset to generate a transformation matrix; and
a point group matching unit for applying the transformation matrix to the other point group to convert its points into the coordinate system of the reference point group,
wherein the input data set generation unit
generates the input data set by matching, for each point of the reference point group, an arbitrary number of points starting from the point having the highest edge sharing rate among the points of the other point group.
KR1020160022770A 2016-02-25 2016-02-25 Apparatus and method for registration of point cloud KR20170100718A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160022770A KR20170100718A (en) 2016-02-25 2016-02-25 Apparatus and method for registration of point cloud

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160022770A KR20170100718A (en) 2016-02-25 2016-02-25 Apparatus and method for registration of point cloud

Publications (1)

Publication Number Publication Date
KR20170100718A true KR20170100718A (en) 2017-09-05

Family

ID=59924961

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160022770A KR20170100718A (en) 2016-02-25 2016-02-25 Apparatus and method for registration of point cloud

Country Status (1)

Country Link
KR (1) KR20170100718A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107860346A (en) * 2017-09-30 2018-03-30 北京卫星制造厂 A kind of measuring coordinate system method for registering
CN107860346B (en) * 2017-09-30 2019-12-20 北京卫星制造厂 Registration method for measuring coordinate system
CN108022262A (en) * 2017-11-16 2018-05-11 天津大学 A kind of point cloud registration method based on neighborhood of a point center of gravity vector characteristics
KR20220018371A (en) * 2020-08-06 2022-02-15 국방과학연구소 Method and apparatus for matching point cloud data for 3D terrain model reconstruction
KR20220081776A (en) * 2020-12-09 2022-06-16 주식회사 맵퍼스 Point cloud matching method and point cloud matching device
CN113267122A (en) * 2021-05-12 2021-08-17 温州大学瓯江学院 Industrial part size measurement method based on 3D vision sensor

Similar Documents

Publication Publication Date Title
US11556797B2 (en) Systems and methods for polygon object annotation and a method of training an object annotation system
CN111899163B (en) Efficient structure preservation to generate single image super-resolution in an antagonistic network
Wang et al. Fusing bird’s eye view lidar point cloud and front view camera image for 3d object detection
Li et al. DeepI2P: Image-to-point cloud registration via deep classification
US11017210B2 (en) Image processing apparatus and method
KR20170100718A (en) Apparatus and method for registration of point cloud
Sodhi et al. In-field segmentation and identification of plant structures using 3D imaging
KR20160147491A (en) Apparatus and method for 3D model generation
CN110197109B (en) Neural network model training and face recognition method, device, equipment and medium
CN111028327A (en) Three-dimensional point cloud processing method, device and equipment
KR101723738B1 (en) Apparatus and method for resolution enhancement based on dictionary learning
CN114724120B (en) Vehicle target detection method and system based on radar vision semantic segmentation adaptive fusion
Sun et al. 3DRIMR: 3D reconstruction and imaging via mmWave radar based on deep learning
Dewan et al. Learning a local feature descriptor for 3d lidar scans
CN115063768A (en) Three-dimensional target detection method, encoder and decoder
Zhang et al. Robust projective template matching
Kang et al. Primitive fitting based on the efficient multibaysac algorithm
Li et al. Primitive fitting using deep geometric segmentation
CN113052890A (en) Depth truth value acquisition method, device and system and depth camera
CN113344784B (en) Optimizing a supervisory generated countermeasure network through latent spatial regularization
CN117649602B (en) Image processing method and system based on artificial intelligence
Yang et al. Multiscale adjacency matrix CNN: Learning on multispectral LiDAR point cloud via multiscale local graph convolution
Jiang Three-Dimensional Data Registration in Laser based 3D Scanning Reconstruction
Kohek et al. Estimation of projection matrices from a sparse set of feature points for 3D tree reconstruction from multiple images
CN117934573B (en) Point cloud data registration method and device and electronic equipment