CN114926549B - Three-dimensional point cloud processing method, device, equipment and storage medium - Google Patents


Info

Publication number
CN114926549B
Authority
CN
China
Prior art keywords
point cloud
cloud data
dimensional point
transformation matrix
sets
Prior art date
Legal status
Active
Application number
CN202210629210.1A
Other languages
Chinese (zh)
Other versions
CN114926549A (en)
Inventor
刘时磊
姚文强
陈嘉文
沈健
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority claimed from CN202210629210.1A
Publication of CN114926549A
Application granted
Publication of CN114926549B


Classifications

    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/33: Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06T 2207/10028: Range image; depth image; 3D point clouds


Abstract

The disclosure provides a three-dimensional point cloud data processing method, device, equipment and storage medium, relating to the field of artificial intelligence, in particular to image processing, 3D vision and deep learning technologies, and applicable to intelligent cloud and smart city scenarios. The specific implementation scheme is as follows: acquiring two sets of three-dimensional point cloud data; determining, based on the two sets of three-dimensional point cloud data, the plane groups respectively corresponding to the two sets, wherein a plane group is a combination of a plurality of planes constructed for the same object; constructing, based on the plane groups, the feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data; determining an initial transformation matrix based on the two feature point lattices; and determining, based on the initial transformation matrix, a target transformation matrix, which is the transformation matrix used for registration. According to this technical scheme, the precision of coarse registration can be improved, along with the intelligence and universality of coarse registration.

Description

Three-dimensional point cloud processing method, device, equipment and storage medium
Technical Field
The disclosure relates to the field of artificial intelligence, in particular to image processing, 3D vision and deep learning technologies, which can be applied to intelligent cloud and smart city scenes.
Background
With the rapid development of lidar technology, registration and fusion between the three-dimensional point clouds of multiple detectors are receiving more and more attention. Three-dimensional point cloud registration is mainly divided into coarse registration, which pursues speed and automation, and fine registration, which pursues high precision. In the related art, the implementation process of coarse point cloud registration is poor in intelligence and has large errors.
Disclosure of Invention
The disclosure provides a three-dimensional point cloud data processing method, device, equipment and storage medium.
According to a first aspect of the present disclosure, there is provided a three-dimensional point cloud data processing method, including:
acquiring two sets of three-dimensional point cloud data;
determining, based on the two sets of three-dimensional point cloud data, plane groups respectively corresponding to the two sets, wherein a plane group is a combination of a plurality of planes constructed for the same object;
constructing, based on the plane groups respectively corresponding to the two sets, feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data;
determining an initial transformation matrix based on the feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data;
determining, based on the initial transformation matrix, a target transformation matrix, which is a transformation matrix for registration.
According to a second aspect of the present disclosure, there is provided a three-dimensional point cloud data processing apparatus, including:
an acquisition unit, configured to acquire two sets of three-dimensional point cloud data;
a first determining unit, configured to determine, based on the two sets of three-dimensional point cloud data, plane groups respectively corresponding to the two sets, wherein a plane group is a combination of a plurality of planes constructed for the same object;
a construction unit, configured to construct, based on the plane groups, feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data;
a second determining unit, configured to determine an initial transformation matrix based on the feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data;
a third determining unit, configured to determine a target transformation matrix based on the initial transformation matrix, the target transformation matrix being a transformation matrix for registration.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method provided in the first aspect above.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method provided in the first aspect above.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method provided by the first aspect described above.
According to the technical scheme, the speed and the precision of coarse registration can be improved, and the intelligence and universality of registration are improved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a flow diagram of a three-dimensional point cloud data processing method according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow diagram of coarse matching based on three-dimensional point cloud data according to an embodiment of the present disclosure;
Fig. 3a is a schematic diagram of a plane set and a feature point corresponding to one set of three-dimensional point cloud data according to an embodiment of the disclosure, and fig. 3b is a schematic diagram of a plane set and a feature point corresponding to another set of three-dimensional point cloud data according to an embodiment of the disclosure;
FIG. 4 is a schematic diagram of a feature point lattice corresponding to two sets of three-dimensional point cloud data according to an embodiment of the disclosure;
Fig. 5a is a bird's eye view of two sets of three-dimensional point cloud data before rough matching according to an embodiment of the present disclosure, and fig. 5b is a bird's eye view of two sets of three-dimensional point cloud data after rough matching according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the effect of further performing a fine match after completing a coarse match in accordance with an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a three-dimensional point cloud data processing apparatus according to an embodiment of the present disclosure;
fig. 8 is a block diagram of an electronic device used to implement a three-dimensional point cloud data processing method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The terms "first", "second", "third" and the like in the description, the claims and the above figures are used to distinguish between similar objects and are not necessarily used to describe a particular sequence or chronological order. Furthermore, the terms "comprise" and "have", and any variations thereof, are intended to cover a non-exclusive inclusion: a process, method, system, article or device that comprises a series of steps or elements is not necessarily limited to those steps or elements explicitly listed, but may include other steps or elements not explicitly listed or inherent to such process, method, article or device.
In the related art, coarse registration methods mainly include the following: (1) manually searching for one-to-one corresponding feature points in the two point clouds; (2) automatically searching for feature points; (3) manually searching for feature line segments; (4) manually searching for and selecting planes; (5) aiding calibration with a special calibration tool (e.g., a calibration sphere). However, each of these coarse registration methods has drawbacks. The manual process of method (1) is cumbersome, and errors exist because the detected point cloud data do not correspond strictly one to one. Method (2) involves a large amount of calculation and strict data requirements. The manual process of method (3) is cumbersome, and because of the discreteness of point cloud data, errors exist in the lengths of the line segments. Method (4) involves a large amount of calculation when searching planes globally, a cumbersome manual selection process, and a complicated iterative calculation. Method (5) requires additionally preparing calibration tools, and for smaller calibration tools the calculation range needs to be determined manually, so its universality is weak.
In order to at least partially solve one or more of the above problems and other potential problems, the present disclosure proposes a three-dimensional point cloud data processing method, and by using the technical solution of the embodiment of the present disclosure, a target transformation matrix may be rapidly obtained, and coarse registration is performed by using the target transformation matrix, so that the speed and accuracy of coarse registration may be improved, and the intelligence of coarse registration may be improved. In addition, the universality is stronger because no extra calibration tool is needed.
An embodiment of the present disclosure provides a three-dimensional point cloud data processing method. Fig. 1 is a schematic flow diagram of the three-dimensional point cloud data processing method according to an embodiment of the present disclosure. The method may be applied to a three-dimensional point cloud data processing apparatus located in an electronic device, which includes but is not limited to a fixed device and/or a mobile device. For example, the fixed device includes but is not limited to a server, which may be a cloud server or a general server; the mobile device includes but is not limited to one or more of a mobile phone, a tablet computer and a vehicle-mounted terminal. In some possible implementations, the method may also be implemented by a processor invoking computer-readable instructions stored in a memory. As shown in fig. 1, the three-dimensional point cloud data processing method includes:
s101: acquiring two groups of three-dimensional point cloud data;
s102: based on the two sets of three-dimensional point cloud data, determining plane sets respectively corresponding to the two sets of three-dimensional point cloud data, wherein the plane sets are combinations of a plurality of planes constructed for the same object;
s103: constructing characteristic point lattices corresponding to the two groups of three-dimensional point cloud data respectively based on the plane groups corresponding to the two groups of three-dimensional point cloud data respectively;
S104: determining an initial transformation matrix based on the characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data;
S105: based on the initial transformation matrix, a target transformation matrix is determined, which is a transformation matrix for registration.
In the embodiment of the disclosure, the two sets of three-dimensional point cloud data may be three-dimensional point cloud data acquired by the same lidar (Light Detection and Ranging, LiDAR) at different times. Here, the different times are denoted as a first time and a second time, and the interval between the first time and the second time is smaller than a preset threshold. The preset threshold may be set or adjusted according to design requirements such as accuracy or speed requirements.
In the embodiment of the disclosure, the two sets of three-dimensional point cloud data may also be three-dimensional point cloud data acquired by two radars at the same time.
In the embodiment of the disclosure, the two sets of three-dimensional point cloud data may also be three-dimensional point cloud data acquired by two radars at different times. Here, if the different times are denoted as a third time and a fourth time, the interval between the third time and the fourth time is smaller than the preset threshold, which may likewise be set or adjusted according to design requirements such as accuracy or speed requirements.
It should be noted that the present disclosure does not limit the specific manner in which the two sets of three-dimensional point cloud data are collected.
In some embodiments, acquiring the two sets of three-dimensional point cloud data includes: receiving two sets of three-dimensional point cloud data sent by the lidar. In other embodiments, it includes: receiving two sets of three-dimensional point cloud data collected by the lidar and forwarded by other equipment. The above is merely exemplary and is not intended to exhaustively enumerate all possible ways of acquiring the two sets of three-dimensional point cloud data.
In an embodiment of the disclosure, the two sets of three-dimensional point cloud data include point cloud data of at least one common region. For example, if the first set of point cloud data includes point cloud data of an area A and an area B, and the second set includes point cloud data of the area B and an area C, then both sets include point cloud data of the area B. The point cloud data for the area B in the two sets serve as the data basis for subsequently determining the plane groups.
In the disclosed embodiment, a plane group is a combination of a plurality of planes constructed for the same object. Each plane group comprises at least 3 planes, and these 3 planes can intersect, i.e. the 3 planes have a common intersection point. The number of planes included in each plane group is not strictly limited; the more planes, the higher the calculation precision.
In the embodiment of the disclosure, the same object is an object in the common area corresponding to the two sets of three-dimensional point cloud data. The present disclosure does not limit the type of the object. For example, the object may be a building, a dustbin or a carton. In some embodiments, an object may be selected that has at least one surface parallel to the ground, which facilitates the subsequent construction of the plane group. The above is merely illustrative and is not intended to exhaustively enumerate all possible types of objects.
In the embodiment of the disclosure, the feature point lattices corresponding to the two sets of three-dimensional point cloud data are lattices composed of feature points selected from the plurality of feature points included in the plane group. The number of feature points in each feature point lattice is greater than or equal to 4; when a feature point lattice comprises exactly 4 feature points, at least one of the 4 feature points must not lie on the same plane as the other 3.
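The non-coplanarity condition above can be checked numerically. The sketch below (a hypothetical helper, not part of the disclosed method) centers the candidate feature points and computes the rank of the resulting matrix; rank 3 means the points span all three dimensions, i.e. they do not all lie on one plane.

```python
import numpy as np

def is_valid_lattice(points, tol=1e-8):
    """Check the minimal lattice condition: at least 4 points,
    and not all of them lying on one plane (rank 3 after centering)."""
    pts = np.asarray(points, dtype=float)
    if pts.shape[0] < 4:
        return False
    centered = pts - pts.mean(axis=0)
    return np.linalg.matrix_rank(centered, tol=tol) == 3

# Four points spanning a tetrahedron are valid; four coplanar points are not.
tetra = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
flat = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
```

For a lattice with more than 4 points the same rank test applies unchanged.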
In an embodiment of the present disclosure, the initial transformation matrix is a transformation matrix for registration. The initial transformation matrix is a non-rigid transformation matrix.
In an embodiment of the present disclosure, the target transformation matrix is a transformation matrix for registration, the target transformation matrix being a rigid transformation matrix.
According to the above technical scheme, plane groups respectively corresponding to the two sets of three-dimensional point cloud data are determined based on the two sets; feature point lattices respectively corresponding to the two sets are constructed based on the plane groups; an initial transformation matrix is determined based on the feature point lattices; and a target transformation matrix, i.e. the transformation matrix for registration, is determined based on the initial transformation matrix. Because a plane group is a combination of a plurality of planes constructed for the same object, the target transformation matrix can be obtained through plane matching. By adopting this surface-matching registration scheme, the target transformation matrix can be obtained rapidly, and performing coarse registration with the target transformation matrix improves the speed, accuracy and intelligence of coarse registration. In addition, since no extra calibration tool is needed, the universality of registration is improved.
In some embodiments, determining the plane groups respectively corresponding to the two sets of three-dimensional point cloud data based on the two sets includes: acquiring registration scene information; determining, from an algorithm library, a target algorithm matched with the registration scene information; and fitting, based on the two sets of three-dimensional point cloud data, the plane groups respectively corresponding to the two sets by using the target algorithm.
Here, the plane group includes N planes, which are expressed by N plane equations, N being an integer of 3 or more. At least three planes of the N planes can intersect.
Here, the algorithm library includes at least one algorithm for fitting a plane. The present disclosure does not impose a limit on the number of algorithms included in the algorithm library.
Here, the registration scene information is information characterizing the usage scene of the registration. For example, the registration scene may be batch automatic calibration of whole vehicles before factory delivery, trajectory estimation with low-frequency sampling, or relative position determination of multiple sensors.
Here, different registration scenarios may correspond to different algorithms. The same registration scenario may correspond to one or more algorithms.
In practical application, two groups of three-dimensional point cloud data are fitted by adopting the same target algorithm to obtain two groups of plane groups corresponding to the three-dimensional point cloud data respectively, so that the reliability of a registration result can be improved, and the accuracy of the registration result can also be improved.
Therefore, by adopting a target algorithm matched with the registration scene, the correlation between the target transformation matrix and the registration scene can be improved, so that the reliability of the registration result is improved, and the accuracy of the registration result is also improved.
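The flow description below names RANSAC as one such plane-fitting algorithm; as an illustration of this step, the following sketch implements a basic RANSAC plane fit with NumPy. It is a generic textbook variant with assumed parameters (iteration count, inlier threshold), not the exact algorithm library of the disclosure.

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.02, rng=None):
    """Fit one plane a*x + b*y + c*z + d = 0 to a point cloud with RANSAC:
    repeatedly sample 3 points, build the candidate plane from their cross
    product, and keep the plane supported by the most inliers."""
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, dtype=float)
    best_plane, best_inliers = None, -1
    for _ in range(n_iters):
        p0, p1, p2 = pts[rng.choice(len(pts), size=3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:  # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p0
        dist = np.abs(pts @ normal + d)  # point-to-plane distances
        inliers = int((dist < dist_thresh).sum())
        if inliers > best_inliers:
            best_inliers, best_plane = inliers, np.append(normal, d)
    return best_plane  # (a, b, c, d) with unit normal (a, b, c)
```

In practice the three dominant planes of a corner would be extracted by running such a fit repeatedly and removing the inliers of each found plane.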
In some embodiments, constructing the feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data based on the plane groups includes: determining unit normal vectors for the planes in the plane groups respectively corresponding to the two sets of three-dimensional point cloud data, wherein the directions of the normal vectors of the planes in the two plane groups correspond to each other one by one; and constructing, from the intersection point of the three planes of each plane group, the feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data, so that the feature points constructed from the two sets of data correspond to each other one by one.
Here, consistent normal vector direction means that, under the respective coordinate systems of the two three-dimensional point clouds, the directions of the two sets of normal vectors correspond one to one. That is, if n1 in point cloud 1 points outward from a wall, the corresponding normal vector n1' in point cloud 2 should also point outward from that wall, not inward.
Therefore, two fitted plane groups are adopted to construct two groups of characteristic point lattices, a calculation basis is provided for the follow-up determination of an initial transformation matrix, and the accuracy of the initial transformation matrix is improved, so that the accuracy of a target transformation matrix is improved.
In some embodiments, determining the initial transformation matrix based on the feature point lattices to which the two sets of three-dimensional point cloud data respectively correspond includes: adding one-dimensional unit vectors to the first characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data to obtain second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data; and obtaining an initial transformation matrix according to the second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data and the relationship between the second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data.
G̃1^T = G̃2^T · T0, wherein G̃1 is the second feature point lattice serving as the reference, and G̃2 is the second feature point lattice to be transformed;
here G̃ denotes the lattice obtained by appending a one-dimensional unit vector (a row of ones) to the first feature point lattice G, so that its dimension is 4 × N G, where N G is the number of constructed feature points.
Since G̃2^T is a non-square matrix, a singular value decomposition (Singular Value Decomposition, SVD) algorithm can be used to solve for the initial transformation matrix T 0 in the least-squares sense, specifically T 0 = (G̃2^T)^+ · G̃1^T, where (·)^+ denotes the Moore-Penrose pseudo-inverse computed via SVD.
therefore, the initial transformation matrix can be obtained rapidly according to the formula, so that a calculation basis is provided for further solving the target transformation matrix, and the determination speed of the target transformation matrix is improved.
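One way to realize the SVD solving step is to pose the relation between the two homogeneous lattices as a linear least-squares problem; `np.linalg.lstsq` solves it via SVD internally. This sketch assumes the row-vector convention implied by the element indices later in the text (translation stored in row 4 of T 0); it is an illustration, not the patent's exact code.

```python
import numpy as np

def homogenize(G):
    """Append the unit row: 3 x N lattice -> 4 x N homogeneous lattice."""
    return np.vstack([G, np.ones((1, G.shape[1]))])

def initial_transform(G_ref, G_mov):
    """Solve homogenize(G_ref)^T ~= homogenize(G_mov)^T @ T0 in the
    least-squares sense. np.linalg.lstsq solves via SVD, matching the
    patent's SVD step. Returns a 4x4 T0 with translation in row 4."""
    A = homogenize(G_mov).T  # N x 4, lattice to be transformed
    B = homogenize(G_ref).T  # N x 4, reference lattice
    T0, *_ = np.linalg.lstsq(A, B, rcond=None)
    return T0
```

With exact, noise-free correspondences the recovered T0 equals the true transform; with noisy lattices it is the least-squares (generally non-rigid) estimate that the later steps rigidify.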
The initial transformation matrix T 0, which is found by SVD decomposition, is not a rigid transformation matrix, since the point cloud data and the fitted plane equation may have errors. In some embodiments, determining the target transformation matrix based on the initial transformation matrix includes: determining Euler angles and translation parameters through an initial transformation matrix; based on Euler angle and translation parameters, a target transformation matrix T is obtained.
Therefore, errors caused by point cloud data and fitting plane equations can be reduced, the non-rigid initial transformation matrix is updated to be a rigid target transformation matrix, and the accuracy of the target transformation matrix is improved, so that the speed and the accuracy of coarse registration are improved.
In some embodiments, determining euler angles and translation parameters from the initial transformation matrix includes:
the Euler angles are calculated according to the following formulas:

α = arctan(T 0,2,3 / T 0,3,3)
β = arctan(−T 0,1,3 / √(T 0,1,1² + T 0,1,2²))
γ = arctan(T 0,1,2 / T 0,1,1)

wherein α, β, γ are the Euler angles, T 0 is the initial transformation matrix, T 0,1,1 represents row 1 and column 1 of T 0, T 0,1,2 represents row 1 and column 2 of T 0, T 0,1,3 represents row 1 and column 3 of T 0, T 0,2,3 represents row 2 and column 3 of T 0, and T 0,3,3 represents row 3 and column 3 of T 0;
the translation parameters are calculated according to the following formula:

t x = T 0,4,1,  t y = T 0,4,2,  t z = T 0,4,3

where t x, t y, t z are the translation parameters, T 0,4,1 represents row 4, column 1 of T 0, T 0,4,2 represents row 4, column 2 of T 0, and T 0,4,3 represents row 4, column 3 of T 0.
Therefore, euler angles and translation parameters can be rapidly obtained through a calculation formula, so that preparation is made for subsequent determination of the target transformation matrix.
In some embodiments, deriving the target transformation matrix from the euler angles and the translation parameters includes:
The target transformation matrix is calculated according to the following formula:

T = ( R(α, β, γ)    0 )
    ( t x  t y  t z   1 )    (6)

wherein T represents the target transformation matrix, R(α, β, γ) is the 3 × 3 rotation block reconstructed from the Euler angles α, β and γ, and t x, t y, t z are the translation parameters occupying the fourth row.
Therefore, the target transformation matrix can be calculated and obtained quickly through a formula, and the determination speed and the accuracy of the target transformation matrix are improved, so that the registration speed and the registration accuracy are improved.
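The two steps above (extracting Euler angles and translation from T 0, then rebuilding a strictly rigid T) can be sketched as follows. This assumes the row-vector convention with translation stored in row 4 and a Z-Y-X Euler sequence; the patent's exact sign conventions may differ.

```python
import numpy as np

def to_rigid(T0):
    """Recover Euler angles and translation from a (possibly non-rigid) T0
    with translation stored in row 4, then rebuild a strictly rigid T.
    Indices here are 0-based; the patent's T 0,i,j notation is 1-based."""
    alpha = np.arctan2(T0[1, 2], T0[2, 2])                       # T 0,2,3 / T 0,3,3
    beta = np.arctan2(-T0[0, 2], np.hypot(T0[0, 0], T0[0, 1]))   # -T 0,1,3 / sqrt(...)
    gamma = np.arctan2(T0[0, 1], T0[0, 0])                       # T 0,1,2 / T 0,1,1
    t = T0[3, :3]
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx                   # rotation rebuilt from the angles
    T = np.eye(4)
    T[:3, :3] = R.T                    # row-vector convention stores R^T
    T[3, :3] = t
    return T
```

Applied to a T0 that is already rigid, the function returns it unchanged; applied to a slightly non-rigid least-squares estimate, it projects the rotation block back onto a proper rotation.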
Fig. 2 shows a schematic flow chart of rough matching based on three-dimensional point cloud data, as shown in fig. 2, the flow includes:
step a: collecting three-dimensional point cloud data, and determining datum point cloud data and point cloud data to be converted;
Step b: fitting equations of three planes in each group of point cloud data through a random sampling consensus algorithm (Random Sample Consensus, RANSAC);
obtaining 3 plane equations in each set of point cloud data according to formula (7):

a i·x + b i·y + c i·z + d i = 0,  (i = 1, 2, 3)    (7)
In order to ensure that the directions of the unit normal vectors of the planes are consistent between the two sets of point cloud data, the direction of n i (i = 1, 2, 3) can be made to face the data origin of the lidar by ensuring that the value of d i is positive.
Step c: calculating the intersection point I 0 of the 3 planes in each plane group;
I 0 = (x 0, y 0, z 0)    (8)
Step d: calculating normal unit vectors n1, n2 and n3 corresponding to each plane group;
G = { (x, y, z) | (x, y, z) = I 0 + (l·n1 + m·n2 + n·n3), l, m, n ∈ R }    (9)
The feature point lattices constructed from the reference point cloud data and the point cloud data to be transformed are G 1 and G 2 respectively; their dimensions are 3 × N G, where N G is the number of constructed feature points, and the two sets of constructed feature points correspond one to one. The distribution of the lattices in the point cloud space is shown in fig. 3a and fig. 3b, where fig. 3a shows the plane group and feature points corresponding to the reference point cloud data, and fig. 3b shows the plane group and feature points corresponding to the point cloud data to be transformed.
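Steps b through d can be sketched together: the three fitted planes are oriented so that d ≥ 0, their unit normals and common intersection point I 0 are computed, and lattice points I 0 + l·n1 + m·n2 + n·n3 are generated. The coefficient choices below are illustrative; any fixed set shared by both point clouds yields corresponding lattices.

```python
import numpy as np

def lattice_from_planes(planes, coeffs=(1.0, 2.0)):
    """Build a feature point lattice from three intersecting planes, each
    given as (a, b, c, d) with a*x + b*y + c*z + d = 0, per formulas
    (7)-(9): orient normals via d >= 0, solve for the intersection I0,
    then emit I0 and I0 + l*n_i for each coefficient l and normal n_i."""
    A = np.array([p[:3] for p in planes], dtype=float)
    d = np.array([p[3] for p in planes], dtype=float)
    flip = d < 0                          # force d >= 0 for a consistent
    A[flip] *= -1.0                       # normal orientation
    d[flip] *= -1.0
    normals = A / np.linalg.norm(A, axis=1, keepdims=True)
    i0 = np.linalg.solve(A, -d)           # intersection of the three planes
    points = [i0]
    for l in coeffs:
        for n in normals:
            points.append(i0 + l * n)
    return np.array(points)               # shape (1 + 3*len(coeffs), 3)
```

Running the same construction with the same coefficients on both plane groups produces the one-to-one corresponding lattices G 1 and G 2.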
Step e: generating the feature lattices.
Fig. 4 shows schematic diagrams of the feature point lattices corresponding to the two sets of three-dimensional point cloud data; in fig. 4, on the left is the lattice corresponding to the reference point cloud data, and on the right is the lattice corresponding to the point cloud data to be transformed.
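Formula (9) generates each lattice from I_0 and the unit normals. The patent does not state which coefficient values l, m, n are used, so the sketch below assumes a small integer grid; iterating the coefficients in a fixed order is what keeps the points of G_1 and G_2 in one-to-one correspondence.

```python
import numpy as np

def build_feature_lattice(I0, n1, n2, n3, coeff_range=(0, 1, 2)):
    """Feature lattice per formula (9): points I0 + l*n1 + m*n2 + n*n3.

    The coefficient set is unspecified in the patent; a small integer grid
    is assumed here, giving N_G = len(coeff_range)**3 points.  Iterating
    l, m, n in a fixed order aligns corresponding points of the two
    lattices one-to-one.
    """
    pts = [I0 + l * n1 + m * n2 + n * n3
           for l in coeff_range for m in coeff_range for n in coeff_range]
    return np.array(pts).T        # shape 3 x N_G, as in the text
```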
Step f: obtaining an initial transformation matrix T_0 through singular value decomposition;
step g: recovering the rotation Euler angles and the translation parameters from T_0 by back-substitution;
step h: updating T_0 to obtain the target transformation matrix T.
Here, the specific calculation formula may refer to formula (6), and will not be described herein.
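Step f admits the standard SVD (Kabsch) solution for two point sets in known correspondence; the sketch below is that generic construction, not necessarily the patent's exact one. Note that it uses the common column-vector convention (rotation in the top-left 3×3 block, translation in the fourth column), whereas the patent's T stores the translation in the fourth row.

```python
import numpy as np

def rigid_transform_svd(G1, G2):
    """Rigid transform aligning lattice G2 onto G1, both 3 x N_G with
    columns in one-to-one correspondence (Kabsch algorithm)."""
    c1 = G1.mean(axis=1, keepdims=True)
    c2 = G2.mean(axis=1, keepdims=True)
    H = (G2 - c2) @ (G1 - c1).T           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c1 - R @ c2
    T = np.eye(4)                          # homogeneous 4x4 transform
    T[:3, :3], T[:3, 3] = R, t.ravel()
    return T
```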
As shown in figs. 5a and 5b, fig. 5a is a bird's-eye view of the two sets of three-dimensional point cloud data before coarse matching, and fig. 5b is a bird's-eye view after coarse matching; it can be seen that, after coarse matching, the point clouds of the same object overlap well.
On the basis of the coarse matching result shown in fig. 5b, the Iterative Closest Point (ICP) algorithm is used to further perform fine matching; a schematic diagram of the fine-matching effect is shown in fig. 6.
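A minimal point-to-point ICP loop, shown only to illustrate the fine-matching step (a production system would use an optimized library such as Open3D or PCL rather than this brute-force sketch):

```python
import numpy as np

def icp(src, dst, n_iters=20):
    """Didactic point-to-point ICP refining the alignment of `src` (N x 3)
    onto `dst` (M x 3) after coarse matching: correspondences are
    re-estimated by nearest neighbour each iteration, then the rigid
    transform is re-solved via SVD (Kabsch)."""
    T = np.eye(4)
    cur = src.copy()
    for _ in range(n_iters):
        # brute-force nearest neighbours (fine for small clouds)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # solve the rigid transform for the current correspondences
        c_s, c_d = cur.mean(0), matched.mean(0)
        H = (cur - c_s).T @ (matched - c_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:           # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = c_d - R @ c_s
        cur = cur @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T                        # accumulate the refinement
    return T, cur
```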
Because the coarse registration flow disclosed here adopts a plane-matching scheme, it improves the speed, accuracy, and automation of coarse matching, and in turn improves the efficiency of fine registration. The method can be applied to calibration before delivery of a complete autonomous vehicle, or to any scenario requiring advance extrinsic calibration of multiple detectors; it places low demands on the calibration environment, generally requiring only three flat surfaces (such as walls).
It should be understood that the schematic diagrams shown in fig. 2, 3a, 3b, 4, 5a, 5b, and 6 are by way of example only and not limitation.
The three-dimensional point cloud data processing method provided by the present disclosure is suitable for most point cloud registration scenarios, and can be used for batch automatic calibration before complete vehicles leave the factory, trajectory estimation under low-frequency sampling, relative-pose determination for multiple sensors, and the like. In practical applications, the preparation required before registration may include: finding a region with a corner and ensuring that the walls and ground are as flat as possible, and ensuring that the three planes of the corner are all contained in the common region of the lidar's two samplings.
The embodiment of the disclosure provides a three-dimensional point cloud data processing device, as shown in fig. 7, which may include: an acquisition unit 701, configured to acquire two sets of three-dimensional point cloud data; a first determining unit 702, configured to determine, based on two sets of three-dimensional point cloud data, plane sets corresponding to the two sets of three-dimensional point cloud data respectively, where the plane sets are combinations of a plurality of planes constructed for the same object; a construction unit 703, configured to construct feature point lattices corresponding to the two sets of three-dimensional point cloud data respectively based on the two sets of plane sets corresponding to the three-dimensional point cloud data respectively; a second determining unit 704, configured to determine an initial transformation matrix based on feature point lattices corresponding to the two sets of three-dimensional point cloud data respectively; a third determining unit 705 for determining a target transformation matrix based on the initial transformation matrix, the target transformation matrix being a transformation matrix for registration.
In some embodiments, the first determining unit 702 includes: an acquisition subunit, configured to acquire registration scene information; the matching subunit is used for determining a target algorithm matched with the registration scene information from the algorithm library; and the fitting subunit is used for fitting and obtaining plane groups corresponding to the two groups of three-dimensional point cloud data respectively by adopting a target algorithm based on the two groups of three-dimensional point cloud data.
In some embodiments, the construction unit 703 comprises: the first determining subunit is used for determining unit normal vectors for planes in the plane groups corresponding to the two sets of three-dimensional point cloud data respectively, and the directions of the normal vectors of the planes in the plane groups corresponding to the two sets of three-dimensional point cloud data respectively correspond to each other one by one; and the construction subunit is used for constructing characteristic point lattices corresponding to the two sets of three-dimensional point cloud data according to the intersection points of the three planes of the plane sets corresponding to the two sets of three-dimensional point cloud data respectively, so that the characteristic points constructed by the two sets of three-dimensional point cloud data are in one-to-one correspondence.
In some embodiments, the second determining unit 704 includes: the transformation subunit is used for adding one-dimensional unit vectors to the first characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data to obtain second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data; the second determining subunit is configured to obtain an initial transformation matrix according to the second feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data and the relationships between the second feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data.
In some embodiments, the third determining unit 705 includes: a third determining subunit, configured to determine, through the initial transformation matrix, an euler angle; a fourth determining subunit, configured to determine a translation parameter through the initial transformation matrix; and a fifth determination subunit, configured to obtain a target transformation matrix based on the euler angle and the translation parameter.
In some embodiments, the third determining subunit is configured to:
the Euler angles are calculated according to the following formula:
where α, β, γ are the Euler angles, T_0 is the initial transformation matrix, T_{0,1,1} denotes row 1, column 1 of T_0, T_{0,1,2} denotes row 1, column 2 of T_0, T_{0,1,3} denotes row 1, column 3 of T_0, T_{0,2,3} denotes row 2, column 3 of T_0, and T_{0,3,3} denotes row 3, column 3 of T_0;
A fourth determination subunit, configured to calculate the translation parameter according to the following formula:
where t_x, t_y, t_z are the translation parameters, T_{0,4,1} denotes row 4, column 1 of T_0, T_{0,4,2} denotes row 4, column 2 of T_0, and T_{0,4,3} denotes row 4, column 3 of T_0.
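Since the extraction formulas themselves are reproduced as images in the patent, the sketch below assumes the common Z-Y-X factorization R = Rz(γ)·Ry(β)·Rx(α) for the rotation block and, following the T_{0,4,*} indexing, reads the translation from the fourth row; both assumptions are illustrative rather than the patent's exact formulas.

```python
import numpy as np

def euler_and_translation(T0):
    """Recover Euler angles and translation from a 4x4 transform whose
    rotation occupies the top-left 3x3 block and whose translation, per
    the T_{0,4,*} indexing, occupies the fourth row.

    Assumes R = Rz(gamma) @ Ry(beta) @ Rx(alpha) (Z-Y-X convention).
    """
    R = T0[:3, :3]
    beta = np.arcsin(-R[2, 0])                # R[2,0] = -sin(beta)
    alpha = np.arctan2(R[2, 1], R[2, 2])      # sin(a)cos(b), cos(a)cos(b)
    gamma = np.arctan2(R[1, 0], R[0, 0])      # sin(g)cos(b), cos(g)cos(b)
    tx, ty, tz = T0[3, :3]                    # fourth row: t_x, t_y, t_z
    return (alpha, beta, gamma), (tx, ty, tz)
```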
In some embodiments, the fifth determining subunit is configured to:
The target transformation matrix is calculated according to the following formula:
where T denotes the target transformation matrix, α, β, γ are the Euler angles, and t_x, t_y, t_z are the translation parameters.
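The composition step can be sketched as follows, again with illustrative assumptions: the Z-Y-X convention R = Rz(γ)·Ry(β)·Rx(α), and the translation stored in the fourth row as the T_{0,4,*} indexing suggests; the patent's own formula (6) is reproduced as an image.

```python
import numpy as np

def compose_transform(alpha, beta, gamma, tx, ty, tz):
    """Rebuild the target transform T from Euler angles and translation.

    Assumes R = Rz(gamma) @ Ry(beta) @ Rx(alpha), with the translation in
    the fourth row to mirror the patent's T_{0,4,*} indexing.
    """
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation block
    T[3, :3] = [tx, ty, tz]    # translation in the fourth row
    return T
```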
Those skilled in the art should understand that the functions of each processing module in the three-dimensional point cloud data processing apparatus of the embodiments of the present disclosure may be understood with reference to the foregoing description of the three-dimensional point cloud data processing method applied to the server. Each processing module in the apparatus may be implemented by an analog circuit that realizes the functions described in the embodiments of the present disclosure, or by running, on an electronic device, software that realizes those functions.
The three-dimensional point cloud data processing apparatus of the embodiments of the present disclosure can improve the speed and accuracy of coarse registration and make it more automated. In addition, because no extra calibration tool is needed, the method is more broadly applicable.
In the technical solution of the present disclosure, the collection, storage, and use of the user personal information involved all comply with the provisions of relevant laws and regulations and do not violate public order and good morals.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 8 illustrates a schematic block diagram of an example electronic device 800 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 8, the apparatus 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 802 or a computer program loaded from a storage unit 808 into a random access memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in device 800 are connected to I/O interface 805, including: an input unit 806 such as a keyboard, mouse, etc.; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, etc.; and a communication unit 809, such as a network card, modem, wireless communication transceiver, or the like. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 801 may be a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 801 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, digital signal processors (DSPs), and any suitable processors, controllers, microcontrollers, etc. The computing unit 801 performs the respective methods and processes described above, for example, the three-dimensional point cloud data processing method. For example, in some embodiments, the three-dimensional point cloud data processing method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 800 via the ROM 802 and/or the communication unit 809. When the computer program is loaded into the RAM 803 and executed by the computing unit 801, one or more steps of the three-dimensional point cloud data processing method described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the three-dimensional point cloud data processing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above can be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems-on-chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a flash memory, an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a cathode ray tube (CRT) or liquid crystal display (LCD) monitor) for displaying information to the user; and a keyboard and pointing device (e.g., a mouse or trackball) by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: a local area network (LAN), a wide area network (WAN), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the disclosed aspects are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (13)

1. A three-dimensional point cloud data processing method, comprising:
Acquiring two groups of three-dimensional point cloud data;
Determining plane groups corresponding to the two sets of three-dimensional point cloud data respectively based on the two sets of three-dimensional point cloud data, wherein the plane groups are combinations of a plurality of planes constructed for the same object;
constructing characteristic point lattices corresponding to the two sets of three-dimensional point cloud data respectively based on plane sets corresponding to the two sets of three-dimensional point cloud data respectively;
Determining an initial transformation matrix based on the characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data;
Determining a target transformation matrix based on the initial transformation matrix, the target transformation matrix being a transformation matrix for registration;
the constructing the feature point lattice corresponding to the two sets of three-dimensional point cloud data based on the plane sets corresponding to the two sets of three-dimensional point cloud data respectively includes:
Determining unit normal vectors for planes in the plane groups corresponding to the two sets of three-dimensional point cloud data respectively, wherein the directions of the normal vectors of the planes in the plane groups corresponding to the two sets of three-dimensional point cloud data correspond to each other one by one;
Constructing characteristic point lattices corresponding to the two sets of three-dimensional point cloud data according to the intersection points of three planes of the plane sets corresponding to the two sets of three-dimensional point cloud data respectively, so that the characteristic points constructed by the two sets of three-dimensional point cloud data are in one-to-one correspondence;
The determining an initial transformation matrix based on the feature point lattices respectively corresponding to the two sets of three-dimensional point cloud data comprises the following steps:
adding one-dimensional unit vectors to the first characteristic point lattices respectively corresponding to the two sets of three-dimensional point cloud data to obtain second characteristic point lattices respectively corresponding to the two sets of three-dimensional point cloud data;
And obtaining an initial transformation matrix according to the second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data and the relation between the second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data.
2. The method of claim 1, wherein the determining, based on the two sets of three-dimensional point cloud data, the planar sets to which the two sets of three-dimensional point cloud data respectively correspond comprises:
Acquiring registration scene information;
determining a target algorithm matched with the registration scene information from an algorithm library;
And fitting by adopting the target algorithm based on the two groups of three-dimensional point cloud data to obtain plane groups respectively corresponding to the two groups of three-dimensional point cloud data.
3. The method of claim 1, wherein the determining a target transformation matrix based on the initial transformation matrix comprises:
determining Euler angles and translation parameters through the initial transformation matrix;
and obtaining the target transformation matrix based on the Euler angle and the translation parameter.
4. A method according to claim 3, wherein said determining euler angles and translation parameters from said initial transformation matrix comprises:
the Euler angles are calculated according to the following formula:
wherein α, β, γ are the Euler angles, T_0 is the initial transformation matrix, T_{0,1,1} denotes row 1, column 1 of T_0, T_{0,1,2} denotes row 1, column 2 of T_0, T_{0,1,3} denotes row 1, column 3 of T_0, T_{0,2,3} denotes row 2, column 3 of T_0, and T_{0,3,3} denotes row 3, column 3 of T_0;
the translation parameters are calculated according to the following formula:
wherein t_x, t_y, t_z are the translation parameters, T_{0,4,1} denotes row 4, column 1 of T_0, T_{0,4,2} denotes row 4, column 2 of T_0, and T_{0,4,3} denotes row 4, column 3 of T_0.
5. The method according to claim 3 or 4, wherein said deriving a target transformation matrix from said euler angles and said translation parameters comprises:
The target transformation matrix is calculated according to the following formula:
wherein T denotes the target transformation matrix, α, β, γ are the Euler angles, and t_x, t_y, t_z are the translation parameters.
6. A three-dimensional point cloud data processing apparatus, comprising:
the acquisition unit is used for acquiring two groups of three-dimensional point cloud data;
The first determining unit is used for determining plane groups corresponding to the two sets of three-dimensional point cloud data respectively based on the two sets of three-dimensional point cloud data, wherein the plane groups are combinations of a plurality of planes constructed for the same object;
The construction unit is used for constructing characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data based on the plane groups respectively corresponding to the two groups of three-dimensional point cloud data;
the second determining unit is used for determining an initial transformation matrix based on the characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data;
a third determining unit configured to determine a target transformation matrix based on the initial transformation matrix, the target transformation matrix being a transformation matrix for registration;
the construction unit includes:
The first determining subunit is configured to determine a unit normal vector for each plane in the plane group corresponding to the two sets of three-dimensional point cloud data, where the directions of the normal vectors of the planes in the plane group corresponding to the two sets of three-dimensional point cloud data are in one-to-one correspondence;
the construction subunit is used for constructing characteristic point lattices corresponding to the two sets of three-dimensional point cloud data according to the intersection points of the three planes of the plane sets corresponding to the two sets of three-dimensional point cloud data respectively, so that the characteristic points constructed by the two sets of three-dimensional point cloud data are in one-to-one correspondence;
The second determination unit includes:
The transformation subunit is used for adding one-dimensional unit vectors to the first characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data to obtain second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data;
And the second determining subunit is used for obtaining an initial transformation matrix according to the second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data and the relationship between the second characteristic point lattices respectively corresponding to the two groups of three-dimensional point cloud data.
7. The apparatus of claim 6, wherein the first determining unit comprises:
an acquisition subunit, configured to acquire registration scene information;
The matching subunit is used for determining a target algorithm matched with the registration scene information from an algorithm library;
and the fitting subunit is used for fitting and obtaining plane groups respectively corresponding to the two groups of three-dimensional point cloud data by adopting the target algorithm based on the two groups of three-dimensional point cloud data.
8. The apparatus of claim 6, wherein the third determining unit comprises:
A third determining subunit, configured to determine an euler angle through the initial transformation matrix;
A fourth determining subunit, configured to determine a translation parameter through the initial transformation matrix;
And a fifth determining subunit, configured to obtain the target transformation matrix based on the euler angle and the translation parameter.
9. The apparatus of claim 8, wherein,
The third determining subunit is configured to:
the Euler angles are calculated according to the following formula:
wherein α, β, γ are the Euler angles, T_0 is the initial transformation matrix, T_{0,1,1} denotes row 1, column 1 of T_0, T_{0,1,2} denotes row 1, column 2 of T_0, T_{0,1,3} denotes row 1, column 3 of T_0, T_{0,2,3} denotes row 2, column 3 of T_0, and T_{0,3,3} denotes row 3, column 3 of T_0;
The fourth determination subunit is configured to calculate the translation parameter according to the following formula:
wherein t_x, t_y, t_z are the translation parameters, T_{0,4,1} denotes row 4, column 1 of T_0, T_{0,4,2} denotes row 4, column 2 of T_0, and T_{0,4,3} denotes row 4, column 3 of T_0.
10. The apparatus of claim 8 or 9, wherein the fifth determination subunit is configured to:
The target transformation matrix is calculated according to the following formula:
wherein T denotes the target transformation matrix, α, β, γ are the Euler angles, and t_x, t_y, t_z are the translation parameters.
11. An electronic device, comprising:
at least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
12. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-5.
13. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any of claims 1-5.
CN202210629210.1A 2022-05-30 2022-05-30 Three-dimensional point cloud processing method, device, equipment and storage medium Active CN114926549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210629210.1A CN114926549B (en) 2022-05-30 2022-05-30 Three-dimensional point cloud processing method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN114926549A CN114926549A (en) 2022-08-19
CN114926549B true CN114926549B (en) 2024-05-14

Family

ID=82812497

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210629210.1A Active CN114926549B (en) 2022-05-30 2022-05-30 Three-dimensional point cloud processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114926549B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117745778B (en) * 2024-02-01 2024-05-28 法奥意威(苏州)机器人系统有限公司 Point cloud registration realization method and device, storage medium and electronic equipment
CN117788529A (en) * 2024-02-26 2024-03-29 苏州艾吉威机器人有限公司 Three-dimensional plane point cloud coarse registration method, system, medium and equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
CN109493375A (en) * 2018-10-24 2019-03-19 深圳市易尚展示股份有限公司 The Data Matching and merging method of three-dimensional point cloud, device, readable medium
CN109559340A (en) * 2018-11-29 2019-04-02 东北大学 A kind of parallel three dimensional point cloud automation method for registering
CN110688440A (en) * 2019-09-29 2020-01-14 中山大学 Map fusion method suitable for less sub-map overlapping parts
CN110738730A (en) * 2019-10-15 2020-01-31 业成科技(成都)有限公司 Point cloud matching method and device, computer equipment and storage medium
CN114387319A (en) * 2022-01-13 2022-04-22 北京百度网讯科技有限公司 Point cloud registration method, device, equipment and storage medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN109697749A (en) * 2017-10-20 2019-04-30 虹软科技股份有限公司 A kind of method and apparatus for three-dimensional modeling


Non-Patent Citations (2)

Title
A point cloud registration method based on dual quaternion description with point-linear feature constraints; Raobo Li et al.; International Journal of Remote Sensing; 2517-2537 *
Improved point cloud registration algorithm based on FPFH and normal vectors; Han Yifei et al.; Semiconductor Optoelectronics (半导体光电); vol. 42, no. 04; 579-584 *


CN116559927B (en) Course angle determining method, device, equipment and medium of laser radar
CN115797585B (en) Parking lot map generation method and device
CN117075171B (en) Pose information determining method, device and equipment of laser radar and storage medium
CN117629186A (en) Map construction method, model training method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant