CN115407304A - Point cloud data processing method and device - Google Patents

Point cloud data processing method and device

Info

Publication number
CN115407304A
CN115407304A
Authority
CN
China
Prior art keywords
point cloud
speed
cloud data
data
radar
Prior art date
Legal status
Pending
Application number
CN202211051091.2A
Other languages
Chinese (zh)
Inventor
张勇
王宇
郭昌野
黄佳伟
张林灿
李创辉
Current Assignee
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date
Filing date
Publication date
Application filed by FAW Group Corp filed Critical FAW Group Corp
Priority to CN202211051091.2A
Publication of CN115407304A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415 Identification of targets based on measurements of movement associated with the target
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a method and a device for processing point cloud data. The method comprises the following steps: acquiring first point cloud data, second point cloud data and the running speed of a target vehicle while the target vehicle is running, wherein the first point cloud data is obtained by sensing a first target area through a millimeter wave radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a laser radar installed on the target vehicle, and the first target area partially overlaps the second target area; determining a second point cloud speed corresponding to the second point cloud data based on a first point cloud speed corresponding to the first point cloud data, wherein the first point cloud data comprises the first point cloud speed; and transforming the second point cloud data based on the second point cloud speed and the running speed to obtain target point cloud data. The invention solves the technical problem in the related art that the shape of a detected object is described with low accuracy when the detected object is moving.

Description

Point cloud data processing method and device
Technical Field
The invention relates to the field of intelligent automobiles, in particular to a point cloud data processing method and a point cloud data processing device.
Background
At present, a laser radar is deployed on an automatic driving vehicle, and the three-dimensional perception of the surrounding environment of the vehicle can be realized through the laser radar, so that support is provided for high-precision map making and obstacle detection. However, in the process of vehicle movement, different lidar points in each frame of point cloud acquired by the lidar are not acquired at the same time, so that coordinate systems of different laser points in the same frame of lidar point cloud are inconsistent, and the form of a detected object cannot be accurately described.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a processing method and a processing device of point cloud data, which at least solve the technical problem that the point cloud data acquired by a laser radar cannot accurately describe the form of a detected object in the related technology.
According to an aspect of an embodiment of the present invention, a method for processing point cloud data is provided, including: acquiring first point cloud data, second point cloud data and the running speed of a target vehicle in the running process of the target vehicle, wherein the first point cloud data is obtained by sensing a first target area through a millimeter wave radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a laser radar installed on the target vehicle, and the first target area is partially overlapped with the second target area; determining a second point cloud speed corresponding to the second point cloud data based on a first point cloud speed corresponding to the first point cloud data, wherein the first point cloud data comprises the first point cloud speed; and transforming the second point cloud data based on the second point cloud speed and the running speed to obtain target point cloud data.
Optionally, transforming the second point cloud data based on the second point cloud speed and the operating speed to obtain target point cloud data, including: generating a transformation matrix based on the second point cloud speed and the running speed; and acquiring the product of the transformation matrix and the second point cloud data to obtain target point cloud data.
Optionally, the second point cloud data includes a point cloud time, and generating the transformation matrix based on the second point cloud speed and the running speed includes: obtaining a difference value between the second point cloud speed and the running speed to obtain a speed difference; obtaining a difference value between the acquired running time of the second radar and the point cloud time to obtain a time difference; obtaining the product of the speed difference and the time difference to obtain the relative motion position of the target vehicle and the second point cloud data; and generating the transformation matrix based on the relative motion position.
Optionally, determining a second point cloud speed corresponding to the second point cloud data based on the first point cloud speed corresponding to the first point cloud data includes: dividing a preset plane to obtain a plurality of grids, wherein the preset plane is a plane in a preset coordinate system corresponding to the first point cloud data and the second point cloud data; mapping the first point cloud data to the plurality of grids to obtain a first point cloud corresponding to each grid; mapping the second point cloud data to the plurality of grids to obtain a second point cloud corresponding to each grid; and determining a second point cloud speed of the second point cloud based on a first point cloud speed of the first point cloud.
Optionally, determining the second point cloud speed of the second point cloud based on the first point cloud speed of the first point cloud includes: determining a ground point cloud in the second point cloud for representing the ground; determining second point cloud speeds of other point clouds in the second point cloud except the ground point cloud based on the first point cloud speed; and determining the second point cloud speed of the ground point cloud as a preset speed.
Optionally, determining the second point cloud speed of the other point clouds in the second point cloud except the ground point cloud based on the first point cloud speed includes: determining a point cloud centroid corresponding to the second point cloud based on the second point cloud; acquiring the distance between the first point cloud and the point cloud centroid; determining a target point cloud in the first point cloud based on the distance, wherein the distance between the target point cloud and the point cloud centroid is smaller than a preset distance; obtaining the average value of the first point cloud speeds of the target point cloud to obtain an average speed; and determining the average speed as the second point cloud speed of the other point clouds.
Optionally, obtaining the first point cloud data and the second point cloud data includes: acquiring first perception data perceived by a first radar, second perception data perceived by a second radar, and calibration parameters of the first radar and the second radar; and respectively transforming the first perception data and the second perception data to a preset coordinate system based on the calibration parameters to obtain the first point cloud data and the second point cloud data, wherein the preset coordinate system is constructed by taking a projection point from a preset position on the target vehicle to the ground as a center.
Optionally, obtaining calibration parameters of the first radar and the second radar includes: acquiring a first installation position of a first radar on a target vehicle and a second installation position of a second radar on the target vehicle; and calibrating the first radar and the second radar based on the first mounting position and the second mounting position to obtain calibration parameters.
According to another aspect of the embodiments of the present invention, there is provided a processing apparatus of point cloud data, including: the acquisition module is used for acquiring first point cloud data, second point cloud data and the running speed of the target vehicle in the running process of the target vehicle, wherein the first point cloud data is obtained by sensing a first target area through a first radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a second radar installed on the target vehicle, and the first target area is partially overlapped with the second target area; the determining module is used for determining a second point cloud speed corresponding to the second point cloud data based on a first point cloud speed corresponding to the first point cloud data, and the first point cloud data comprises the first point cloud speed; and the transformation module is used for transforming the second point cloud data based on the second point cloud speed and the running speed to obtain target point cloud data.
According to another aspect of the embodiments of the present invention, there is also provided a target vehicle including: one or more processors; storage means for storing one or more programs; when executed by one or more processors, the one or more programs cause the one or more processors to perform the point cloud data processing method of any one of the above embodiments.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium, which includes a stored program, wherein when the program runs, an apparatus where the computer-readable storage medium is located is controlled to execute the point cloud data processing method in any one of the above embodiments.
According to another aspect of the embodiments of the present invention, there is also provided a processor, configured to execute a program, where the program executes the point cloud data processing method in any one of the above embodiments when running.
In the embodiment of the invention, first point cloud data, second point cloud data and the running speed of a target vehicle are acquired while the target vehicle is running, wherein the first point cloud data is obtained by sensing a first target area through a first radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a second radar installed on the target vehicle, and the first target area partially overlaps the second target area; a second point cloud speed corresponding to the second point cloud data is then determined based on a first point cloud speed contained in the first point cloud data; and the second point cloud data is finally transformed based on the second point cloud speed and the running speed to obtain target point cloud data. It is easy to note that acquiring the first point cloud data, the second point cloud data and the running speed of the target vehicle makes it possible to bring the laser points of one frame into a consistent coordinate system even though they are not acquired at the same moment. Determining the second point cloud speed from the first point cloud speed provides the speed of the laser points reflected by the detected object; that is, the laser radar point cloud speed is calculated and expressed using the millimeter wave radar point cloud speed. Transforming the second point cloud data based on the second point cloud speed and the running speed then completes the motion compensation, so that the shape of the detected object can be described from a fully motion-compensated laser radar point cloud, which solves the technical problem in the related art that point cloud data acquired by a laser radar cannot accurately describe the shape of the detected object.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention and do not constitute a limitation of the invention. In the drawings:
fig. 1 is a flowchart of a method of processing point cloud data according to the present embodiment;
FIG. 2 is a schematic diagram of an alternative vehicle radar top view in accordance with an embodiment of the present invention;
FIG. 3 is a schematic illustration of an alternative radar detection coordinate system according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an alternative grid coordinate system in accordance with embodiments of the present invention;
FIG. 5 is a schematic illustration of an alternative front view of a vehicle radar in accordance with an embodiment of the present invention;
FIG. 6 is a flow chart of an alternative overall processing method according to an embodiment of the invention;
fig. 7 is a schematic diagram of a device for processing point cloud data according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In an automatic driving system, a laser radar is used to obtain point cloud data. Because the laser radar keeps acquiring data while the vehicle moves during a scan, the points of one frame are not all collected in the same coordinate system, which distorts the laser radar point cloud. The specific reasons include: the laser radar point cloud data acquired by the TOF (Time Of Flight) method are not acquired at the same time, laser radar detection is accompanied by movement of the carrier or of the detected object, and when the frame rate of the laser radar is low, the motion distortion caused by the carrier or the detected object cannot be ignored.
Because the laser radar calculates laser point coordinates with reference to its own coordinate system at the moment each laser beam is received, the reference coordinate systems of the individual points are not consistent while the carrier is moving, yet the points are finally output together as one frame per period. In an automatic driving system, the coordinate systems of the different laser points within the same frame therefore need to be unified (that is, motion compensation must be completed). In the prior art, the point cloud speed cannot be obtained directly from the laser radar point cloud attributes; instead, the motion speed and time of the laser radar carrier are obtained from the vehicle-mounted sensors and used as the laser point cloud speed, the motion displacement is calculated from the speed and time of the different laser points, and motion compensation is completed through coordinate transformation. However, when the detected object itself is moving, such motion compensation fails and the shape of the detected object cannot be accurately described.
In the present application, the laser radar point cloud is extended with a relative speed attribute by making use of the point cloud speed measured by the millimeter wave radar; a transformation matrix is then calculated from the relative speed and time of the laser radar point cloud and the laser radar itself, all points are transformed, and the motion-compensated point cloud is obtained. The method and the device can be applied in the following scenarios:
(1) The laser radar carrier is static, and the detected object moves, so that the motion compensation can be realized;
(2) The laser radar carrier moves, and the detected objects move oppositely, so that motion compensation can be realized;
(3) The laser radar carrier moves, and the detected object moves in the same direction, so that motion compensation can be realized.
Example 1
In accordance with an embodiment of the present invention, a method for processing point cloud data is provided. It should be noted that the steps illustrated in the flowchart of the figures may be performed in a computer system, such as one executing a set of computer executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different from the order here.
Fig. 1 is a flowchart of a method for processing point cloud data according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, in the running process of a target vehicle, acquiring first point cloud data, second point cloud data and the running speed of the target vehicle, wherein the first point cloud data is obtained by sensing a first target area through a millimeter wave radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a laser radar installed on the target vehicle, and the first target area is partially overlapped with the second target area.
The target vehicle described above may be an autonomous vehicle in which the user is riding; the first point cloud data may be point cloud data obtained by a millimeter wave radar installed on the autonomous vehicle, including but not limited to: coordinates (x, y, z) of a plurality of point clouds and a velocity of each point cloud, corresponding time; the running speed of the target vehicle may be the running speed of the autonomous vehicle detected by the in-vehicle sensor; the second point cloud data may be point cloud data obtained by a lidar mounted on the autonomous vehicle, including but not limited to: coordinates (x, y, z) of the plurality of point clouds and a time corresponding to each point cloud; the first target region may be a region to be detected by the millimeter wave radar, and the second target region may be a region to be detected by the laser radar, as shown in fig. 2, a region 1 is the first target region, and a region 2 is the second target region.
It should be noted that the first point cloud data and the second point cloud data include not only the point cloud of the detected object but also the ground point cloud.
The millimeter wave radar can be a radar which works in a millimeter wave band for detection, namely in a frequency domain of 30-300 GHz; the laser radar may be a radar that detects a characteristic quantity such as a position, a speed, and the like of an object by emitting a laser beam.
In an optional embodiment, an automatic driving controller, a laser radar and a millimeter wave radar are deployed on the automatic driving vehicle. While the automatic driving vehicle is running, first point cloud data can be collected through the millimeter wave radar and second point cloud data can be collected through the laser radar, wherein the attributes contained in the laser radar point cloud at least comprise the coordinates (x, y, z) of a point and the corresponding time, and the attributes contained in the millimeter wave radar point cloud at least comprise the coordinates (x, y, z) of a point, the speed of the point and the corresponding time. In addition, during running, the vehicle speed and the angular speed of the vehicle can be collected with a period of 10 ms and stored with the corresponding time stamps.
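For illustration only (not part of the patent), the three inputs described above could be represented roughly as follows in Python; the field names and types are assumptions.
```python
from dataclasses import dataclass
import numpy as np

@dataclass
class MmwPointCloud:          # first point cloud data (millimeter wave radar)
    xyz: np.ndarray           # (N, 3) point coordinates
    speed: np.ndarray         # (N,)  speed reported per point
    stamp: np.ndarray         # (N,)  acquisition time of each point [s]

@dataclass
class LidarPointCloud:        # second point cloud data (laser radar)
    xyz: np.ndarray           # (M, 3) point coordinates
    stamp: np.ndarray         # (M,)  acquisition time of each point [s]

@dataclass
class EgoSample:              # vehicle speed / angular speed logged every 10 ms
    speed: float              # running speed of the target vehicle [m/s]
    yaw_rate: float           # angular speed [rad/s]
    stamp: float              # timestamp [s]
```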
It should be noted that, before the laser radar and the millimeter wave radar acquire data, global time synchronization is first performed for the automatic driving controller, the laser radar and the millimeter wave radar, so that the automatic driving controller can fuse laser radar point clouds and millimeter wave radar point clouds collected at the same time; the laser radar and the millimeter wave radar are then calibrated so that the coordinate systems of the laser radar point cloud and the millimeter wave radar point cloud are unified, where the projection of the vehicle rear axle center onto the ground may be taken as the origin of the coordinate system.
And step S104, determining a second point cloud speed corresponding to the second point cloud data based on a first point cloud speed corresponding to the first point cloud data, wherein the first point cloud data comprises the first point cloud speed.
The first point cloud speed may be a millimeter wave radar point cloud speed included in the first point cloud data detected by the millimeter wave radar, and the second point cloud speed may be a laser radar point cloud speed calculated from the first point cloud speed.
In an alternative embodiment, after the first point cloud data and the second point cloud data are obtained, it may be determined whether the first point cloud and the second point cloud are similar by matching the first point cloud in the first point cloud data with the second point cloud in the second point cloud data, for example, by calculating an euclidean distance between the first point cloud and the second point cloud, and if the first point cloud and the second point cloud are similar, the first point cloud speed of the first point cloud may be directly used as the second point cloud speed of the second point cloud.
Because the laser radar and the millimeter wave radar can not only sense other obstacles around the automatic driving vehicle, but also sense the ground, ground points can be removed from the first point cloud data and the second point cloud data, other points are matched, and the speed of the second point cloud corresponding to the ground points is determined to be 0.
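A minimal sketch of this direct matching step, assuming a simple nearest-neighbour search with a hypothetical distance threshold (the patent does not fix either choice):
```python
import numpy as np

def transfer_speed_by_nearest(mmw_xyz, mmw_speed, lidar_xyz,
                              ground_mask, max_dist=0.5):
    """Assign each non-ground lidar point the speed of its nearest
    millimeter-wave point, if that point lies within max_dist (metres).
    Ground points keep the preset speed of 0."""
    lidar_speed = np.zeros(len(lidar_xyz))
    for i, p in enumerate(lidar_xyz):
        if ground_mask[i]:
            continue                        # ground point -> speed stays 0
        d = np.linalg.norm(mmw_xyz - p, axis=1)
        j = np.argmin(d)
        if d[j] < max_dist:
            lidar_speed[i] = mmw_speed[j]   # reuse the radar point's speed
    return lidar_speed
```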
In addition, because the first point cloud data and the second point cloud data contain a large number of points, the space can be divided into grid cells of a given size in order to reduce the amount of calculation and the processing load of the automatic driving controller, and the first point cloud data and the second point cloud data are placed in the corresponding grids. Ground points are then removed from the point clouds and the centroid of the remaining points is calculated; for each grid, the Euclidean distance (or Mahalanobis distance) between each millimeter wave radar point and the laser radar point cloud centroid is calculated, millimeter wave radar points that do not meet the threshold are removed so that more accurate first point cloud data are obtained, and finally the average speed of the remaining millimeter wave radar points is calculated and used as the laser radar point cloud speed (namely, the second point cloud speed) of the non-ground points in that grid.
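The grid-based variant of this speed assignment might look roughly like the sketch below; the flattened grid ids, the use of plain Euclidean distance and the threshold value are assumptions for illustration only.
```python
import numpy as np

def grid_speed(lidar_xyz, lidar_ground_mask, mmw_xyz, mmw_speed,
               grid_ids_lidar, grid_ids_mmw, max_dist=1.0):
    """Per grid cell: drop ground lidar points, take the centroid of the
    rest, keep only millimeter-wave points close to that centroid, and use
    their mean speed as the lidar point-cloud speed of the cell."""
    speeds = {}
    for cell in np.unique(grid_ids_lidar):
        pts = lidar_xyz[(grid_ids_lidar == cell) & ~lidar_ground_mask]
        if len(pts) == 0:
            speeds[cell] = 0.0              # only ground points in this cell
            continue
        centroid = pts.mean(axis=0)
        in_cell = grid_ids_mmw == cell
        d = np.linalg.norm(mmw_xyz[in_cell] - centroid, axis=1)
        keep = mmw_speed[in_cell][d < max_dist]
        speeds[cell] = keep.mean() if len(keep) else 0.0
    return speeds
```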
And S106, converting the second point cloud data based on the second point cloud speed and the running speed to obtain target point cloud data.
The target point cloud data may be a laser radar point cloud finally obtained, that is, a point cloud after motion compensation.
In an optional embodiment, the relative motion position of the detected object and the autonomous vehicle may be determined based on the second point cloud speed and the operating speed according to a time relationship, and then the second point cloud data may be subjected to motion compensation based on the relative motion position to obtain the target point cloud data.
In another optional embodiment, after the first point cloud data and the second point cloud data are placed in the corresponding grids, the vehicle speed corresponding to each grid can be read according to the time relationship, the relative speed between the grid point cloud and the vehicle is calculated to obtain an initialized transformation matrix, and the target point cloud data is obtained by transformation through the calculation formula. In this way, the relative speed of the laser radar point cloud of the detected object is obtained by combining the reflected point cloud speed of the detected object with the motion speed of the laser radar carrier itself, motion compensation is completed by the transformation, and the shape of the detected object can be described more accurately.
Through the above steps, first point cloud data, second point cloud data and the running speed of the target vehicle are acquired while the target vehicle is running, wherein the first point cloud data is obtained by sensing a first target area through a millimeter wave radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a laser radar installed on the target vehicle, and the first target area partially overlaps the second target area; a second point cloud speed corresponding to the second point cloud data is determined based on a first point cloud speed contained in the first point cloud data; and the second point cloud data is transformed based on the second point cloud speed and the running speed to obtain target point cloud data. It is easy to note that determining the second point cloud speed from the first point cloud speed extends the laser radar point cloud with a speed attribute, and transforming the second point cloud data based on the second point cloud speed and the running speed of the target vehicle performs motion compensation on the laser radar point cloud. Because the motion of both the laser radar and the detected object is fully taken into account, the accuracy with which the shape of the detected object is described is improved, which solves the technical problem in the related art that point cloud data acquired by a laser radar cannot accurately describe the shape of the detected object.
In the above embodiment of the present invention, transforming the second point cloud data based on the second point cloud speed and the operating speed to obtain the target point cloud data includes: generating a transformation matrix based on the second point cloud speed and the running speed; and acquiring a product of the transformation matrix and the second point cloud data to obtain target point cloud data.
The transformation matrix may be a quaternion Q determined according to the relative motion parameters, and after the grids are divided, each grid corresponds to one transformation matrix.
In an optional embodiment, the relative motion parameters of the detected object and the laser radar can be determined based on the second point cloud speed and the running speed, so as to initialize the transformation matrix, and the position of the laser radar point cloud is then transformed according to the transformation matrix. The specific calculation formula is P' = Q × P, where P' is a point cloud coordinate in the target point cloud data and P is a point cloud coordinate in the second point cloud data, yielding the target point cloud data, that is, the motion-compensated point cloud.
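Applying the transformation, i.e. the P' = Q × P step, could be sketched as follows if the transformation is represented as a 4×4 homogeneous matrix; this representation is an assumption for illustration, since the patent describes Q as a quaternion.
```python
import numpy as np

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transform T to an (N, 3) array of lidar
    points, i.e. the P' = Q * P step of the motion compensation."""
    homo = np.hstack([points, np.ones((len(points), 1))])   # (N, 4)
    return (homo @ T.T)[:, :3]                               # back to (N, 3)
```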
Further, the second point cloud data comprises a point cloud time, and generating the transformation matrix based on the second point cloud speed and the running speed comprises: obtaining a difference value between the second point cloud speed and the running speed to obtain a speed difference; obtaining a difference value between the acquired running time of the second radar and the point cloud time to obtain a time difference; obtaining the product of the speed difference and the time difference to obtain the relative motion position of the target vehicle and the second point cloud data; and generating the transformation matrix based on the relative motion position.
In an alternative embodiment, the speed difference and the time difference between the second point cloud speed and the running speed may be calculated according to the time relationship. Within one point cloud acquisition period, the motion of the autonomous vehicle and of the detected object can be treated as uniform motion, so the relative motion position of the detected object and the autonomous vehicle can be calculated by the formula s = vt, and a transformation matrix, generally a quaternion Q, is then initialized according to the relative motion parameters.
In another optional embodiment, after the first point cloud data and the second point cloud data are put into the corresponding grids, the vehicle speed, the angular speed and the time corresponding to each grid can be read according to the time relationship, and the relative speed and the time difference between the grid point cloud and the vehicle are calculated. Within one point cloud acquisition period (the period of current conventional laser radars is 100 ms), the motion is treated as uniform motion, the relative motion position can be calculated through s = vt (decomposed along the x and y axes), and a transformation matrix, generally a quaternion Q, is initialized according to the relative motion parameters, so that the transformation matrix Q of each grid is obtained.
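A simplified sketch of how such a per-grid transform could be built from the relative speed and the time difference, treating the relative motion as a pure planar translation (s = v·t); the heading angle used for the x/y decomposition is an assumption, and rotation from the angular speed is omitted for brevity.
```python
import numpy as np

def grid_transform(rel_speed, heading, dt):
    """Build a per-grid transform from a relative speed [m/s], a heading
    angle [rad] used to split the displacement into x/y components, and the
    time difference dt [s] between the ego-speed sample and the point cloud
    time. Pure translation only (s = v * t); rotation is ignored here."""
    s = rel_speed * dt
    dx, dy = s * np.cos(heading), s * np.sin(heading)
    T = np.eye(4)
    T[0, 3], T[1, 3] = dx, dy
    return T
```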
Optionally, determining a second point cloud speed corresponding to the second point cloud data based on the first point cloud speed corresponding to the first point cloud data includes: dividing a preset plane to obtain a plurality of grids, wherein the preset plane is a plane in a preset coordinate system corresponding to the first point cloud data and the second point cloud data; mapping the first point cloud data to the plurality of grids to obtain a first point cloud corresponding to each grid; mapping the second point cloud data to the plurality of grids to obtain a second point cloud corresponding to each grid; and determining the second point cloud speed of the second point cloud based on the first point cloud speed of the first point cloud.
The preset coordinate system may be a coordinate system established by taking a projection point from a central point of a rear axle of the vehicle to the ground as an origin of the coordinate system, an x-axis as a driving direction of the vehicle, a y-axis as a central point of a wheel, and a z-axis as a projection point from the radar to the ground, as shown in fig. 3; the preset plane can be an xy plane in a preset coordinate system; the grid can be a data form which divides the space into regular grids, each grid is called a unit, and corresponding attribute values are given to the units to represent the entity; the first point cloud may be a point cloud detected by a millimeter wave radar; the second point cloud may be a point cloud detected by a lidar.
Specifically, in the embodiment of the present invention, a schematic diagram of the radar detection coordinate system is shown in fig. 3: the coordinate system is established by using point O, the projection of the vehicle rear axle center point onto the ground, as the origin, with the x-axis pointing in the driving direction of the vehicle, the y-axis toward the wheel center point and the z-axis along the projection from the radar to the ground. The relevant point cloud data is obtained in this coordinate system, and the grid is divided in the xy plane according to the detection range, with no division in the z direction. A schematic diagram of the grid coordinate system in this embodiment is shown in fig. 4: the laser radar point cloud and the millimeter wave radar point cloud are converted into the radar detection coordinate system, point O is the origin of the grid coordinate system, point M is the center point of the vehicle rear axle, and each point is placed in the corresponding grid according to the grid cell size, where the calculation formula is: grid_x = (x - x_min) / Δx; grid_y = (y - y_min) / Δy.
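The grid index formula above translates directly into code; a minimal sketch (Δx and Δy being the cell sizes, and the floor-to-integer step an assumed detail):
```python
import numpy as np

def grid_index(xyz, x_min, y_min, dx, dy):
    """Map points to grid cells in the xy plane only (no split along z):
    grid_x = (x - x_min) / dx, grid_y = (y - y_min) / dy."""
    gx = np.floor((xyz[:, 0] - x_min) / dx).astype(int)
    gy = np.floor((xyz[:, 1] - y_min) / dy).astype(int)
    return gx, gy
```
The (gx, gy) pair of each point can then be flattened into a single cell id when points are grouped per grid.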
Further, determining the second point cloud speed of the second point cloud based on the first point cloud speed of the first point cloud includes: determining a ground point cloud in the second point cloud for representing the ground; determining second point cloud speeds of other point clouds in the second point cloud except the ground point cloud based on the first point cloud speed; and determining the second point cloud speed of the ground point cloud as a preset speed.
The ground point cloud may be the portion of the point cloud returned by the laser radar that represents the ground; the preset speed may be the speed assigned to the ground point cloud, typically 0.
In an alternative embodiment, the ground points present in the grid may be determined, then the ground points in the laser point cloud are removed, the remaining point cloud centroid is calculated, the second point cloud velocity of the other point clouds except the ground points in the second point cloud is determined, and the second point cloud velocity of the ground points may be directly set to 0.
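The patent does not prescribe how the ground points are identified; a crude height-threshold stand-in (using the fact that the coordinate origin lies on the ground, see fig. 3) could look like this, with the threshold value being an assumption:
```python
import numpy as np

def ground_mask(xyz, z_thresh=0.2):
    """Crude ground segmentation: with the coordinate origin projected onto
    the ground, treat points whose height is below a small threshold as
    ground. A plane fit (e.g. RANSAC) could be used instead."""
    return xyz[:, 2] < z_thresh
```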
Further, determining the second point cloud speed of the other point clouds in the second point cloud except the ground point cloud based on the first point cloud speed includes: determining a point cloud centroid corresponding to the second point cloud based on the second point cloud; acquiring the distance between the first point cloud and the point cloud centroid; determining a target point cloud in the first point cloud based on the distance, wherein the distance between the target point cloud and the point cloud centroid is smaller than a preset distance; obtaining the average value of the first point cloud speeds of the target point cloud to obtain an average speed; and determining the average speed as the second point cloud speed of the other point clouds.
The point cloud centroid may be obtained by averaging the coordinates of all points in the cloud; the preset distance may be a distance threshold below which the first point cloud and the second point cloud are considered similar, and it can be set manually by the user.
In an alternative embodiment, the ground points of the laser radar point cloud present in a grid may be determined and removed, and the centroid of the remaining points is calculated. The Euclidean distance (or Mahalanobis distance) between each millimeter wave radar point in the grid and the laser radar point cloud centroid is then calculated, millimeter wave radar points that do not meet the preset distance are removed, and the average speed of the remaining millimeter wave radar points is calculated. This average speed can be used as the laser radar point cloud speed of the non-ground points in the grid, while the speed of the ground points in the laser radar point cloud is set to 0.
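For the Mahalanobis-distance variant mentioned above, one possible sketch is given below; estimating the covariance from the non-ground lidar points of the cell and the threshold value are both assumptions.
```python
import numpy as np

def mahalanobis_keep(mmw_xyz, lidar_pts, max_dist=3.0):
    """Keep millimeter-wave points whose Mahalanobis distance to the
    centroid of the (non-ground) lidar points in the same grid cell is
    below max_dist. Assumes at least two lidar points in the cell."""
    centroid = lidar_pts.mean(axis=0)
    cov = np.cov(lidar_pts.T) + 1e-6 * np.eye(3)   # regularise the covariance
    inv = np.linalg.inv(cov)
    diff = mmw_xyz - centroid
    d2 = np.einsum('ij,jk,ik->i', diff, inv, diff)  # per-point quadratic form
    return np.sqrt(d2) < max_dist
```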
Optionally, obtaining the first point cloud data and the second point cloud data includes: acquiring first perception data perceived by a first radar, second perception data perceived by a second radar, and calibration parameters of the first radar and the second radar; and respectively transforming the first perception data and the second perception data to a preset coordinate system based on the calibration parameters to obtain first point cloud data and second point cloud data, wherein the preset coordinate system is constructed by taking a projection point from a preset position on the target vehicle to the ground as a center.
The first perception data may be perception data obtained by a millimeter wave radar (i.e., a first radar) through detection; the second perception data may be perception data acquired by the laser radar (i.e., the second radar) through detection; the calibration parameters can be calibration parameters obtained by calibrating the laser radar and the millimeter wave radar; the projection point of the ground can be the center point of the millimeter wave radar and the laser radar projected to the ground.
In an optional embodiment, while the automatic driving vehicle is running, the millimeter wave radar and the laser radar detect their surroundings to obtain the first perception data and the second perception data respectively. At this point the coordinate systems of the first perception data and the second perception data are different, so the first perception data and the second perception data can be converted into the preset coordinate system according to the calibration parameters obtained by calibrating the laser radar and the millimeter wave radar, yielding the first point cloud data and the second point cloud data and thereby unifying the coordinate systems.
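Assuming the calibration parameters are given as a rotation matrix R and a translation t per sensor (one possible, not patent-specified, representation), the conversion into the preset coordinate system reduces to a single rigid transform:
```python
import numpy as np

def to_vehicle_frame(points, R, t):
    """Map raw sensor points into the preset vehicle coordinate system
    (origin: rear-axle centre projected onto the ground) using the
    calibration rotation R (3x3) and translation t (3,)."""
    return points @ R.T + t

# first_point_cloud  = to_vehicle_frame(mmw_raw,   R_mmw,   t_mmw)    # hypothetical inputs
# second_point_cloud = to_vehicle_frame(lidar_raw, R_lidar, t_lidar)
```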
Further, obtaining calibration parameters of the first radar and the second radar includes: acquiring a first installation position of a first radar on a target vehicle and a second installation position of a second radar on the target vehicle; and calibrating the first radar and the second radar based on the first mounting position and the second mounting position to obtain calibration parameters.
As described above, the first mounting position may be a mounting position of a millimeter wave radar disposed on the vehicle; the second mounting location may be a mounting location on the vehicle where the lidar is mounted.
In an alternative embodiment, a schematic diagram of a front view of deployed radars on an autonomous vehicle is shown in fig. 5, and two radars may be calibrated based on their installation positions to obtain calibration parameters, where the radar 3 is a laser radar and the radar 4 is a millimeter wave radar.
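If the mounting pose of each radar were known exactly, the calibration parameters could, in the simplest case, be assembled from the mounting angles and position as sketched below; real calibration, as explained next, also has to account for mounting errors, which this sketch ignores.
```python
import numpy as np

def extrinsic(yaw, pitch, roll, t):
    """Build a 4x4 sensor-to-vehicle extrinsic from mounting angles [rad]
    and a translation t = (x, y, z) of the sensor in the vehicle frame."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined mounting rotation
    T[:3, 3] = t               # mounting position in the vehicle frame
    return T
```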
It should be noted that radar calibration determines the transformation between the radar's own coordinate system and the required coordinate system (such as the vehicle body coordinate system), so that the detection result of the laser radar is obtained in the vehicle coordinate system, which facilitates subsequent perception calculations.
In addition, on the automatic driving vehicle, the radar is rigidly connected with the vehicle body, the relative attitude and displacement between the radar and the vehicle body are fixed, and the data points obtained by scanning the laser radar have unique position coordinates in an environment coordinate system corresponding to the vehicle.
A preferred embodiment of the present invention is described in detail below with reference to fig. 6. As shown in fig. 6, the method includes: acquiring point cloud data from the millimeter wave radar and the laser radar deployed on the automatic driving vehicle, the point cloud data including the first point cloud data of the millimeter wave radar and the second point cloud data of the laser radar, processing the acquired data, and placing the first point cloud data and the second point cloud data into a common coordinate system; calculating the second point cloud speed from the first point cloud speed, wherein, after the speed is obtained from the millimeter wave radar point cloud, the laser radar point cloud is matched with the millimeter wave radar point cloud in the same dimension, the speed attribute is extended according to the millimeter wave radar point cloud speed, and the calculated average speed of the remaining millimeter wave radar points is used as the laser radar point cloud speed of the non-ground points in the grid, namely the second point cloud speed; and finally obtaining the motion-compensated point cloud through the transformation matrix, wherein the transformation matrix is calculated using the running speed of the laser radar carrier and the corresponding time obtained from the relevant vehicle-mounted sensors, and all points are then transformed to obtain the motion-compensated point cloud. In this way the shape of the detected object can be accurately described.
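Read as a sequence, the flow of fig. 6 can be condensed into the self-contained sketch below; the cell size, ground height threshold, matching radius and the x-only displacement are all simplifying assumptions for illustration and not the patent's implementation.
```python
import numpy as np

def motion_compensate(lidar_xyz, lidar_t, mmw_xyz, mmw_speed,
                      ego_speed, ego_t, cell=2.0, z_ground=0.2, r=1.0):
    """Simplified end-to-end flow: grid the xy plane, take the mean
    millimeter-wave speed near the non-ground lidar centroid of each cell
    as the cell's point-cloud speed, then shift the cell by
    (cell speed - ego speed) * dt along x (planar, no rotation)."""
    def cells(xyz):                 # flatten xy grid indices into one id
        g = np.floor(xyz[:, :2] / cell).astype(int)
        return g[:, 0] * 100000 + g[:, 1]   # coarse id, fine for a sketch
    cl, cm = cells(lidar_xyz), cells(mmw_xyz)
    ground = lidar_xyz[:, 2] < z_ground
    out = lidar_xyz.copy()
    for c in np.unique(cl):
        sel = (cl == c) & ~ground
        if not sel.any():
            continue                              # ground-only cell: speed 0
        centroid = lidar_xyz[sel].mean(axis=0)
        near = cm == c
        d = np.linalg.norm(mmw_xyz[near] - centroid, axis=1)
        v = mmw_speed[near][d < r]
        v_cell = v.mean() if len(v) else 0.0      # second point cloud speed
        dt = ego_t - lidar_t[sel].mean()          # time difference
        out[sel, 0] += (v_cell - ego_speed) * dt  # s = v * t along x only
    return out
```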
Example 2
According to the embodiment of the invention, the invention further provides a processing device of the point cloud data. The device can execute the method for processing the point cloud data in the above embodiment, and the specific implementation manner and the preferred application scenario are the same as those in the above embodiment, which are not described herein again.
Fig. 7 is a schematic structural diagram of an apparatus for processing point cloud data according to an embodiment of the present invention, as shown in the figure, the apparatus includes the following parts: an acquisition module 70, a determination module 72, and a transformation module 74.
The acquiring module 70 is configured to acquire first point cloud data, second point cloud data, and an operating speed of a target vehicle during driving of the target vehicle, where the first point cloud data is obtained by sensing a first target area through a millimeter wave radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a laser radar installed on the target vehicle, and the first target area is partially overlapped with the second target area.
The determining module 72 is configured to determine a second point cloud speed corresponding to the second point cloud data based on a first point cloud speed corresponding to the first point cloud data, where the first point cloud data includes the first point cloud speed.
And the transformation module 74 is configured to transform the second point cloud data based on the second point cloud speed and the operating speed to obtain target point cloud data.
Optionally, the transformation module comprises: a generating unit, configured to generate a transformation matrix based on the second point cloud speed and the running speed; and a first transformation unit, configured to obtain the product of the transformation matrix and the second point cloud data to obtain the target point cloud data.
Optionally, the second point cloud data includes a point cloud time, and the generating unit is further configured to obtain a difference value between the second point cloud speed and the running speed to obtain a speed difference; obtain a difference value between the acquired running time of the second radar and the point cloud time to obtain a time difference; obtain the product of the speed difference and the time difference to obtain the relative motion position of the target vehicle and the second point cloud data; and generate the transformation matrix based on the relative motion position.
Optionally, the determining module includes: a dividing unit, configured to divide a preset plane to obtain a plurality of grids, the preset plane being a plane in a preset coordinate system corresponding to the first point cloud data and the second point cloud data; a first mapping unit, configured to map the first point cloud data to the plurality of grids to obtain a first point cloud corresponding to each grid; a second mapping unit, configured to map the second point cloud data to the plurality of grids to obtain a second point cloud corresponding to each grid; and a determining unit, configured to determine the second point cloud speed of the second point cloud based on the first point cloud speed of the first point cloud.
Optionally, the determining unit is further configured to determine a ground point cloud in the second point cloud for characterizing the ground; determine second point cloud speeds of other point clouds in the second point cloud except the ground point cloud based on the first point cloud speed; and determine the second point cloud speed of the ground point cloud as a preset speed.
Optionally, the determining unit is further configured to determine, based on the second point cloud, a point cloud centroid corresponding to the second point cloud; acquiring the distance between the first point cloud and the point cloud centroid; determining a target point cloud in the first point cloud based on the distance, wherein the distance between the target point cloud and the point cloud centroid is smaller than a preset distance; obtaining an average value of first point cloud speeds of the target point cloud to obtain an average speed; and determining the average speed as a second point cloud speed of other point clouds.
Optionally, the obtaining module includes: the acquisition unit is used for acquiring first perception data perceived by a first radar, second perception data perceived by a second radar and calibration parameters of the first radar and the second radar; and the second transformation unit is used for respectively transforming the first sensing data and the second sensing data to a preset coordinate system based on the calibration parameters to obtain first point cloud data and second point cloud data, and the preset coordinate system is constructed by taking a projection point from a preset position on the target vehicle to the ground as a center.
Optionally, the obtaining unit is further configured to obtain a first installation position of the first radar on the target vehicle and a second installation position of the second radar on the target vehicle; and calibrating the first radar and the second radar based on the first mounting position and the second mounting position to obtain calibration parameters.
Example 3
According to another aspect of the embodiments of the present invention, there is also provided a target vehicle including: one or more processors; storage means for storing one or more programs; when executed by one or more processors, the one or more programs cause the one or more processors to perform the point cloud data processing method of any one of the above embodiments.
Example 4
According to another aspect of the embodiments of the present invention, a computer-readable storage medium is further provided, where the computer-readable storage medium includes a stored program, and when the program runs, the apparatus where the computer-readable storage medium is located is controlled to execute the point cloud data processing method in any one of the foregoing embodiments.
Example 5
According to another aspect of the embodiments of the present invention, there is further provided a processor, where the processor is configured to execute a program, and the program, when running, executes the point cloud data processing method in any one of the foregoing embodiments.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described in detail in a certain embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed technical content can be implemented in other manners. The above-described apparatus embodiments are merely illustrative, and for example, a division of a unit may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or may not be executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk, and various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.

Claims (10)

1. A method for processing point cloud data is characterized by comprising the following steps:
in the running process of a target vehicle, acquiring first point cloud data, second point cloud data and the running speed of the target vehicle, wherein the first point cloud data is obtained by sensing a first target area through a millimeter wave radar installed on the target vehicle, the second point cloud data is obtained by sensing a second target area through a laser radar installed on the target vehicle, and the first target area is partially overlapped with the second target area;
determining a second point cloud speed corresponding to the second point cloud data based on a first point cloud speed corresponding to the first point cloud data, wherein the first point cloud data comprises the first point cloud speed;
and transforming the second point cloud data based on the second point cloud speed and the operating speed to obtain target point cloud data.
2. The method of claim 1, wherein transforming the second point cloud data based on the second point cloud speed and the operating speed to obtain target point cloud data comprises:
generating a transformation matrix based on the second point cloud speed and the operating speed;
and acquiring a product of the transformation matrix and the second point cloud data to obtain the target point cloud data.
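As a non-limiting illustration of claim 2 (not part of the claims), the following is a minimal Python sketch of taking the product of a transformation matrix and the second point cloud data, assuming the matrix is a 4x4 homogeneous transform and the points form an (N, 3) numpy array; the function name apply_transform is a hypothetical choice, not taken from the patent.

import numpy as np

def apply_transform(transform: np.ndarray, points: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transformation matrix to (N, 3) point coordinates."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4) homogeneous points
    transformed = homogeneous @ transform.T                           # row-vector convention
    return transformed[:, :3]                                         # drop the homogeneous column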
3. The method of claim 2, wherein the second point cloud data comprises a point cloud time, and wherein generating a transformation matrix based on the second point cloud speed and the operating speed comprises:
obtaining a difference value between the second point cloud speed and the running speed to obtain a speed difference;
obtaining a difference between an acquired running time of the second radar and the point cloud time to obtain a time difference;
obtaining a product of the speed difference and the time difference to obtain a relative motion position between the target vehicle and the second point cloud data;
generating the transformation matrix based on the relative motion position.
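Purely as an illustration of claim 3 (not part of the claims), the following is a minimal sketch of generating the transformation matrix, under the assumption that the second point cloud speed and the running speed are 3-vectors and that the relative motion position is used as a pure translation in a 4x4 homogeneous matrix; the parameter names are illustrative.

import numpy as np

def build_transform(point_speed: np.ndarray, vehicle_speed: np.ndarray,
                    radar_run_time: float, point_time: float) -> np.ndarray:
    """Translation-only transform from (point speed - running speed) * (run time - point cloud time)."""
    speed_diff = point_speed - vehicle_speed   # speed difference (claim 3, first step)
    time_diff = radar_run_time - point_time    # time difference (claim 3, second step)
    relative_motion = speed_diff * time_diff   # relative motion position (claim 3, third step)
    transform = np.eye(4)
    transform[:3, 3] = relative_motion         # encode the displacement as a translation
    return transform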
4. The method of claim 1, wherein determining a second point cloud velocity corresponding to the second point cloud data based on a first point cloud velocity corresponding to the first point cloud data comprises:
dividing a preset plane to obtain a plurality of grids, wherein the preset plane is a plane in a preset coordinate system corresponding to the first point cloud data and the second point cloud data;
mapping the first point cloud data with the grids to obtain a first point cloud corresponding to each grid;
mapping the second point cloud data with the plurality of grids to obtain a second point cloud corresponding to each grid;
and determining the second point cloud speed of the second point cloud based on the first point cloud speed of the first point cloud.
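As a non-authoritative illustration of claim 4 (not part of the claims), the following is a minimal sketch of dividing the preset plane into grids and binning a point cloud into those grids; the square cell size and the dictionary-based binning are illustrative assumptions, since the claim does not fix the grid shape.

import numpy as np
from collections import defaultdict

def map_to_grid(points: np.ndarray, cell_size: float = 0.5) -> dict:
    """Bin (N, >=2) points on the x-y plane into grid cells keyed by integer indices."""
    cells = defaultdict(list)
    indices = np.floor(points[:, :2] / cell_size).astype(int)
    for point_idx, (ix, iy) in enumerate(indices):
        cells[(int(ix), int(iy))].append(point_idx)   # record which points fall in each cell
    return cells

When the radar cloud and the lidar cloud are binned with the same cell size, entries sharing a key refer to the same grid of the preset plane, which is what the per-grid correspondence in claim 4 relies on.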
5. The method of claim 4, wherein determining the second point cloud speed of the second point cloud based on the first point cloud speed of the first point cloud comprises:
determining a ground point cloud in the second point cloud for characterizing the ground;
determining a second point cloud speed of other point clouds except the ground point cloud in the second point cloud based on the first point cloud speed;
and determining the second point cloud speed of the ground point cloud as a preset speed.
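As an illustration of claim 5 (not part of the claims), the following is a minimal sketch in which ground points are given a preset speed and the remaining points keep the speed derived from the first point cloud; the height-threshold ground test and the preset speed of zero are assumptions, since the claim specifies neither.

import numpy as np

def assign_speeds(lidar_points: np.ndarray, estimated_speeds: np.ndarray,
                  ground_height: float = 0.2, preset_speed: float = 0.0) -> np.ndarray:
    """Give ground points (low z) the preset speed and other points their estimated speed."""
    speeds = np.full(len(lidar_points), preset_speed)   # start with the preset speed everywhere
    non_ground = lidar_points[:, 2] > ground_height     # crude ground test by height (assumption)
    speeds[non_ground] = estimated_speeds[non_ground]   # speeds determined from the radar cloud
    return speeds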
6. The method of claim 5, wherein determining the second point cloud speed of the point clouds other than the ground point cloud in the second point cloud based on the first point cloud speed comprises:
determining a point cloud centroid corresponding to the second point cloud based on the second point cloud;
acquiring the distance between the first point cloud and the point cloud centroid;
determining a target point cloud in the first point cloud based on the distance, wherein the distance between the target point cloud and the point cloud centroid is less than a preset distance;
obtaining the average value of the first point cloud speed of the target point cloud to obtain the average speed;
and determining the average speed as a second point cloud speed of the other point clouds.
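As an illustration of claim 6 (not part of the claims), the following is a minimal sketch of the centroid, distance-gate, and averaging steps for one grid cell; the distance threshold and function name are illustrative assumptions.

import numpy as np

def speed_from_radar(lidar_cell: np.ndarray, radar_cell: np.ndarray,
                     radar_speeds: np.ndarray, max_dist: float = 1.0) -> float:
    """Average the speeds of radar points lying within max_dist of the lidar centroid."""
    centroid = lidar_cell[:, :2].mean(axis=0)                       # point cloud centroid
    dists = np.linalg.norm(radar_cell[:, :2] - centroid, axis=1)    # distances to the centroid
    near = dists < max_dist                                         # the "target point cloud" subset
    return float(radar_speeds[near].mean()) if near.any() else 0.0  # average speed (fallback: 0)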
7. The method of claim 1, wherein acquiring the first point cloud data and the second point cloud data comprises:
acquiring first perception data perceived by the first radar, second perception data perceived by the second radar and calibration parameters of the first radar and the second radar;
and respectively transforming the first sensing data and the second sensing data to a preset coordinate system based on the calibration parameters to obtain the first point cloud data and the second point cloud data, wherein the preset coordinate system is constructed by taking a projection point from a preset position on the target vehicle to the ground as a center.
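As an illustration of claim 7 (not part of the claims), the following is a minimal sketch of mapping both sensors' raw data into the common, vehicle-centered coordinate system, assuming each set of calibration parameters is expressed as a rotation matrix plus a translation vector; that parameterization is an assumption, as the claim only speaks of calibration parameters.

import numpy as np

def unify_frames(radar_raw: np.ndarray, lidar_raw: np.ndarray, radar_calib, lidar_calib):
    """Map each sensor's (N, 3) points into the vehicle-centered ground frame.
    Each calib is an (R, t) pair: a 3x3 rotation and a 3-vector translation."""
    def apply(points, calib):
        rotation, translation = calib
        return points @ rotation.T + translation
    return apply(radar_raw, radar_calib), apply(lidar_raw, lidar_calib)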
8. The method of claim 7, wherein obtaining calibration parameters for the first radar and the second radar comprises:
acquiring a first installation position of the first radar on the target vehicle and a second installation position of the second radar on the target vehicle;
calibrating the first radar and the second radar based on the first installation position and the second installation position to obtain the calibration parameters.
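As an illustration of claim 8 (not part of the claims), the following is a minimal sketch of deriving calibration parameters from a sensor's installation position, under the assumption that the mounting is described by a position vector plus yaw/pitch/roll angles; the parameterization and angle order are assumptions.

import numpy as np

def calibration_from_mounting(position, yaw: float, pitch: float, roll: float):
    """Build an (R, t) calibration pair from a mounting position (metres) and angles (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rot_z = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    rot_y = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    rot_x = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return rot_z @ rot_y @ rot_x, np.asarray(position, dtype=float)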
9. A processing apparatus for point cloud data, comprising:
an acquisition module, configured to acquire first point cloud data, second point cloud data and a running speed of a target vehicle during running of the target vehicle, wherein the first point cloud data is obtained by sensing a first target area with a first radar mounted on the target vehicle, the second point cloud data is obtained by sensing a second target area with a second radar mounted on the target vehicle, and the first target area partially overlaps the second target area;
a determining module, configured to determine a second point cloud speed corresponding to the second point cloud data based on a first point cloud speed corresponding to the first point cloud data, wherein the first point cloud data comprises the first point cloud speed;
and a transformation module, configured to transform the second point cloud data based on the second point cloud speed and the running speed to obtain target point cloud data.
10. A target vehicle, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the point cloud data processing method according to any one of claims 1 to 8.
CN202211051091.2A 2022-08-30 2022-08-30 Point cloud data processing method and device Pending CN115407304A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211051091.2A CN115407304A (en) 2022-08-30 2022-08-30 Point cloud data processing method and device

Publications (1)

Publication Number Publication Date
CN115407304A (en) 2022-11-29

Family

ID=84164316

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211051091.2A Pending CN115407304A (en) 2022-08-30 2022-08-30 Point cloud data processing method and device

Country Status (1)

Country Link
CN (1) CN115407304A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180074176A1 (en) * 2016-09-14 2018-03-15 Beijing Baidu Netcom Science And Technology Co., Ltd. Motion compensation method and apparatus applicable to laser point cloud data
US20200081119A1 (en) * 2018-09-07 2020-03-12 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for determining relative pose, device and medium
WO2020104423A1 (en) * 2018-11-20 2020-05-28 Volkswagen Aktiengesellschaft Method and apparatus for data fusion of lidar data and image data
CN110888120A (en) * 2019-12-03 2020-03-17 华南农业大学 Method for correcting laser radar point cloud data motion distortion based on integrated navigation system
CN112731450A (en) * 2020-08-19 2021-04-30 深圳市速腾聚创科技有限公司 Method, device and system for motion compensation of point cloud
CN112051575A (en) * 2020-08-31 2020-12-08 广州文远知行科技有限公司 Method for adjusting millimeter wave radar and laser radar and related device
CN112907747A (en) * 2021-03-26 2021-06-04 上海商汤临港智能科技有限公司 Point cloud data processing method and device, electronic equipment and storage medium
CN114763997A (en) * 2022-04-14 2022-07-19 中国第一汽车股份有限公司 Method and device for processing radar point cloud data acquired by vehicle and electronic equipment

Similar Documents

Publication Title
WO2022022694A1 (en) Method and system for sensing automated driving environment
EP3620823B1 (en) Method and device for detecting precision of internal parameter of laser radar
KR20190082070A (en) Methods and apparatuses for map generation and moving entity localization
US20230141421A1 (en) Point cloud motion compensation method and apparatus, storage medium, and lidar
CN111324115B (en) Obstacle position detection fusion method, obstacle position detection fusion device, electronic equipment and storage medium
US10656259B2 (en) Method for determining trajectories of moving physical objects in a space on the basis of sensor data of a plurality of sensors
CN111192295A (en) Target detection and tracking method, related device and computer readable storage medium
CN112162297B (en) Method for eliminating dynamic obstacle artifacts in laser point cloud map
CN112051575B (en) Method for adjusting millimeter wave radar and laser radar and related device
CN112927309B (en) Vehicle-mounted camera calibration method and device, vehicle-mounted camera and storage medium
CN112034431B (en) External parameter calibration method and device for radar and RTK
EP3816663B1 (en) Method, device, equipment, and storage medium for determining sensor solution
WO2024012212A1 (en) Environmental perception method, domain controller, storage medium, and vehicle
EP4198901A1 (en) Camera extrinsic parameter calibration method and apparatus
CN114035187B (en) Perception fusion method of automatic driving system
CN115451948A (en) Agricultural unmanned vehicle positioning odometer method and system based on multi-sensor fusion
CN114485698A (en) Intersection guide line generating method and system
CN113156407A (en) Vehicle-mounted laser radar external parameter combined calibration method, system, medium and equipment
CN114623823A (en) UWB (ultra wide band) multi-mode positioning system, method and device integrating odometer
CN116817891A (en) Real-time multi-mode sensing high-precision map construction method
CN113030960B (en) Vehicle positioning method based on monocular vision SLAM
CN111753901B (en) Data fusion method, device, system and computer equipment
CN110398751A (en) The system and method for map is generated based on laser radar
CN115407304A (en) Point cloud data processing method and device
CN116182905A (en) Laser radar and combined inertial navigation space-time external parameter calibration method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination