CN115423985A - Method for eliminating dynamic object point cloud in point cloud map - Google Patents

Method for eliminating dynamic object point cloud in point cloud map

Info

Publication number
CN115423985A
Authority
CN
China
Prior art keywords
point cloud
moving
point
cluster
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211177233.XA
Other languages
Chinese (zh)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AutoCore Intelligence Technology Nanjing Co Ltd
Original Assignee
AutoCore Intelligence Technology Nanjing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AutoCore Intelligence Technology Nanjing Co Ltd filed Critical AutoCore Intelligence Technology Nanjing Co Ltd
Priority to CN202211177233.XA priority Critical patent/CN115423985A/en
Publication of CN115423985A publication Critical patent/CN115423985A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20Finite element generation, e.g. wire-frame surface description, tesselation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a method for eliminating dynamic object point clouds in a point cloud map. While the vehicle runs normally, the point cloud data of the surrounding environment scanned by the laser radar and the 3D pose information of the combined navigation are collected; according to the surrounding environment information scanned by the laser radar, the motion of dynamic target point clouds is recognized and tracked and the dynamic targets are removed; the residual dynamic target point cloud clusters in the map are tracked using the time-dimension data of the map keyframes, and a point-count attribute is introduced in the target association stage of dynamic target tracking. The invention tracks the motion of dynamic target point clouds using the information of the point cloud frames in both the time and space dimensions, reducing false deletion of static targets. When data association is used to track the motion of the point cloud where the moving target is located, the ratio of the point counts of the target cluster in adjacent point cloud frames is introduced as a new association attribute, improving the tracking of moving targets using only the point cloud output by the laser radar.

Description

Method for eliminating dynamic object point cloud in point cloud map
Technical Field
The invention relates to driver-assistance technology, and in particular to a method for eliminating dynamic object point clouds in a point cloud map.
Background
With the rapid development of the automobile industry in recent years, driver-assistance technology, especially advanced driver assistance, has become a technological focal point pursued competitively by major enterprises. Major automakers race to release vehicle models carrying L2+ and L2.99-level driver-assistance systems.
At present, the realization of advanced driver assistance mainly faces technical problems in several subdivided fields: positioning, perception, planning and control. Positioning not only answers the simple question of which road the vehicle is on and which direction it is heading; for highway-level advanced driver-assistance systems it must also determine the lane the vehicle currently occupies and the vehicle's accurate lateral and longitudinal position within that lane.
The most common approach to the positioning problem is to use a prior map as a reference and match the environmental information around the vehicle scanned by the vehicle's sensors against it to obtain the vehicle's position in the prior map. The accuracy of the map directly affects later positioning accuracy, so the mapping process should restore static scene information as faithfully as possible and minimize the influence of dynamic objects on the map.
Current methods for reducing the influence of dynamic objects on a map mainly include: filtering dynamic objects with a grid map; filtering dynamic objects by computing an occupancy probability; filtering dynamic objects with the iterative closest point algorithm; and, emerging in recent years, filtering dynamic objects with models trained by deep learning.
Different from the above algorithms, the invention provides a method for eliminating dynamic object point clouds in a point cloud map that removes dynamic target point clouds from the map more thoroughly without mistakenly deleting static points in the map.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a method for eliminating dynamic object point clouds in a point cloud map.
To realize this purpose, the invention adopts the following technical scheme:
a method for eliminating dynamic object point clouds in a point cloud map comprises the following steps:
(1) While the vehicle runs normally, the data acquisition unit synchronously acquires the point cloud data of the surrounding environment scanned by the laser radar and the 3D pose information of the combined navigation;
(2) The dynamic target elimination unit identifies and tracks the motion of dynamic target point clouds and removes the dynamic targets according to the surrounding environment information scanned by the laser radar;
(3) The map generation unit generates a high-precision point cloud map of the surrounding environment from the 3D poses of the combined navigation and the point cloud frames with the dynamic targets removed.
Further, in the step (1), the 3D poses output by the combined navigation and the scan data output by the laser radar are synchronously collected to obtain a combined navigation pose sequence Position_queue_rtk and a laser radar scan data frame sequence Pointcloud_queue_lidar.
Further, the step (2) specifically includes the steps of:
(2.1) Starting from the first frame of the laser radar scan data frame sequence Pointcloud_queue_lidar, extract 5 consecutive frames of laser radar scan data into a subsequence Pointcloud_queue_submap;
(2.2) Starting from the first frame of the subsequence Pointcloud_queue_submap, extract 1 frame of laser radar scan data, run a non-ground point extraction algorithm on that point cloud frame, and store the obtained non-ground points into the non-ground point sequence Pointcloud_none_ground;
(2.3) Using the combined navigation pose sequence Position_queue_rtk, the pose information acquired synchronously with the laser radar scan data of the subsequence Pointcloud_queue_submap, and the extrinsic matrix from the combined navigation to the laser radar, splice the adjacent point cloud frames other than the current frame to obtain a local point cloud map Submap_lidar;
(2.4) Traverse each point in the non-ground point sequence Pointcloud_none_ground, and compute its nearest point in the local point cloud map Submap_lidar and the distance to that nearest point; store the points of Pointcloud_none_ground whose distance is smaller than a threshold into the possible moving point cluster Pointcloud_moving;
(2.5) Run a Euclidean clustering algorithm on the possible moving point cluster Pointcloud_moving and delete clusters whose point count is smaller than a threshold, obtaining moving target clusters Cluster_moving[i];
(2.6) Repeat steps 2.2 to 2.5 until all point clouds in the subsequence Pointcloud_queue_submap have been traversed, obtaining moving target clusters Cluster_moving[i], where i ranges from 0 to 4;
(2.7) Repeat steps 2.1 to 2.6 until all point clouds in the laser radar scan data frame sequence Pointcloud_queue_lidar have been traversed, obtaining moving target clusters Cluster_moving[i], where i ranges from 0 to n-1;
(2.8) Starting from i = 0, track the motion trajectory of each dynamic target cluster in Cluster_moving[i] to obtain the speed and motion direction of the moving target point cloud;
(2.9) Extract the moving target point cloud clusters from step 2.8 whose data association succeeded and whose speed is greater than the threshold into the point cloud clusters to be eliminated, Cluster_moving_verified[i];
(2.10) Traverse the laser radar scan data frame sequence Pointcloud_queue_lidar and remove the dynamic target point cloud clusters Cluster_moving_verified[i] from each point cloud frame Pointcloud_queue_lidar[i], obtaining the point cloud frames Pointcloud_queue_lidar_none_moving[i] with the dynamic target point clouds removed.
Further, in the step (2.8), Cluster_moving[i] and Cluster_moving[i+1] are taken starting from i = 0; combining the time information at which the point cloud frames of Cluster_moving[i] and Cluster_moving[i+1] were respectively acquired, the combined navigation pose information, and the extrinsic matrix from the combined navigation to the laser radar, the speed and motion direction of a moving target cluster in Cluster_moving[i] moving to the corresponding position in Cluster_moving[i+1] are calculated.
Further, data association is used to judge which point cloud clusters in the two frames of point cloud clusters belong to the same moving target; the data association attribute is the ratio of the point count of a moving target cluster in Cluster_moving[i+1] to the point count of a moving target cluster in Cluster_moving[i].
Further, in the step (3), using the pose sequence Position_queue_rtk, the pose information acquired synchronously with the laser radar scan data of the point cloud frames Pointcloud_queue_lidar_none_moving[i] with the dynamic target point clouds removed, and the extrinsic matrix from the combined navigation to the laser radar, the adjacent point cloud frames from i = 0 to i = n-1 are spliced to obtain a static point cloud map Map_lidar_static containing no dynamic target point clouds.
A system for eliminating dynamic object point clouds in a point cloud map comprises a laser radar and a combined navigation unit mounted on the vehicle roof, a data acquisition unit, a dynamic target elimination unit and a map generation unit; the system implements the above method for eliminating dynamic object point clouds in a point cloud map.
Compared with the prior art, the invention tracks the motion (speed and motion direction) of dynamic target point clouds using the information of the point cloud frames in both the time and space dimensions, reducing false deletion of static targets.
When data association is used to track the motion of the point cloud where the moving target is located, the ratio of the point counts of the target cluster in adjacent point cloud frames is introduced as a new association attribute, improving the tracking of moving targets using only the point cloud output by the laser radar (the accuracy of associating the same target is improved).
Drawings
FIG. 1 is a flow chart of a method for eliminating dynamic object point clouds in a point cloud map according to the invention.
Detailed Description
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples. The following examples are only for illustrating the technical solutions of the present invention more clearly, and the protection scope of the present application is not limited thereby.
The system hardware architecture used by the method for eliminating dynamic object point clouds in a point cloud map comprises a mechanical laser radar mounted on the vehicle roof and a combined navigation unit mounted on the vehicle roof. The laser radar scans to obtain the surrounding environment information, and the combined navigation acquires the 3D pose information.
The combined navigation sends a 1 Hz PPS satellite synchronization signal to the laser radar mounted on the vehicle roof; after receiving the PPS signal, the laser radar synchronizes its acquisition time to the combined navigation time (also called GPS time).
The system implementing the method for eliminating dynamic object point clouds in a point cloud map comprises a data acquisition unit, a dynamic target elimination unit and a map generation unit.
The data acquisition unit synchronously acquires the point cloud data of the surrounding environment scanned by the laser radar and the 3D pose information of the combined navigation.
The dynamic target elimination unit identifies, tracks and removes moving targets using the surrounding environment information scanned by the laser radar.
The map generation unit generates a high-precision point cloud map of the surrounding environment using the 3D poses output by the combined navigation and the point cloud frames acquired by the laser radar (with the dynamic objects eliminated).
As shown in fig. 1, the method for eliminating the dynamic object point cloud in the point cloud map of the present invention includes the following steps:
(1) The vehicle runs normally on the road to be mapped, and the data acquisition unit synchronously collects the laser radar and combined navigation data;
wherein the extrinsic matrix from the combined navigation to the laser radar is recorded;
(2) Continuously and synchronously collect the 3D poses output by the combined navigation and the scan data frames output by the laser radar, obtaining the combined navigation pose sequence Position_queue_rtk and the laser radar scan data frame sequence Pointcloud_queue_lidar;
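The synchronous acquisition of the two streams can be illustrated with a minimal sketch: after the PPS time alignment described above, each laser radar frame is paired with the combined navigation pose whose timestamp is closest. The nearest-timestamp pairing rule and all values below are illustrative assumptions; the patent only states that the streams are collected synchronously.

```python
import numpy as np

# Hypothetical timestamped streams: poses at 10 Hz, lidar frames in between.
pose_times = np.array([0.00, 0.10, 0.20, 0.30, 0.40])   # combined navigation pose timestamps (s)
lidar_times = np.array([0.05, 0.15, 0.25])              # laser radar frame timestamps (GPS time, s)

def match_pose_indices(lidar_t, pose_t):
    """For each lidar frame, return the index of the pose with the closest timestamp."""
    idx = np.searchsorted(pose_t, lidar_t)
    idx = np.clip(idx, 1, len(pose_t) - 1)
    left = idx - 1
    # pick the left neighbor when it is at least as close (ties go left)
    pick_left = (lidar_t - pose_t[left]) <= (pose_t[idx] - lidar_t)
    return np.where(pick_left, left, idx)

matched = match_pose_indices(lidar_times, pose_times)
```

In a real pipeline the matched pose would be interpolated between the two bracketing poses rather than snapped to the nearest one; nearest-neighbor pairing keeps the sketch short.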
(3) Starting from the first frame of the laser radar scan data frame sequence Pointcloud_queue_lidar, extract 5 consecutive frames of laser radar scan data into the subsequence Pointcloud_queue_submap;
(4) Starting from the first frame of the subsequence Pointcloud_queue_submap, extract 1 frame of laser radar scan data, run a non-ground point extraction algorithm on that point cloud frame, and store the obtained non-ground points into the non-ground point sequence Pointcloud_none_ground;
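The patent does not name a specific non-ground point extraction algorithm. A minimal sketch, assuming a flat ground plane at z = 0 and a plain height threshold (real systems often use RANSAC plane fitting or ring-based ground segmentation instead):

```python
import numpy as np

# Synthetic frame: 200 near-ground points plus 50 points of an object above the ground.
rng = np.random.default_rng(0)
ground = rng.uniform([-10, -10, -0.05], [10, 10, 0.05], size=(200, 3))  # near z = 0
car = rng.uniform([2, 2, 0.5], [4, 4, 1.5], size=(50, 3))               # object above ground
frame = np.vstack([ground, car])

def extract_non_ground(points, z_thresh=0.2):
    """Keep points higher than z_thresh above the (assumed z = 0) ground plane."""
    return points[points[:, 2] > z_thresh]

pointcloud_none_ground = extract_non_ground(frame)
```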
(5) Using the combined navigation pose sequence Position_queue_rtk, the pose information acquired synchronously with the laser radar scan data of the subsequence Pointcloud_queue_submap, and the extrinsic matrix from the combined navigation to the laser radar, splice the adjacent point cloud frames other than the current frame to obtain a local point cloud map Submap_lidar;
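Splicing frames into a local map amounts to transforming each neighbor frame into a common world frame through its synchronized pose and the extrinsic matrix, then concatenating. A toy sketch with assumed 4x4 homogeneous transforms (the actual matrix values come from calibration and the combined navigation):

```python
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4); T[:3, :3] = R; T[:3, 3] = t
    return T

# Assumed extrinsic: lidar mounted 1 m above the navigation origin, no rotation.
T_nav_lidar = se3(np.eye(3), np.array([0.0, 0.0, 1.0]))

def to_world(points_lidar, T_world_nav, T_nav_lidar):
    """Transform an (N, 3) lidar frame into world coordinates."""
    T = T_world_nav @ T_nav_lidar
    homo = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    return (homo @ T.T)[:, :3]

# Two neighbor frames recorded 2 m apart along x; splice them into one map
# (the current frame itself is excluded, per step (5)).
poses = [se3(np.eye(3), np.array([0.0, 0.0, 0.0])),
         se3(np.eye(3), np.array([2.0, 0.0, 0.0]))]
frames = [np.array([[1.0, 0.0, 0.0]]), np.array([[1.0, 0.0, 0.0]])]
submap_lidar = np.vstack([to_world(f, p, T_nav_lidar) for f, p in zip(frames, poses)])
```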
(6) Traverse each point in the non-ground point sequence Pointcloud_none_ground, and compute its nearest point in the local point cloud map Submap_lidar and the distance to that nearest point; store the points of Pointcloud_none_ground whose distance is smaller than the threshold (e.g. 0.1 m) into the possible moving point cluster Pointcloud_moving;
(7) Run the Euclidean clustering algorithm (cluster tolerance: Euclidean distance less than 0.1 m) on the possible moving point cluster Pointcloud_moving and delete clusters whose point count is smaller than the threshold (10 points), obtaining moving target clusters Cluster_moving[i];
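Steps (6) and (7) can be sketched as a nearest-neighbor distance test against the local map followed by region-growing Euclidean clustering. The toy data, the brute-force neighbor search, and the greedy clustering implementation are illustrative assumptions; the candidate rule mirrors the text (nearest-map distance below the 0.1 m threshold):

```python
import numpy as np

submap = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [5.02, 0.0, 0.0]])
non_ground = np.array([[0.05, 0.0, 0.0], [5.01, 0.0, 0.0], [9.0, 0.0, 0.0]])

# Distance from each non-ground point to its nearest map point (brute force).
d = np.linalg.norm(non_ground[:, None, :] - submap[None, :, :], axis=2).min(axis=1)
pointcloud_moving = non_ground[d < 0.1]          # candidate moving points, per the text

def euclidean_cluster(points, radius=0.1, min_points=1):
    """Greedy region growing over the radius graph (Euclidean clustering)."""
    unvisited, clusters = set(range(len(points))), []
    while unvisited:
        stack, cluster = [unvisited.pop()], []
        while stack:
            i = stack.pop()
            cluster.append(i)
            near = np.linalg.norm(points - points[i], axis=1) <= radius
            for j in np.flatnonzero(near):
                if j in unvisited:
                    unvisited.remove(j)
                    stack.append(int(j))
        if len(cluster) >= min_points:           # drop clusters below the point-count threshold
            clusters.append(sorted(cluster))
    return clusters

clusters = euclidean_cluster(pointcloud_moving)
```

A production system would use a KD-tree (e.g. the one in PCL or SciPy) instead of the O(N^2) brute-force search.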
(8) Repeat steps 4 to 7 until all point clouds in the subsequence Pointcloud_queue_submap have been traversed, obtaining moving target clusters Cluster_moving[i], where i ranges from 0 to 4;
(9) Repeat steps 3 to 8 until all point clouds in the laser radar scan data frame sequence Pointcloud_queue_lidar have been traversed, obtaining moving target clusters Cluster_moving[i] (i ranges from 0 to n-1);
(10) Starting from i = 0, track the motion trajectory of each dynamic target cluster in Cluster_moving[i] to obtain the speed and motion direction of the moving target point cloud;
a. Cluster_moving[i] represents the position information in 3D space of the dynamic targets in the point cloud frame at the current time;
b. Take Cluster_moving[i] and Cluster_moving[i+1], starting from i = 0; combining the time information at which the point cloud frames of Cluster_moving[i] and Cluster_moving[i+1] were respectively acquired, the combined navigation pose information, and the extrinsic matrix from the combined navigation to the laser radar, the speed and motion direction of a moving target cluster in Cluster_moving[i] moving to the corresponding position in Cluster_moving[i+1] can be calculated;
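Assuming both clusters have already been expressed in a common world frame via the poses and the extrinsic matrix, the speed and motion direction reduce to a centroid displacement over the frame time difference. A toy sketch (using the cluster centroid as the reference point is an assumption; the patent does not fix one):

```python
import numpy as np

# The same hypothetical target observed in two consecutive frames (world frame).
cluster_i  = np.array([[10.0, 0.0, 0.0], [10.2, 0.0, 0.0]])   # target at time t_i
cluster_i1 = np.array([[11.0, 0.0, 0.0], [11.2, 0.0, 0.0]])   # same target at time t_{i+1}
t_i, t_i1 = 0.0, 0.1                                          # frame timestamps (s)

def cluster_velocity(a, b, dt):
    """Velocity vector of the cluster centroid between two frames."""
    return (b.mean(axis=0) - a.mean(axis=0)) / dt

v = cluster_velocity(cluster_i, cluster_i1, t_i1 - t_i)
speed = np.linalg.norm(v)        # scalar speed in m/s
direction = v / speed            # unit motion-direction vector
```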
c. Step b requires data association to judge which point cloud clusters in the two frames of point cloud clusters belong to the same moving target;
in addition to the traditional data association attributes (speed, moving direction), the invention introduces a new data association attribute, namely the ratio of the number of points in a Cluster of a moving target in a Cluster _ moving [ i +1] to the number of points in a Cluster of a moving target in the Cluster _ moving [ i ]; and if the moving direction of the dynamic target is opposite to the moving direction of the collection vehicle, substituting the ratio into data correlation calculation, and if the moving direction of the dynamic target is the same as the moving direction of the collection vehicle, substituting the inverse of the ratio into data correlation calculation.
(11) Extract the moving target point cloud clusters from step 10 whose data association succeeded and whose speed is greater than the threshold (0.1 m/s) into the point cloud clusters to be eliminated, Cluster_moving_verified[i];
points with the speed less than the threshold value and without data association success are considered to be pseudo dynamic targets caused by scanning the same static object from different angles by the radar, and are not added with the point cloud cluster Custer _ moving _ verified [ i ] to be rejected.
(12) Traverse the laser radar scan data frame sequence Pointcloud_queue_lidar and remove the dynamic target point cloud clusters Cluster_moving_verified[i] from each point cloud frame Pointcloud_queue_lidar[i], obtaining the point cloud frames Pointcloud_queue_lidar_none_moving[i] with the dynamic target point clouds removed.
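Removing a verified cluster from a frame can be sketched as a radius-based set difference (matching by exact coordinates would be fragile against floating-point noise; the 1 mm radius is an assumption):

```python
import numpy as np

frame = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
cluster_moving_verified = np.array([[1.0, 0.0, 0.0]])   # dynamic-target points to reject

# Distance from each frame point to its nearest verified-cluster point.
d = np.linalg.norm(frame[:, None, :] - cluster_moving_verified[None, :, :],
                   axis=2).min(axis=1)
frame_none_moving = frame[d > 1e-3]    # keep points outside every verified cluster
```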
(13) Using the pose sequence Position_queue_rtk, the pose information acquired synchronously with the laser radar scan data of Pointcloud_queue_lidar_none_moving[i], and the extrinsic matrix from the combined navigation to the laser radar, splice the adjacent point cloud frames from i = 0 to i = n-1 to obtain a static point cloud map Map_lidar_static containing no dynamic target point clouds.
Compared with the prior art, the invention tracks the motion (speed and motion direction) of dynamic target point clouds using the information of the point cloud frames in both the time and space dimensions, reducing false deletion of static targets.
When data association is used to track the motion of the point cloud where the moving target is located, the ratio of the point counts of the target cluster in adjacent point cloud frames is introduced as a new association attribute, improving the tracking of moving targets using only the point cloud output by the laser radar (the accuracy of associating the same target is improved).
The applicant has described and illustrated the embodiments of the invention in detail with reference to the accompanying drawings. Those skilled in the art should understand that the above embodiments are merely preferred embodiments of the invention, and that the detailed description only helps the reader better understand the spirit of the invention rather than limiting its protection scope; on the contrary, any improvement or modification made based on the spirit of the invention shall fall within the protection scope of the invention.

Claims (7)

1. A method for eliminating dynamic object point clouds in a point cloud map is characterized by comprising the following steps:
(1) While the vehicle runs normally, a data acquisition unit synchronously acquires the point cloud data of the surrounding environment scanned by the laser radar and the 3D pose information of the combined navigation;
(2) A dynamic target elimination unit identifies and tracks the motion of dynamic target point clouds and removes the dynamic targets according to the surrounding environment information scanned by the laser radar;
(3) A map generation unit generates a high-precision point cloud map of the surrounding environment from the 3D poses of the combined navigation and the point cloud frames with the dynamic targets removed.
2. The method for eliminating dynamic object point clouds in a point cloud map according to claim 1, wherein in the step (1), the 3D poses output by the combined navigation and the scan data output by the laser radar are synchronously collected to obtain a combined navigation pose sequence Position_queue_rtk and a laser radar scan data frame sequence Pointcloud_queue_lidar.
3. The method for eliminating dynamic object point clouds in a point cloud map according to claim 2, wherein the step (2) specifically comprises the steps of:
(2.1) Starting from the first frame of the laser radar scan data frame sequence Pointcloud_queue_lidar, extract 5 consecutive frames of laser radar scan data into a subsequence Pointcloud_queue_submap;
(2.2) Starting from the first frame of the subsequence Pointcloud_queue_submap, extract 1 frame of laser radar scan data, run a non-ground point extraction algorithm on that point cloud frame, and store the obtained non-ground points into the non-ground point sequence Pointcloud_none_ground;
(2.3) Using the combined navigation pose sequence Position_queue_rtk, the pose information acquired synchronously with the laser radar scan data of the subsequence Pointcloud_queue_submap, and the extrinsic matrix from the combined navigation to the laser radar, splice the adjacent point cloud frames other than the current frame to obtain a local point cloud map Submap_lidar;
(2.4) Traverse each point in the non-ground point sequence Pointcloud_none_ground, and compute its nearest point in the local point cloud map Submap_lidar and the distance to that nearest point; store the points of Pointcloud_none_ground whose distance is smaller than a threshold into the possible moving point cluster Pointcloud_moving;
(2.5) Run a Euclidean clustering algorithm on the possible moving point cluster Pointcloud_moving and delete clusters whose point count is smaller than a threshold, obtaining moving target clusters Cluster_moving[i];
(2.6) Repeat steps 2.2 to 2.5 until all point clouds in the subsequence Pointcloud_queue_submap have been traversed, obtaining moving target clusters Cluster_moving[i], where i ranges from 0 to 4;
(2.7) Repeat steps 2.1 to 2.6 until all point clouds in the laser radar scan data frame sequence Pointcloud_queue_lidar have been traversed, obtaining moving target clusters Cluster_moving[i], where i ranges from 0 to n-1;
(2.8) Starting from i = 0, track the motion trajectory of each dynamic target cluster in Cluster_moving[i] to obtain the speed and motion direction of the moving target point cloud;
(2.9) Extract the moving target point cloud clusters from step 2.8 whose data association succeeded and whose speed is greater than the threshold into the point cloud clusters to be eliminated, Cluster_moving_verified[i];
(2.10) Traverse the laser radar scan data frame sequence Pointcloud_queue_lidar and remove the dynamic target point cloud clusters Cluster_moving_verified[i] from each point cloud frame Pointcloud_queue_lidar[i], obtaining the point cloud frames Pointcloud_queue_lidar_none_moving[i] with the dynamic target point clouds removed.
4. The method for eliminating dynamic object point clouds in a point cloud map according to claim 3, wherein in the step (2.8), Cluster_moving[i] and Cluster_moving[i+1] are taken starting from i = 0, and the speed and motion direction of a moving target cluster in Cluster_moving[i] moving to the corresponding position in Cluster_moving[i+1] are calculated by combining the time information at which Cluster_moving[i] and Cluster_moving[i+1] were respectively acquired, the combined navigation pose information and the extrinsic matrix from the combined navigation to the laser radar.
5. The method for eliminating dynamic object point clouds in a point cloud map according to claim 4, wherein data association is used to judge which point cloud clusters in the two frames of point cloud clusters belong to the same moving target; the data association attribute is the ratio of the point count of a moving target cluster in Cluster_moving[i+1] to the point count of a moving target cluster in Cluster_moving[i].
6. The method for eliminating dynamic object point clouds in a point cloud map according to claim 5, wherein in the step (3), using the pose sequence Position_queue_rtk, the pose information acquired synchronously with the laser radar scan data of Pointcloud_queue_lidar_none_moving[i], and the extrinsic matrix from the combined navigation to the laser radar, the adjacent point cloud frames from i = 0 to i = n-1 are spliced to obtain a static point cloud map Map_lidar_static containing no dynamic target point clouds.
7. A system for eliminating dynamic object point clouds in a point cloud map, characterized by comprising a laser radar and a combined navigation unit mounted on the vehicle roof, a data acquisition unit, a dynamic target elimination unit and a map generation unit; the system implements the method for eliminating dynamic object point clouds in a point cloud map according to any one of claims 1 to 6.
CN202211177233.XA 2022-09-26 2022-09-26 Method for eliminating dynamic object point cloud in point cloud map Pending CN115423985A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211177233.XA CN115423985A (en) 2022-09-26 2022-09-26 Method for eliminating dynamic object point cloud in point cloud map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211177233.XA CN115423985A (en) 2022-09-26 2022-09-26 Method for eliminating dynamic object point cloud in point cloud map

Publications (1)

Publication Number Publication Date
CN115423985A true CN115423985A (en) 2022-12-02

Family

ID=84205835

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211177233.XA Pending CN115423985A (en) 2022-09-26 2022-09-26 Method for eliminating dynamic object point cloud in point cloud map

Country Status (1)

Country Link
CN (1) CN115423985A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117607897A (en) * 2023-11-13 2024-02-27 深圳市其域创新科技有限公司 Dynamic object removing method and related device based on light projection method


Similar Documents

Publication Publication Date Title
CN111583369B (en) Laser SLAM method based on facial line angular point feature extraction
CN112162297B (en) Method for eliminating dynamic obstacle artifacts in laser point cloud map
CN111179152B (en) Road identification recognition method and device, medium and terminal
CN112362072B (en) High-precision point cloud map creation system and method in complex urban environment
CN111340855A (en) Road moving target detection method based on track prediction
CN111830953A (en) Vehicle self-positioning method, device and system
CN113516664A (en) Visual SLAM method based on semantic segmentation dynamic points
CN110487286B (en) Robot pose judgment method based on point feature projection and laser point cloud fusion
US20220398856A1 (en) Method for reconstruction of a feature in an environmental scene of a road
CN116403139A (en) Visual tracking and positioning method based on target detection
CN113640822A (en) High-precision map construction method based on non-map element filtering
CN115423985A (en) Method for eliminating dynamic object point cloud in point cloud map
CN113689393A (en) Three-dimensional target detection algorithm based on image and point cloud example matching
Zhou et al. Lane information extraction for high definition maps using crowdsourced data
CN113865581B (en) Closed scene positioning method based on multi-level map
CN114556419A (en) Three-dimensional point cloud segmentation method and device and movable platform
CN110927765B (en) Laser radar and satellite navigation fused target online positioning method
CN113227713A (en) Method and system for generating environment model for positioning
CN117011481A (en) Method and device for constructing three-dimensional map, electronic equipment and storage medium
CN111338336B (en) Automatic driving method and device
CN113762195A (en) Point cloud semantic segmentation and understanding method based on road side RSU
Saleh et al. Robust Collision Warning System based on Multi Objects Distance Estimation
CN113888713B (en) Method for recovering road surface missing points by vehicle-mounted laser point cloud data
CN111724409A (en) Target tracking method based on densely connected twin neural network
CN112200831A (en) Dense connection twin neural network target tracking method based on dynamic template

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination