WO2023179718A1 - Point cloud processing method, apparatus, device and storage medium for lidar - Google Patents

Point cloud processing method, apparatus, device and storage medium for lidar

Info

Publication number
WO2023179718A1
WO2023179718A1 PCT/CN2023/083414 CN2023083414W WO2023179718A1 WO 2023179718 A1 WO2023179718 A1 WO 2023179718A1 CN 2023083414 W CN2023083414 W CN 2023083414W WO 2023179718 A1 WO2023179718 A1 WO 2023179718A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
points
point cloud
grid
lidar
Prior art date
Application number
PCT/CN2023/083414
Other languages
English (en)
French (fr)
Inventor
王栋
夏冰冰
石拓
Original Assignee
北京一径科技有限公司
Priority date
Filing date
Publication date
Application filed by 北京一径科技有限公司 filed Critical 北京一径科技有限公司
Publication of WO2023179718A1 publication Critical patent/WO2023179718A1/zh

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section

Definitions

  • the present disclosure relates to point cloud data processing technology in lidar, and in particular, to a point cloud processing method, device, equipment and storage medium for lidar.
  • With the development of industrial intelligence, autonomous driving, robot obstacle avoidance, vehicle-road collaboration in smart cities, and surveying and mapping, the demand for 3D sensing technology, and for lidar technology in particular, is increasing.
  • when lidar is used for environmental perception, the following situation often arises: because the emitted light of the lidar has a divergence angle, the light spot it forms covers a certain area. When one light spot simultaneously illuminates the boundary between two objects that are one in front of the other and close together, the echoes generated by the two objects are superimposed. As shown in Figure 2, the dotted lines are the echoes formed separately at the boundary of the two closely spaced objects when one light spot hits both objects at the same time, and the solid line is the actual superimposed echo signal.
  • embodiments of the present disclosure provide a point cloud processing method, device, equipment and storage medium for lidar.
  • a point cloud processing method for lidar, including: dividing the lidar point cloud into different grids according to a preset angle range and resolution; traversing all points in each grid, obtaining the nearest point and the farthest point in the grid, calculating the distance difference between the farthest point and the nearest point, and performing sticking point processing when the distance difference is greater than a set threshold; and deleting the sticking points.
  • when the points in the grid are formed by reflections from different objects, the set threshold is a first set threshold; when the points in the grid are formed by reflections from the same object, the set threshold is a second set threshold; the second set threshold is greater than the first set threshold.
  • the sticking point processing includes: retaining points within a certain distance range of the nearest point and of the farthest point in the grid, and determining the remaining points as sticking points;
  • in some embodiments, the method further includes: not performing sticking point processing when the distance difference is less than or equal to the set threshold.
  • retaining points within a certain distance range of the closest point and the farthest point, and determining the remaining points as sticking points includes:
  • Points within the first set distance range of the nearest point and points within the second set distance range of the farthest point are retained, and the remaining points in the grid are determined as sticking points.
  • dividing the lidar point cloud into different grids includes:
  • the lidar point cloud is converted into a representation in a spherical coordinate system, and the point cloud represented in the spherical coordinate system is divided into different grids; wherein the number of points in each grid is greater than or equal to a set value.
  • in some embodiments, the method further includes: dividing the point cloud of the lidar into different grids according to the coordinate information of the points of the lidar point cloud.
  • in some embodiments, the method further includes: determining the area of the lidar point cloud whose distance is smaller than a third set threshold as a region of interest (ROI); correspondingly, dividing the lidar point cloud into different grids includes: dividing the point cloud contained in the ROI into different grids.
  • a point cloud processing device for lidar including:
  • a division unit, used to divide the lidar point cloud into different grids according to a preset angle range and resolution;
  • an acquisition unit, used to traverse all points in each grid and obtain the nearest point and the farthest point in the grid;
  • a sticking point processing unit, used to calculate the distance difference between the farthest point and the nearest point and to perform sticking point processing when the distance difference is greater than a set threshold; wherein, when the points in the grid are formed by reflections from different objects, the set threshold is a first set threshold; when the points in the grid are formed by reflections from the same object, the set threshold is a second set threshold; the second set threshold is greater than the first set threshold; the sticking point processing includes: retaining points within a certain distance range of the nearest point and of the farthest point, and determining the remaining points as sticking points;
  • a deletion unit, used to delete the sticking points.
  • in some embodiments, the sticking point processing unit is further used to: not perform sticking point processing when the distance difference is less than or equal to the set threshold.
  • in some embodiments, the sticking point processing unit is further used to: retain the points within the first set distance range of the nearest point and the points within the second set distance range of the farthest point, and determine the remaining points in the grid as sticking points.
  • in some embodiments, the dividing unit is further used to: convert the lidar point cloud into a representation in a spherical coordinate system, and rasterize the point cloud represented in the spherical coordinate system; wherein the number of points in each grid is greater than or equal to the set value.
  • in some embodiments, the dividing unit is further used to: divide the point cloud of the lidar into different grids according to the coordinate information of the points.
  • the device further includes:
  • a determination unit configured to determine the area whose distance is smaller than the third set threshold as the ROI of the point cloud
  • the dividing unit is also used to divide the point cloud contained in the ROI into different grids.
  • a computer-readable storage medium is provided; a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the point cloud processing method for lidar are implemented.
  • an electronic device is provided, including: a processor, and a memory for storing instructions executable by the processor, wherein the processor is configured to execute, when calling the executable instructions in the memory, the point cloud processing method for lidar proposed by the embodiments of the first aspect of the present disclosure.
  • a computer program is provided, including computer-readable code; when the computer-readable code runs on a computing processing device, it causes the computing processing device to execute the point cloud processing method for lidar proposed by the embodiments of the first aspect of the present disclosure.
  • in the embodiments of the present disclosure, the lidar point cloud is divided into different grids according to a preset angle range and resolution, and it is determined whether the distance difference between the nearest point and the farthest point in a grid is greater than the set threshold; only if it is greater is sticking point processing performed on that grid, which avoids accidental deletion.
  • when the incidence angle between the lidar and an obstacle is large, the distance difference between the nearest point and the farthest point formed in the grid is large, while the other features closely resemble those of sticking points; therefore, the distance difference between the nearest point and the farthest point of the grid can be judged first, so that grids formed at a large incidence angle between the lidar and the obstacle are excluded and accidental deletion is avoided.
  • using the distance difference between the farthest point and the nearest point in the grid to decide whether to perform sticking point processing on the grid improves the accuracy of sticking point determination and reduces the probability of erroneous deletion during sticking point removal.
  • when the points in the grid come from different objects, for example from a front object and a rear object, the first set threshold is set smaller; when the difference between the farthest point and the nearest point is less than the first set threshold, the corresponding objects are very close to each other, and although the resulting sticking points cannot be deleted, sticking points between closely spaced objects have little impact on practical applications.
  • when the points in the grid come from the same object, the second set threshold is set larger, because the distance difference between the farthest point and the nearest point in each grid increases as the incidence angle between the lidar and the detected object increases.
  • Figure 1 shows a schematic diagram of the scene where point cloud data is adhered
  • Figure 2 shows a schematic diagram of the echo signal of the laser signal where adhesion occurs
  • Figure 3 shows a schematic diagram of the point cloud where adhesion occurs in the point cloud image
  • Figure 4 is a schematic flowchart of a point cloud processing method for lidar according to an embodiment of the present disclosure
  • Figure 5 is a schematic diagram of an example of a point cloud processing method for lidar according to an embodiment of the present disclosure
  • Figure 6 is a schematic diagram of the distribution of each point in the grid according to an embodiment of the present disclosure.
  • Figure 7 is a schematic diagram of the distribution of each point in the grid according to an embodiment of the present disclosure.
  • Figure 8 is a schematic diagram of the distribution of each point in the grid according to an embodiment of the present disclosure.
  • Figure 9 shows a schematic diagram of the point cloud in which the sticking points have been deleted
  • Figure 10 is a schematic structural diagram of a point cloud processing device for lidar according to an embodiment of the present disclosure
  • FIG. 11 shows a configuration block diagram of an electronic device according to an embodiment of the present disclosure.
  • Figure 1 shows a schematic diagram of a scene in which point cloud sticking occurs. As shown in Figure 1, because the laser spot emitted by the lidar has a certain size, when one laser spot illuminates the boundary between two objects that are one in front of the other and close together (how close depends on the width of the emitted pulse), the echoes generated by the two objects are superimposed. In actual signal processing the distance is calculated from the superimposed echo signal, so the calculated distance deviates greatly from the true distance to the objects, and the size of the obstacle obtained by the perception algorithm deviates from its true value.
  • Figure 2 shows a schematic diagram of the echo signal of a laser pulse in which sticking occurs. The dotted lines show the echoes formed separately at the boundary of two closely spaced objects when one light spot illuminates both objects at the same time, and the solid line shows the actual superimposed echo signal. When the lidar calculates the object distance from the superimposed echo shown by the solid line in Figure 2, the result deviates considerably from the actual distance.
  • Figure 3 shows a schematic diagram of the point cloud where adhesion occurs in the point cloud image. As shown in Figure 3, on the complete point cloud image, point cloud adhesion appears as floating point clouds appearing between the edges of the front and rear objects in the same direction.
  • FIG. 4 is a schematic flow chart of a point cloud processing method for lidar according to an embodiment of the present disclosure. As shown in Figure 4, the point cloud processing method for lidar according to an embodiment of the present disclosure includes the following steps:
  • Step 401 Divide the lidar point cloud into different grids according to the preset angle range and resolution.
  • the sticking points are searched for and deleted in the entire point cloud of the lidar.
  • the point cloud is rasterized according to the spherical coordinate system according to the preset angle range and resolution.
  • the lidar point cloud is converted into a representation in a spherical coordinate system; according to the preset angle range and resolution, the point cloud represented in the spherical coordinate system is divided into different grids, where the number of points in each grid is greater than or equal to a set value.
  • the purpose of dividing the point cloud into grids is to search for sticking points based on the distribution characteristics of the point cloud within each grid. Since each grid contains a certain number of points, there is a high probability that a grid will contain points reflected from the front object, sticking points, and points reflected from the rear object, which improves both the accuracy and the efficiency of the search.
  • the set value here can be set to 4, which keeps the number of points in a grid at about 4. The set value may also be 5, 6, and so on; this is not limited in the embodiments of the present disclosure.
  • the size of the grid can be adjusted according to the preset angle range and the resolution of the point cloud data, so that the number of points in each sub-grid is maintained at around the set value.
  • the point cloud of the lidar can be divided into different grids according to the coordinate information of its points, that is, the grid into which each point falls is determined from its coordinate values and the point is assigned to the corresponding grid.
  • when the number of points in a grid does not reach the set number, the angular range covered by the grid can be adjusted according to the resolution of the point cloud, so that the number of points in each grid is ultimately kept at about the set value.
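  • As a concrete illustration of the grid division described above, the following Python sketch converts Cartesian lidar points to spherical coordinates and groups them into angular grid cells. The function names, the degree-based angles and the way the cell indices are computed are illustrative assumptions, not the patent's own implementation.

```python
import math
from collections import defaultdict

# Hypothetical sketch of the grid-division step described above.

def to_spherical(x, y, z):
    """Convert a Cartesian lidar point to (azimuth, elevation, distance) in degrees/metres."""
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(y, x))                      # horizontal angle
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))     # vertical angle (assumed convention)
    return azimuth, elevation, distance

def divide_into_grids(points, angle_hori_min, angle_vert_min,
                      angle_hori_resolution, angle_vert_resolution):
    """Group (x, y, z) points into angular grid cells keyed by (hori_pos, vert_pos)."""
    grids = defaultdict(list)
    for x, y, z in points:
        azimuth, elevation, distance = to_spherical(x, y, z)
        hori_pos = math.floor((azimuth - angle_hori_min) / angle_hori_resolution)
        vert_pos = math.floor((elevation - angle_vert_min) / angle_vert_resolution)
        grids[(hori_pos, vert_pos)].append((azimuth, elevation, distance))
    return grids

# With a cell size of, say, 0.4 deg x 0.4 deg one would then check that each cell
# holds at least the set value (e.g. 4 points) and otherwise pick a coarser resolution.
```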
  • Step 402: Traverse all points in each grid, obtain the nearest point and the farthest point in the grid, and calculate the distance difference between the farthest point and the nearest point. When the distance difference is greater than the set threshold, perform sticking point processing.
  • the sticking point processing includes: retaining points within a certain distance range of the closest point and the farthest point in the grid, and determining the remaining points as sticking points.
  • the ranging value of the point in each grid is obtained, and the farthest point and the closest point in the grid are determined. Calculate the distance difference between the farthest point and the closest point, and perform sticky point processing when the distance difference is greater than a set threshold.
  • specifically, when the points in the grid are formed by reflections from different objects, sticking point processing is performed if the distance difference is greater than the first set threshold; that is, in this case the set threshold is the first set threshold.
  • in practice, the first set threshold tends toward the minimum of the possible value range of the set threshold.
  • when the difference between the farthest point and the nearest point is less than the first set threshold, that is, when the corresponding objects are very close to each other, the resulting sticking points cannot be deleted; however, sticking points between closely spaced objects have little impact on practical applications, so this situation is acceptable in practice.
  • when the points in the grid are formed by reflections from the same object, sticking point processing is performed if the distance difference is greater than the second set threshold; that is, in this case the set threshold is the second set threshold, and the second set threshold is greater than the first set threshold.
  • in practice, the second set threshold tends toward the maximum of the possible value range of the set threshold.
  • the distance difference between the farthest point and the nearest point in each grid increases as the incidence angle between the lidar and the detected object increases. When the incidence angle between the lidar and an obstacle is large, the distance difference between the nearest point and the farthest point formed in the grid is large, while the other features closely resemble those of sticking points. Therefore, the distance difference between the nearest point and the farthest point of the grid can be judged first, so that grids formed at a large incidence angle between the lidar and the obstacle are excluded and accidental deletion is avoided.
  • as an example, the set threshold can lie in the range of 0.1 m to 1 m.
  • the first set threshold can be chosen with reference to the minimum of this range (0.1 m); for example, it can be set to 0.2 m, 0.35 m, and so on.
  • the second set threshold can be chosen with reference to the maximum of this range (1 m); for example, it can be set to 0.7 m, 0.8 m or 0.95 m, and so on.
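  • The per-grid gating described above can be sketched as follows. How a grid's points are attributed to one object or to different objects is not specified at this point, so the `same_object` flag and the numeric defaults are illustrative assumptions.

```python
# Minimal sketch of the per-grid gating check described above.

FIRST_SET_THRESHOLD = 0.2   # X1, chosen near the low end of the 0.1 m - 1 m range
SECOND_SET_THRESHOLD = 0.8  # X2, chosen near the high end of the 0.1 m - 1 m range

def needs_sticking_point_processing(ranges, same_object=False):
    """Return True if the grid's farthest-nearest range difference exceeds the set threshold."""
    if not ranges:
        return False
    diff = max(ranges) - min(ranges)
    threshold = SECOND_SET_THRESHOLD if same_object else FIRST_SET_THRESHOLD
    return diff > threshold

# Example range values (metres) measured within one grid cell:
print(needs_sticking_point_processing([5.02, 5.05, 5.55, 6.10]))              # True  -> process
print(needs_sticking_point_processing([5.02, 5.05, 5.08], same_object=True))  # False -> keep as-is
```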
  • when the distance difference is greater than the set threshold, the points whose distance from the nearest point lies within the first set distance range and the points whose distance from the farthest point lies within the second set distance range are found, and the points outside these ranges around the nearest point and the farthest point are determined as the sticking points.
  • the first set distance range may be the same as the second set distance range.
  • the first set distance range can be set to a range of no more than 0.06m. Points within 0.06m of the nearest point in the grid are considered normal points, and the remaining points are adhesion points.
  • the second set distance range can be set to a range of no more than 0.06m. Points within a range of 0.06m near the farthest point in the grid are considered normal points, and the remaining points are adhesion points.
  • the first set distance range may or may not be the same as the second set distance range.
  • the first set distance range can be set to a range of no more than 0.06m. Points within 0.06m of the nearest point in the grid are considered normal points, and the remaining points are adhesion points.
  • as an example, the second set distance range can be set to a range of no more than 0.08 m; points within 0.08 m of the farthest point in the grid are considered normal points, and the remaining points are considered sticking points.
  • the above-mentioned first set distance range and second set distance range are only illustrative and not limiting.
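  • A minimal sketch of the retention rule described above, using the 0.06 m and 0.08 m example ranges; the function name and return format are assumptions for illustration.

```python
# Keep points close to the nearest or the farthest point of a grid cell;
# everything else in the cell is treated as a sticking point.

def split_sticking_points(ranges, first_set_distance=0.06, second_set_distance=0.08):
    """Return (kept_indices, sticking_indices) for the range values of one grid cell."""
    nearest = min(ranges)
    farthest = max(ranges)
    kept, sticking = [], []
    for i, r in enumerate(ranges):
        near_ok = (r - nearest) <= first_set_distance    # close to the nearest point
        far_ok = (farthest - r) <= second_set_distance   # close to the farthest point
        (kept if (near_ok or far_ok) else sticking).append(i)
    return kept, sticking

# Example: A = 5.00 m (front object), B = 6.20 m (rear object), C = 5.60 m (sticking point)
print(split_sticking_points([5.00, 5.04, 5.60, 6.17, 6.20]))
# -> ([0, 1, 3, 4], [2])
```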
  • Step 403 Delete the adhesion point.
  • in the embodiments of the present disclosure, the lidar point cloud is divided into different grids according to a preset angle range and resolution, and it is determined whether the distance difference between the nearest point and the farthest point in a grid is greater than the set threshold; only if it is greater is sticking point processing performed on that grid, which avoids accidental deletion.
  • when the incidence angle between the lidar and an obstacle is large, the distance difference between the nearest point and the farthest point formed in the grid is large, while the other features closely resemble those of sticking points; therefore, the distance difference between the nearest point and the farthest point of the grid can be judged first, so that grids formed at a large incidence angle are excluded and accidental deletion is avoided.
  • using the distance difference between the farthest point and the nearest point in the grid to decide whether to perform sticking point processing on the grid improves the accuracy of sticking point determination and reduces the probability of erroneous deletion during sticking point removal.
  • when the points in the grid come from different objects, the first set threshold is set smaller; when the difference between the farthest point and the nearest point is less than the first set threshold, the corresponding objects are very close to each other, and although the resulting sticking points cannot be deleted, they have little impact on practical applications.
  • when the points in the grid come from the same object, the second set threshold is set larger, because the distance difference between the farthest point and the nearest point in each grid increases as the incidence angle between the lidar and the detected object increases.
  • obstacle size perception based on the point cloud processed according to the embodiments of the present disclosure is more accurate, which helps an autonomous vehicle plan its route according to the real obstacle sizes and avoid obstacles accurately; this greatly facilitates path planning in autonomous driving and ensures driving safety. The embodiments of the present disclosure also support first determining the areas where sticking may occur and determining and deleting sticking points only in those areas, thereby improving point cloud processing efficiency.
  • the relevant areas where lidar may produce adhesion points are first determined so that adhesion points can be determined directly in the relevant areas, thereby saving computing resources for point cloud identification and improving the processing efficiency of adhesion point identification.
  • when the distance between an obstacle and the lidar is small, for example less than 1.6 m, the pulse width of the lidar echo signal is wider, so the probability that the laser echoes of two closely spaced obstacles are superimposed increases, which leads to sticking.
  • the pulse width of echoes from far away is narrower, so the possibility that the echoes of closely spaced obstacles are superimposed is reduced.
  • therefore, in the embodiments of the present disclosure, the area whose distance is smaller than the third set threshold is determined as the region of interest (ROI) of the point cloud; the ROI is the area where sticking points are prone to occur.
  • the ROI in which sticking points may occur is determined first, so that the search for and deletion of sticking points is performed only within the ROI rather than over the entire lidar point cloud, which greatly improves the efficiency of point cloud processing. The ROI of the point cloud can therefore be selected based on distance or on other areas that may produce sticking points.
  • the third set threshold may be 1.6m. Those skilled in the art will understand that the third set threshold may also be other values. Such as 1.7m, 2.1m, etc., these are only examples.
  • the sticking that obstacles may cause is predicted from the distance of the points; for example, when a point determined from the echo signal lies at a distance of less than 1.6 m, point cloud sticking is likely to occur, so the area of the collected point cloud in which the point distance is less than 1.6 m can be determined as the ROI. In the embodiments of the present disclosure, the edge area of an obstacle may also be assigned to the ROI according to the approximate shape of the obstacle.
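  • A minimal sketch of the ROI selection described above, assuming range-annotated points and the 1.6 m example threshold; the names are illustrative.

```python
# Points whose measured range is below the third set threshold fall in the ROI where
# sticking is likely; the rest of the cloud bypasses sticking-point processing.

THIRD_SET_THRESHOLD = 1.6  # metres

def split_roi(points_with_range):
    """Split (point, range) pairs into ROI and non-ROI point lists."""
    roi, non_roi = [], []
    for point, rng in points_with_range:
        (roi if rng < THIRD_SET_THRESHOLD else non_roi).append(point)
    return roi, non_roi

# Example usage:
points = [((0.9, 0.1, 0.0), 0.91), ((3.0, 0.5, 0.0), 3.04)]
roi, non_roi = split_roi(points)   # roi holds the 0.91 m point, non_roi the 3.04 m point
```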
  • the lidar point cloud after sticking point processing is used as the valid point cloud; that is, the points contained in the ROI after removing the sticking points, together with the points outside the ROI, are determined as the valid point cloud and used for obstacle distance calculation and other data processing and analysis.
  • the grid areas that may cause adhesion can be distinguished from the grids that basically do not have adhesion, and only the grids that may have adhesion will be processed for adhesion points.
  • after the ROI is determined, the points in the grids of the ROI are not immediately treated as sticking points; instead, each grid is first judged for possible sticking points, and sticking point processing is performed only when sticking points may exist. Moreover, the sticking point processing of the embodiments of the present disclosure does not lead to the accidental deletion of non-sticking points, whereas a naive approach would misjudge some normal points as sticking points when the incidence angle between the radar and the object exceeds a certain angle.
  • as shown in Figure 6, 0 represents the lidar, 3 represents a normal object (assumed to be a plane), points A and B represent the nearest and farthest points on the same object 3 within one grid, and the lengths 1 and 2 correspond to the distances from the lidar to point A and to point B respectively. If the points within the first set distance range of the nearest point and within the second set distance range of the farthest point were directly taken as non-sticking points and all other points in the grid as sticking points, normal points could easily be deleted by mistake. When the incidence angle is small, the distance difference between points A and B is small and retaining points close to A and B causes no mis-deletion; but when the incidence angle between the radar and the object is larger than a certain angle, as shown in Figure 7, the distance difference between A and B is large and the points between A and B, for example distant ground points, would very likely be deleted by mistake.
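  • The following small calculation (with an assumed flat surface and an assumed 0.4° cell width) illustrates why a single object seen at a large incidence angle can produce a large nearest/farthest spread inside one grid cell, which is why the second set threshold is chosen near the upper end of the range.

```python
import math

# Assumed geometry, for illustration only: within one angular cell the range to a flat
# surface changes as r = d / cos(theta), so the spread grows sharply near grazing incidence.

def range_spread(distance_to_plane, incidence_deg, cell_width_deg=0.4):
    """Nearest/farthest range difference across one angular cell on a flat surface."""
    theta0 = math.radians(incidence_deg)
    theta1 = math.radians(incidence_deg + cell_width_deg)
    r0 = distance_to_plane / math.cos(theta0)
    r1 = distance_to_plane / math.cos(theta1)
    return abs(r1 - r0)

print(round(range_spread(1.5, 20.0), 3))   # ~0.004 m at a small incidence angle
print(round(range_spread(1.5, 80.0), 3))   # ~0.356 m at a large angle, comparable to the thresholds
```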
  • the embodiment of the present disclosure proposes a more reasonable point cloud processing method to address the above-mentioned situation of accidental deletion of sticky points.
  • FIG. 5 is a schematic diagram of an example of a point cloud processing method for lidar according to an embodiment of the present disclosure. As shown in Figure 5, the point cloud processing method for lidar according to an embodiment of the present disclosure includes the following processing steps:
  • Step 1 Select the ROI area from the point cloud formed by all echo signals of the lidar.
  • the ROI area can be selected based on distance or other areas that may produce adhesion points.
  • the original point cloud is divided into ROI point cloud and non-ROI point cloud.
  • the ROI area can be the area corresponding to the point cloud within a certain distance (such as less than 1.6m).
  • the purpose of dividing out the ROI is to reduce the amount of sticking point processing, that is, to process sticking points only within the ROI and not in the areas outside the ROI.
  • Step 2: Rasterize the point cloud in spherical coordinates according to the preset angle range and resolution.
  • for each point in the ROI point cloud, its coordinates (x, y, z) are converted back into spherical coordinates (element (polar angle), azimuth (azimuth angle), distance).
  • all points in the point cloud are then rasterized according to the spherical coordinate system and divided into different grids.
  • rasterization means computing, from the preset angle range and angular resolution, the horizontal and vertical indices of the grid cell in which each point lies, i.e. the horizontal and vertical serial numbers (hori_pos and vert_pos) of the grid cell of each point.
  • hori_pos = floor(azimuth - angle_hori_min) / angle_hori_resolution
  • vert_pos = floor(element - angle_vert_min) / angle_vert_resolution
  • where angle_hori_min and angle_vert_min denote the minimum horizontal and vertical angles of the points to be processed; angle_hori_resolution and angle_vert_resolution denote the horizontal and vertical rasterization resolutions respectively; and floor(arg) is the round-down function, returning the largest integer not greater than arg.
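  • The two index formulas above can be transcribed directly. The source text places the floor ambiguously; the sketch below assumes the conventional reading in which the whole quotient is floored.

```python
import math

def grid_index(azimuth, element, angle_hori_min, angle_vert_min,
               angle_hori_resolution, angle_vert_resolution):
    """Return (hori_pos, vert_pos), the column/row of the grid cell holding the point."""
    hori_pos = math.floor((azimuth - angle_hori_min) / angle_hori_resolution)
    vert_pos = math.floor((element - angle_vert_min) / angle_vert_resolution)
    return hori_pos, vert_pos

# Example: a point at azimuth 12.3 deg, polar angle 1.7 deg on an assumed 0.4 deg x 0.4 deg grid
print(grid_index(12.3, 1.7, -60.0, -12.5, 0.4, 0.4))   # -> (180, 35)
```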
  • Step 3 calculate the closest point and the farthest point within each grid. Traverse all points in each grid and get the distance to the closest point and the distance to the farthest point.
  • the grid size can be adjusted according to the radar scanning resolution (i.e. the spacing between points) to control the number of points in each grid, for example so that each grid contains at least 4 points. Because a grid contains a certain number of points, it is more likely to contain points falling on the front object, sticking points, and points falling on the rear object.
  • the points within the first set distance range of the nearest point and within the second set distance range of the farthest point in each grid are retained, and the remaining points are regarded as sticking points and deleted.
  • for grids that contain only the points of a normal object, the incidence angle of each grid is small and all of the object's points lie within a small range, so applying the sticking point processing directly retains the normal points. For cases with a large incidence angle, however, such as the scene shown in Figure 7, this processing would delete points by mistake, degrading point cloud quality and in particular losing the edge of the rear object.
  • Step 4: Determine whether the distance difference between the nearest point and the farthest point in each grid is greater than the set threshold. If it is, proceed to step 5; otherwise, the grid is considered to contain no sticking points and the sticking point processing of step 5 is not performed.
  • in the embodiments of the present disclosure, the sticking points within a grid to be processed are deleted only when the distance difference between the farthest point and the nearest point is greater than the set threshold X (whose value range is, for example, 0.1 m to 1 m).
  • for the case where the points in the same grid are formed by reflections from two objects, as shown in Figure 8, 0 represents the lidar, 3 and 4 represent the front and rear objects respectively, point A is the nearest point in the grid, point B is the farthest point in the grid, and C is the sticking point at the boundary. Processing is performed only when the distance between objects 3 and 4 exceeds a certain threshold X1; the sticking point C is then deleted, while the points within a certain range of points A and B are retained.
  • the threshold value tends to be the minimum value of the range of values that X can take.
  • X1 can take a value of 0.1m, 0.2m, or 0.25m, etc.
  • when the distance difference between the farthest point and the nearest point is greater than the first set threshold X1, the grid is considered to contain sticking points, and sticking point processing needs to be performed on the points in the grid.
  • for the case where the points in the same grid are formed by reflection from a single object, as shown in Figure 7, the difference between the farthest-point distance 1 and the nearest-point distance 2 in each grid increases as the incidence angle increases, so processing is performed only when the distance between points A and B on the same object exceeds the second set threshold X2. For this case the value of the second set threshold X2 tends toward the maximum of the possible value range of X, so that deletion of normal points is avoided as far as possible.
  • Step 5: Determine the points in each grid whose distance from the nearest point lies within the first set distance range and the points whose distance from the farthest point lies within the second set distance range; the remaining points in the grid are regarded as sticking points and need to be deleted.
  • this processing retains, in a grid that contains sticking points, the points falling on the front obstacle and the points falling on the rear obstacle, while the sticking points are deleted.
  • for a normal grid that contains no sticking points, the points are usually all within a certain distance range; that is, the distance difference between the farthest point and the nearest point is less than the first set threshold, so the sticking point deletion described above is not performed and the normal points are retained.
  • Figure 9 shows a schematic diagram of the point cloud after the sticking points have been deleted.
  • Step 6 Delete the sticky points in the grid.
  • the point cloud data of the lidar with the sticky points removed will be used as valid point cloud data.
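  • The steps above can be tied together in a compact end-to-end sketch. It reuses the assumed names and example thresholds from the earlier snippets and is not the patent's reference implementation; in particular it applies only the smaller threshold X1 and a fixed angular resolution.

```python
import math
from collections import defaultdict

# Compact, self-contained sketch of steps 1-6 above (assumed names, example values).

ROI_RANGE = 1.6          # third set threshold (m): only nearby points are checked for sticking
SET_THRESHOLD_X1 = 0.2   # set threshold X1 (m) used for the grid gating check
KEEP_NEAR = 0.06         # first set distance range (m) around the nearest point
KEEP_FAR = 0.06          # second set distance range (m) around the farthest point

def process_point_cloud(points, angle_min=(-60.0, -12.5), resolution=(0.4, 0.4)):
    """Return the valid cloud: non-ROI points plus ROI points surviving sticking-point removal."""
    records, valid = [], []
    for x, y, z in points:
        azimuth = math.degrees(math.atan2(y, x))
        elevation = math.degrees(math.atan2(z, math.hypot(x, y)))
        rng = math.sqrt(x * x + y * y + z * z)
        rec = ((x, y, z), azimuth, elevation, rng)
        (records if rng < ROI_RANGE else valid).append(rec)   # step 1: split off the ROI

    grids = defaultdict(list)                                 # step 2: rasterize the ROI points
    for rec in records:
        h = math.floor((rec[1] - angle_min[0]) / resolution[0])
        v = math.floor((rec[2] - angle_min[1]) / resolution[1])
        grids[(h, v)].append(rec)

    for cell in grids.values():                               # steps 3-6: per-grid processing
        ranges = [rec[3] for rec in cell]
        nearest, farthest = min(ranges), max(ranges)
        if farthest - nearest <= SET_THRESHOLD_X1:            # step 4: no sticking suspected
            valid.extend(cell)
            continue
        for rec in cell:                                      # steps 5-6: keep near/far bands only
            if (rec[3] - nearest) <= KEEP_NEAR or (farthest - rec[3]) <= KEEP_FAR:
                valid.append(rec)

    return [rec[0] for rec in valid]

# Example: three points on a front object (~1.0 m), one sticking point (~1.3 m),
# two points on a rear object (~1.5 m), plus one far point that bypasses the ROI.
cloud = [(1.00, 0.0, 0.0), (1.01, 0.0, 0.0), (1.02, 0.0, 0.0),
         (1.30, 0.0, 0.0), (1.50, 0.0, 0.0), (1.51, 0.0, 0.0),
         (5.00, 0.0, 0.0)]
print(len(process_point_cloud(cloud)))   # 6: the sticking point at ~1.3 m is removed
```

  • A fuller version would also apply the larger threshold X2 when a grid's points come from a single object and adapt the angular resolution so that each cell holds at least the set number of points.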
  • Figure 10 is a schematic structural diagram of a point cloud processing device for lidar according to an embodiment of the present disclosure. As shown in Figure 10, the point cloud processing device for lidar according to an embodiment of the present disclosure includes:
  • the dividing unit 80 is used to divide the lidar point cloud into different grids according to the preset angle range and resolution;
  • the acquisition unit 81 is used to traverse all point clouds in each grid and obtain the closest point and the farthest point in the grid;
  • the sticking point processing unit 82 is used to calculate the distance difference between the farthest point and the nearest point and to perform sticking point processing when the distance difference is greater than the set threshold; when the points in the grid are formed by reflections from different objects, the set threshold is the first set threshold; when the points in the grid are formed by reflections from the same object, the set threshold is the second set threshold; the second set threshold is greater than the first set threshold; the sticking point processing includes: retaining the points within a certain distance range of the nearest point and of the farthest point, and determining the remaining points as sticking points;
  • the deletion unit 83 is used to delete the adhesion point cloud.
  • in some embodiments, the sticking point processing unit 82 is further used to: not perform sticking point processing when the distance difference is less than or equal to the set threshold.
  • in some embodiments, the sticking point processing unit 82 is further used to: retain the points within the first set distance range of the nearest point and the points within the second set distance range of the farthest point, and determine the remaining points in the grid as sticking points.
  • in some embodiments, the dividing unit 80 is further used to: convert the lidar point cloud into a representation in a spherical coordinate system and rasterize the point cloud represented in the spherical coordinate system, where the number of points in each grid is greater than or equal to the set value.
  • in some embodiments, the dividing unit 80 is further used to: divide the point cloud of the lidar into different grids according to the coordinate information of the points.
  • the point cloud processing device for lidar according to the embodiment of the present disclosure also includes:
  • a determination unit (not shown in Figure 8), configured to determine the area whose distance is smaller than the third set threshold as the ROI of the point cloud;
  • the dividing unit 80 is also used to divide the point cloud contained in the ROI into different grids.
  • in exemplary embodiments, the dividing unit 80, the acquisition unit 81, the sticking point processing unit 82, the deletion unit 83, the determination unit, and so on may be implemented by one or more Central Processing Units (CPUs), Graphics Processing Units (GPUs), Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate Arrays (FPGAs), general-purpose processors, controllers, Micro Controller Units (MCUs), microprocessors, or other electronic components.
  • Embodiments of the present disclosure also describe a computer-readable storage medium. A computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the point cloud processing method for lidar of the embodiments are implemented.
  • Embodiments of the present disclosure also describe an electronic device. The electronic device includes a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to perform, when calling the executable instructions in the memory, the steps of the point cloud processing method for lidar of the embodiments.
  • Embodiments of the present disclosure also describe a computer program including computer-readable code; when the computer-readable code runs on a computing processing device, it causes the computing processing device to perform the aforementioned point cloud processing method for lidar.
  • FIG. 11 shows a configuration block diagram of an electronic device 1100 according to an embodiment of the present disclosure.
  • Electronic device 1100 may be any type of general or special purpose computing device, such as a desktop computer, laptop computer, server, mainframe computer, cloud-based computer, tablet computer, etc.
  • the electronic device 1100 includes an input/output (I/O) interface 1101 , a network interface 1102 , a memory 1104 and a processor 1103 .
  • I/O interface 1101 is a collection of components that can receive input from and/or provide output to the user.
  • I/O interface 1101 may include, but is not limited to, buttons, keyboards, keypads, LCD displays, LED displays, or other similar display devices, including display devices with touch screen capabilities that enable interaction between the user and the electronic device.
  • Network interface 1102 may include various adapters and circuitry implemented in software and/or hardware to enable communication with the lidar system using wired or wireless protocols.
  • the wired protocols are, for example, any one or more of a serial port protocol, a parallel port protocol, the Ethernet protocol, the USB protocol, or other wired communication protocols.
  • the wireless protocols are, for example, any of the IEEE 802.11 Wi-Fi protocols, cellular network communication protocols, and so on.
  • Memory 1104 includes a single memory or one or more memories or storage locations, including but not limited to random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), read-only memory (ROM), EPROM, EEPROM, flash memory, logic blocks of an FPGA, a hard disk, or any other layer of the memory hierarchy.
  • Memory 1104 may be used to store any type of instructions, software, or algorithms, including instructions 1105 for controlling the general functionality and operation of electronic device 1100 .
  • Processor 1103 controls the general operation of electronic device 1100 .
  • the processor 1103 may include, but is not limited to, a CPU, a hardware microprocessor, a hardware processor, a multi-core processor, a single-core processor, a microcontroller, an application specific integrated circuit (ASIC), a DSP, or another similar processing device capable of executing any type of instructions, algorithms, or software for controlling the operation and functionality of the electronic device 1100 of the embodiments described in this disclosure.
  • Processor 1103 may be various implementations of digital circuitry, analog circuitry, or mixed-signal (a combination of analog and digital) circuitry that perform functions in a computing system.
  • Processor 1103 may include, for example, an integrated circuit (IC), a portion or circuit of an individual processor core, an entire processor core, an individual processor, a programmable hardware device such as a field-programmable gate array (FPGA), and/or a system including multiple processors.
  • Internal bus 1106 may be used to establish communication between components of electronic device 1100 .
  • the electronic device 1100 is communicatively coupled to an autonomous vehicle including a lidar system to control the autonomous vehicle to avoid obstacles.
  • the point cloud processing method for lidar of the present disclosure may be stored on the memory 1104 of the electronic device 1100 in the form of computer-readable instructions.
  • the processor 1103 implements the point cloud processing method for lidar by reading stored computer-readable instructions.
  • electronic device 1100 is described using specific components, in alternative embodiments different components may be present in electronic device 1100 .
  • electronic device 1100 may include one or more additional processors, memory, network interfaces, and/or I/O interfaces. Additionally, one or more of the components may not be present in electronic device 1100 . Additionally, although separate components are shown in FIG. 11 , in some embodiments some or all of a given component may be integrated into one or more of the other components in electronic device 1100 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A point cloud processing method, apparatus, device and storage medium for lidar. The method includes: dividing the lidar point cloud into different grids according to a preset angle range and resolution (401); traversing all points in each grid, obtaining the nearest point and the farthest point in the grid, and performing sticking point processing (402), the sticking point processing including: retaining points within a certain distance range of the nearest point and of the farthest point in the grid, and determining the remaining points as sticking points; and deleting the sticking points (403). The point cloud data processed by the method is more reasonable, obstacles can be avoided accurately, path planning in autonomous driving is greatly facilitated, and driving safety is ensured.

Description

用于激光雷达的点云处理方法、装置、设备及存储介质
相关申请的交叉引用
本公开要求在2022年03月24日提交中国专利局、申请号为202210293601.0、名称为“用于激光雷达的点云处理方法及装置、存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本公开中。
技术领域
本公开涉及激光雷达中点云数据处理技术,尤其涉及一种用于激光雷达的点云处理方法、装置、设备及存储介质。
背景技术
随着工业智能化的发展,自动驾驶、机器人避障、智慧城市的车路协同以及测绘领域等,对3D感知技术尤其是激光雷达技术的需求日益增加。在利用激光雷达进行环境感知的情况下,经常存在这样一种情况:由于激光雷达的发射光存在发散角,形成的光斑覆盖有一定面积,当某束光斑同时照射于前后两个物体且相互之间相距较近的物体边界处时,产生的回波就会叠加到一起。如图2所示,虚线为一束光斑同时打到前后两个物体且相互之间相距较近的物体边界处分别形成的回波,实线为实际叠加后的回波信号。由于不能区分前后两个物体分别形成的回波,实际信号处理中是根据叠加后的回波信号进行距离计算,计算与物体之间的距离会存在极大的偏差,因此会导致处于同方向上的前后物体边缘间出现漂浮的虚假点云,即出现点云粘连现象。对于自动驾驶车辆而言,若前方有两个障碍物相邻较近,很容易出现点云粘连现象,自动驾驶车辆根据出现粘连的点云进行感知,将会影响自动驾驶车辆的感知路线规划等算法,影响车速调整等发生。
发明内容
有鉴于此,本公开实施例提供一种用于激光雷达的点云处理方法、装置、设备及存储介质。
根据本公开实施例的第一方面,提供一种用于激光雷达的点云处理方法,包括:
根据预设的角度范围以及分辨率,将激光雷达的点云划分为不同的栅格;
遍历每个栅格内的所有点,获取栅格内最近点以及最远点,计算所述最远点和所述 最近点的距离差值,在所述距离差值大于设定阈值的情况下,进行粘连点处理;其中,在栅格内的点来自于不同对象反射形成的情况下,所述设定阈值为第一设定阈值;在栅格内的点来自于同一对象反射形成的情况下,所述设定阈值为第二设定阈值;所述第二设定阈值大于第一设定阈值;
所述粘连点处理包括:保留栅格内最近点以及最远点的一定距离范围内的点,将其余的点确定为粘连点;
将所述粘连点删除。
在根据第一方面的一些示例性的实施例中,所述方法还包括:
在所述距离差值小于或等于所述设定阈值的情况下,不进行粘连点处理。
在根据第一方面的一些示例性的实施例中,所述保留最近点以及最远点的一定距离范围内的点,将其余的点确定为粘连点,包括:
保留最近点的第一设定距离范围内的点以及最远点的第二设定距离范围内的点,将所述栅格内的其余的点确定为粘连点。
在根据第一方面的一些示例性的实施例中,所述将激光雷达的点云划分为不同的栅格,包括:
将所述激光雷达的点云转换为以球坐标系表征;
将所述球坐标系表征的点云划分为不同的栅格化;其中,每个栅格中的点的数量大于或等于设定值。
在根据第一方面的一些示例性的实施例中,所述方法还包括:
根据所述激光雷达的点云的坐标信息,将所述激光雷达的点云划分至不同的栅格。
在根据第一方面的一些示例性的实施例中,所述方法还包括:
将激光雷达的点云中距离小于第三设定阈值的区域确定为感兴趣区域(Region of Interest,ROI);
对应地,所述将激光雷达的点云划分为不同的栅格,包括:
将ROI包含的点云划分为不同的栅格。
根据本公开实施例的第二方面,提供一种用于激光雷达的点云处理装置,包括:
划分单元,用于根据预设的角度范围以及分辨率,将激光雷达的点云划分为不同的栅格;
获取单元,用于遍历每个栅格内的所有点云,获取栅格内最近点以及最远点;
粘连点处理单元,用于计算所述最远点和所述最近点的距离差值,在所述距离差值大于设定阈值的情况下,进行粘连点处理;其中,在栅格内的点来自于不同对象反射形成的情况下,所述设定阈值为第一设定阈值;在栅格内的点来自于同一对象反射形成的情况下,所述设定阈值为第二设定阈值;所述第二设定阈值大于第一设定阈值;所述粘连点处理包括:保留最近点以及最远点的一定距离范围内的点云,将其余的点云确定为粘连点云;
删除单元,用于将所述粘连点云删除。
在根据第二方面的一些示例性的实施例中,所述粘连点处理单元,还用于:
在所述距离差值小于或等于所述设定阈值的情况下,不进行粘连点处理。
在根据第二方面的一些示例性的实施例中,所述粘连点处理单元,还用于:
保留最近点的第一设定距离范围内的点,以及最远点的第二设定距离范围内的点,将栅格内的其余的点确定为粘连点。
在根据第二方面的一些示例性的实施例中,所述划分单元,还用于:
将所述激光雷达的点云转换为以球坐标系表征;
将所述球坐标系表征的点云进行栅格化处理;其中,每个栅格中的点云的数量大于或等于设定值。
在根据第二方面的一些示例性的实施例中,所述划分单元,还用于:
根据所述激光雷达的点云的坐标信息,将所述激光雷达的点云划分至不同的栅格。
在根据第二方面的一些示例性的实施例中,所述装置还包括:
确定单元,用于将距离小于第三设定阈值的区域确定为点云的ROI;
对应地,所述划分单元,还用于:将ROI包含的点云划分为不同的栅格。
根据本公开实施例的第三方面,提供一种计算机可读存储介质,所述计算机可读存储介质内存储有计算机程序,所述计算机程序被处理器执行时实现所述的用于激光雷达的点云处理方法的步骤。
根据本公开实施例的第四方面,提供一种电子设备,所述电子设备包括:
处理器,和
用于存储处理器可执行指令的存储器,其中,所述处理器被配置为在调用存储器中的可执行指令时执行本公开第一方面实施例所提出的用于激光雷达的点云处理方法。
根据本公开实施例的第五方面,提供一种计算机程序,包括计算机可读代码,当所 述计算机可读代码在计算处理设备上运行时,导致所述计算处理设备执行本公开第一方面实施例所提出的用于激光雷达的点云处理方法。
本公开实施例中,根据预设的角度范围以及分辨率,将激光雷达的点云划分为不同的栅格,判断栅格内最近点和最远点的距离差是否大于设定阈值,如果大于,才对该栅格进行粘连点处理,这样可以避免误删。当激光雷达和障碍物入射角较大时,其在栅格中形成的最近点和最远点距离差值较大,但是其他特征和粘连点非常接近,因此,可以先对栅格最近点和最远点距离差值进行判断,排除激光雷达和障碍物入射角较大时形成的栅格,避免误删。通过栅格内最远点和最近点之间的距离差值来确定是否针对栅格进行粘连点处理,提升了粘连点确定的准确性,降低了粘连点误删除的概率。当栅格内的点来自不同对象如来自前后两个物体的情况下,第一设定阈值设置的较小,当最远点和最近点的差值小于第一设定阈值时,即对应不同对象离得特别近时,虽然产生的粘连点删除不掉,但是离得较近的对象间的粘连点对实际应用产生的影响较小;而当栅格内的点来自于同一个对象时,第二设定阈值设置的较大,这样,每个栅格内的最远点和最近点的距离差值随着激光雷达与探测对象的入射角的增大而增加,当激光雷达和障碍物入射角较大时,栅格中形成的最近点和最远点距离差值较大,其他特征和粘连点非常接近,因此,可以先对栅格最近点和最远点距离差值进行判断,排除激光雷达和障碍物入射角较大时形成的栅格,避免误删。基于本公开实施例处理后的点云进行障碍物尺寸感知时更准确,有利于自动驾驶车辆等真实的障碍物尺寸确定相应的路线规划,能准确对障碍物进行避让等,大大方便了自动驾驶中的路径规划,保证了行车安全;本公开实施例还支持先确定可能出现粘连的区域,仅对可能出现粘连的区域进行粘连点的确定并删除,从而提升点云处理效率。
附图说明
为了更清楚地说明本公开实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作一简单地介绍。显而易见地,下面描述中的附图是本公开的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1示出了点云数据发生粘连的场景示意图;
图2示出了发生粘连的激光信号的回波信号示意图;
图3示出了点云图中发生粘连的点云示意图;
图4为本公开实施例的用于激光雷达的点云处理方法的流程示意图;
图5为本公开实施例的用于激光雷达的点云处理方法的示例示意图;
图6为本公开实施例的栅格内各点分布情况示意图;
图7为本公开实施例的栅格内各点分布情况示意图;
图8为本公开实施例的栅格内各点分布情况示意图;
图9示出了点云图中删除了粘连点的点云示意图;
图10为本公开实施例的用于激光雷达的点云处理装置的组成结构示意图;
图11示出根据本公开的实施例的电子设备的配置框图。
具体实施方式
以下结合附图,详细阐明本公开实施例技术方案的实质。
图1示出了点云数据发生粘连的场景示意图,如图1所示,由于激光雷达发射的激光光斑有一定大小,当一束激光光斑照射于前后两个物体且相互之间相距较近的物体边界处(该距离和发光脉冲宽度有关),产生的回波就会叠加到一起,实际信号处理中是根据叠加后的回波信号进行距离计算,这样会导致所算结果与物体之间的距离会存在极大的偏差,导致感知算法处理得到的障碍物的尺寸偏离真实值。
图2示出了发生粘连的激光信号的回波信号示意图,如图2所示,虚线所示为一束光斑同时照射于前后两个物体且相距较近的物体边界处分别形成的回波信号,实线所示为实际叠加后的回波信号。激光雷达根据图2中实线所示叠加后的回波信号对物体的距离进行计算时,将与实际距离存在较大的偏差。实际应用中如自动驾驶车辆行进方向存在相邻较近的障碍物,会影响自动驾驶车辆的路线规划,无法正常通行。
图3示出了点云图中发生粘连的点云示意图,如图3所示,在完整的点云图上,点云粘连表现为处于同方向上的前后物体边缘间会出现漂浮的点云。
图4为本公开实施例的用于激光雷达的点云处理方法的流程示意图,如图4所示,本公开实施例的用于激光雷达的点云处理方法包括以下步骤:
步骤401,根据预设的角度范围以及分辨率,将激光雷达的点云划分为不同的栅格。
本公开实施例中,为了避免激光雷达的点云中存在粘连点,对整个激光雷达的点云进行粘连点的查找并进行删除。
具体地,根据预设的角度范围以及分辨率,将点云按照球坐标系栅格化。将激光雷 达的点云转换为以球坐标系表征;根据预设的角度范围以及分辨率,将所述球坐标系表征的点云划分为不同的栅格,其中,每个栅格中的点的数量大于或等于设定值。本公开实施例中,将点云划分栅格的目的,是依据点云的分布特点按栅格进行粘连点的查找。由于栅格包含一定数量的点,从而使栅格内较大概率会包含照射在前面物体的点云,粘连点云和照射在后面物体的点云,因此,划分为栅格可以提高粘连点的查找准确率及查找效率。这里的设定值可以设置为4,即保证一个子栅格中点云数量维持于4个左右。具体地,该设定值也可以为5、或6等。本公开实施例中不作限定。在栅格划分时,为保证每个栅格中的点数量,可以根据预设的角度范围以及点云数据的分辨率来调整栅格的大小,以使每个子栅格中的点数量维持在设定值左右。
本公开实施例中,可以根据所述激光雷达的点云的坐标信息,将所述激光雷达的点云划分至不同的栅格。即根据点云的坐标值确定其落入哪个栅格中,将该点云划分至相应栅格中。当栅格中的点云数量不满足上述设定数量的情况下,可以根据点云的分辨率,调整栅格的分配角度范围,最终使每个栅格内的点云数量均保持在设定值数量左右。
步骤402,遍历每个栅格内的所有点,获取栅格内最近点以及最远点,计算所述最远点和所述最近点的距离差值,在所述距离差值大于设定阈值的情况下,进行粘连点处理。
这里,所述粘连点处理包括:保留栅格内最近点以及最远点的一定距离范围内的点,将其余的点确定为粘连点。
本公开实施例中,针对所划分后的每个栅格,获取每个栅格中的点的测距值,确定栅格中的最远点和最近点。计算所述最远点和所述最近点的距离差值,在所述距离差值大于设定阈值的情况下,进行粘连点处理。
具体地,在栅格内的点来自于不同对象反射形成时,在所述距离差值大于第一设定阈值的情况下,进行粘连点处理。即在栅格内的点来自于不同对象反射形成的情况下,所述设定阈值为第一设定阈值。
在具体实施过程中,第一设定阈值趋向于设定阈值可取值范围的最小值。在此情形时,当最远点和最近点的差值小于第一设定阈值时,即对应不同对象离得特别近时,虽然产生的粘连点删除不掉,但是离得较近的对象间的粘连点对实际应用产生的影响较小,因此这种情况在实际应用是可以接受的。
在栅格内的点来自于同一对象反射形成时,在所述距离差值大于第二设定阈值的情况下,进行粘连点处理。即在栅格内的点来自于同一对象反射形成的情况下,所述设定 阈值为第二设定阈值;其中,所述第二设定阈值大于第一设定阈值。
在具体实施过程中,第二设定阈值趋向于设定阈值可取值范围的最大值。每个栅格内的最远点和最近点的距离差随着激光雷达与探测对象的入射角的增大而增加,当激光雷达和障碍物入射角较大时,其在栅格中形成的最近点和最远点距离差值较大,但是其他特征和粘连点非常接近,因此,可以先对栅格最近点和最远点距离差值进行判断,排除激光雷达和障碍物入射角较大时形成的栅格,避免误删。
作为一种示例,设定阈值可以位于0.1m~1m的范围,第一设定阈值可以参考设定阈值的最小值如0.1m来设置,如可以设置为0.2m、0.35m等,第二设定阈值可以参考设定阈值的最大值如1m来设置,如可以设置为0.7m、0.8m或0.95m等。
在距离差值大于设定阈值的情况下,查找出与所述最近点距离位于第一设定距离范围内的点云,以及与所述最远点距离位于第二设定距离范围内的点,将所查找的最近点以及最远点的一定距离范围内的点之外的点确定为所述粘连点。
本公开实施例中,第一设定距离范围可以与第二设定距离范围相同。例如,第一设定距离范围可以设置为不超过0.06m的范围,当栅格内的最近点的附近0.06m范围内的点,认为是正常点,其余的点则为粘连点。作为一种示例,第二设定距离范围可以设置为不超过0.06m的范围,当栅格内的最远点附近0.06m范围内的点,认为是正常点,其余的点则为粘连点。本领域技术人员应当理解,上述第一设定距离范围、第二设定距离范围仅为示例性说明,并非是限定。
本公开实施例中,第一设定距离范围可以与第二设定距离范围也可以不相同。例如,第一设定距离范围可以设置为不超过0.06m的范围,当栅格内的最近点的附近0.06m范围内的点,认为是正常点,其余的点则为粘连点。作为一种示例,第二设定距离范围可以设置为不超过0.08m的范围,当栅格内的最远点附近0.08m范围内的点云,认为是正常点云,其余的点则为粘连点。本领域技术人员应当理解,上述第一设定距离范围、第二设定距离范围仅为示例性说明,并非是限定。
步骤403,将所述粘连点删除。
本公开实施例中,根据预设的角度范围以及分辨率,将激光雷达的点云划分为不同的栅格,判断栅格内最近点和最远点的距离差是否大于设定阈值,如果大于,才对该栅格进行粘连点处理,这样可以避免误删。当激光雷达和障碍物入射角较大时,其在栅格中形成的最近点和最远点距离差值较大,但是其他特征和粘连点非常接近,因此,可以 先对栅格最近点和最远点距离差值进行判断,排除激光雷达和障碍物入射角较大时形成的栅格,避免误删。通过栅格内最远点和最近点之间的距离差值来确定是否针对栅格进行粘连点处理,提升了粘连点确定的准确性,降低了粘连点误删除的概率。当栅格内的点来自不同对象如来自前后两个物体的情况下,第一设定阈值设置的较小,当最远点和最近点的差值小于第一设定阈值时,即对应不同对象离得特别近时,虽然产生的粘连点删除不掉,但是离得较近的对象间的粘连点对实际应用产生的影响较小;而当栅格内的点来自于同一个对象时,第二设定阈值设置的较大,这样,每个栅格内的最远点和最近点的距离差值随着激光雷达与探测对象的入射角的增大而增加,当激光雷达和障碍物入射角较大时,栅格中形成的最近点和最远点距离差值较大,其他特征和粘连点非常接近,因此,可以先对栅格最近点和最远点距离差值进行判断,排除激光雷达和障碍物入射角较大时形成的栅格,避免误删。基于本公开实施例处理后的点云进行障碍物尺寸感知时更准确,有利于自动驾驶车辆等真实的障碍物尺寸确定相应的路线规划,能准确对障碍物进行避让等,大大方便了自动驾驶中的路径规划,保证了行车安全;本公开实施例还支持先确定可能出现粘连的区域,仅对可能出现粘连的区域进行粘连点的确定并删除,而提升点云处理效率。
本公开实施例中,为提升点云处理效率,还可以对可能出现点云粘连的区域进行预判断,将可能出现点云粘连的区域尽可能确定出,并将不同区域内的粘连点进行删除。
本公开实施例中,首先确定激光雷达可能产生粘连点的相关区域,以便直接在相关区域进行粘连点的判断,以节省点云识别的运算资源,提升粘连点识别的处理效率。当障碍物与激光雷达之间的距离较近如小于1.6m的情况下,激光雷达的回波信号的脉宽较宽,两个距离较近的障碍物的激光回波信号叠加到一起的概率增加,导致产生粘连现象。而距离远的回波脉宽较窄,所以距离较近的障碍物的回波叠加到一起的可能性会降低,因此,本公开实施例中,将距离小于第三设定阈值的区域确定为点云的感兴趣区域(Region Of Interest,ROI),ROI即为容易出现粘连点的区域。本公开实施例中,先确定出可能产生粘连点的ROI区域,以便仅在ROI区域中进行粘连点的查找及删除等处理,不必对整个激光雷达的所有点云进行粘连点的查找,从而大大提升点云处理的效率。因此,点云的ROI区域可以根据距离或者其他可能产生粘连点的区域进行选取,通过确定出点云的ROI区域,可以将激光雷达的原始点云分为ROI点云和非ROI点云。这里,第三设定阈值可以为1.6m,本领域技术人员应当理解,该第三设定阈值也可以是其他数值 如1.7m、2.1m等,仅为示例性说明。
本公开实施例中,根据点云的距离确定障碍物可能导致的粘连现象,如当根据回波信号确定的点所在距离小于1.6m的范围内时,容易产生点云粘连的现象,可以将所采集的点云中,点距离小于1.6m的点所在区域均确定为ROI区域。本公开实施例中,也可以根据障碍物的大致形状,将障碍物的边缘区域划分为ROI区域等。
针对ROI区域中的栅格,确定栅格内的最近点和最远点,根据最远点和最近点之间的距离差值,确定是否针对该栅格进行粘连点处理。将进行了粘连点处理后的激光雷达的点云作为有效点云,即将去除粘连点后的ROI包含的点云及非ROI中的点云确定为有效点云,进行障碍物距离运算及其他数据处理分析等。
本公开实施例中,在所述距离差值小于或等于所述设定阈值的情况下,不进行粘连点处理。通过设定阈值,即可将可能产生粘连的栅格区域和基本不会产生粘连现象的栅格作出区分,仅对可能出现粘连现象的栅格进行粘连点的处理。
以下通过具体示例,进一步阐明本公开实施例的技术方案的实质。
本公开实施例中,当确定出ROI后,并不会对ROI中的栅格中的点进行粘连点处理,而是需要对栅格中的点进行粘连点判断,仅在可能存在粘连点的情况下,再进行粘连点处理。并且,本公开实施例的粘连点处理方式不会导致对非粘连点的误删的情况,例如当雷达与物体间的入射角度大于一定角度时,此时会将部分正常点云误判为粘连点,如图6所示,0表示激光雷达,3表示一个正常物体(假设为一个平面),A点和B点表示一个栅格内同一物体3上的最近点和最远点,1和2的长度分别对应激光雷达到A点和到B点的距离,若直接依据最近点第一设定距离范围内的点以及最远点的第二设定距离范围内点为非粘连点,而栅格内的其他点为粘连点,很可能会导致正常点的误删情况。例如,如果入射角较小,则A点和B点的距离差值较小,保留最近点A和最远点B一定距离内的点不会造成误删除,但是如果雷达和物体的入射角大于一定角度,例如如图7所示,在A点和B点的距离差值较大的情况下,A、B之间的点很可能被误删除,例如远处的地面点由于入射角较大会被误删除等。
本公开实施例正是针对上述的粘连点误删除的情况,提出一种更合理的点云处理方法。
图5为本公开实施例的用于激光雷达的点云处理方法的示例示意图,如图5所示,本公开实施例的用于激光雷达的点云处理方法包括以下处理步骤:
步骤1,对激光雷达所有的回波信号形成的点云选取ROI区域,该ROI区域可以根据距离或者其他可能产生粘连点的区域选取,将原始点云分为ROI点云和非ROI点云。
由于距离远的回波脉宽较窄,所以两个物体的回波叠加到一起的可能性会降低,因此ROI区域可以为一定距离(如小于1.6m)范围内的点云对应的区域。本公开实施例中,划分ROI区域的目的,是降低粘连点处理的效率,即仅对ROI区域进行粘连点云的处理,而对ROI区域之外的其他区域不进行粘连点云的处理。
作为一种实现方式,也可以不必区分ROI区域,而对激光雷达的点云直接进行粘连点云的处理。
步骤2,根据预设的角度范围和分辨率对点云按极坐标栅格化。对于ROI区域的点云的每一个点云中的点云坐标(x,y,z),反算为球坐标系的(element(极角),azimuth(方位角),distance(距离)),然后根据预设的角度范围以及分辨率,将点云中所有点按照球坐标系进行栅格化划分,划分为不同的栅格。
栅格化是指根据预设的角度范围和角度分辨率分别计算得到每个点所处的栅格的横纵坐标序号,得到每个点所处栅格的水平和垂直序号(hori_pos和vert_pos)。
点云的栅格单元对应的坐标确定示例如下:
hori_pos=floor(azimuth-angle_hori_min)/angle_hori_resolution
vert_pos=floor(element-angle_vert_min)/angle_vert_resolution
其中,angle_hori_min,angle_vert_min,表示待处理点最小的水平和垂直角度;angle_hori_resolution,angle_vert_resolution分别表示栅格化水平和垂直的分辨率;Floor(arg)为向下取整函数,返回不大于arg的最大整数值。
步骤3,计算每个栅格内的最近点和最远点。遍历每个栅格内的所有点,得到最近点的距离以及最远点的距离。
本公开实施例中,可以根据雷达扫描分辨率(即点之间的间距)调整栅格大小,从而对栅格内点的数量进行控制,例如使栅格中点的数量至少为4,由于包含一定数量的点从而使栅格内较大概率会包含照射在前面物体的点,粘连点和照射于后面物体的点。保留每个栅格中最近点第一设定距离范围内的点和最远点第二设定距离范围内的点,其余点视为粘连点而删除。对于只包含正常物体点云的栅格,由于每个栅格的入射角度较小,所有物体的点云都处在一定范围内,因此直接进行粘连点处理,可以保留正常的点云。但是对于入射角度较大的情况,如图7所示的场景,这种粘连点处理方法会存在点误删 的情况,导致点云质量损失,尤其是会导致后物体边缘缺失。
步骤4,判断每个栅格内最近点和最远点的距离差值是否大于设定阈值,如果栅格内最近点和最远点的距离差值大于设定阈值,执行步骤5,否则,认为该栅格内不包含粘连点,不执行步骤5的粘连点处理。
本公开实施例中,对于待处理的栅格,只有当最远点和最近点的距离差值大于设定阈值X(X的取值范围为例如0.1m~1m)时,才进行栅格内的粘连点的删除处理。
具体针对以下两种情形:
对于同一栅格内的点为两个物体反射形成的情况下,如图8所示,0表示激光雷达,3和4分别为前后两个物体,A点为栅格内的最近点,B点为栅格内的最远点,C为边界处的粘连点,只有当3和4两个物体的距离超过一定阈值X1时才会处理,将粘连点C删除,保留距离在A,B点一定范围内的点。对于此情形,阈值取值趋向于X可取值范围的最小值。
在此情形时,当最远点和最近点的距离差值小于第一设定阈值X1时,即对应两个物体离得特别近时产生的粘连点删除不掉,但是这种情况在实际应用是可以接受的,两个离得较近的物体间的粘连点对实际应用产生的影响较小。作为一种示例,X1可以取值为0.1m、0.2m、或0.25m等。而当最远点和最近点的距离差值大于第一设定阈值X1时,则认为栅格中存在粘连点现象,需要对该栅格内的点进行粘连点处理。
对于同一栅格内的点云为一个物体反射形成的情况,如图7所示,因为每个栅格内的最远点距离1和最近点距离2的距离差随着入射角的增大而增加,只有同一个物体上两个A点和B点距离超过第二设定阈值X2时才会处理。对于此情形,第二设定阈值X2的取值趋向于X可取值范围的最大值。以尽可能避免正常点云被删除。
由于此时阈值X2取值较大,所以对于图6来说,由于通常每个栅格的角度设置的角度较小,所以每个栅格内的最远点距离1和最近点距离2的距离差都小于该阈值,所以这些情况不进行处理,从而相比原方法不会造成误删.
步骤5,确定每个栅格中与最近点的距离位于第一设定距离范围内的点,及与最远点的距离位于第二设定距离范围内的点,栅格中其余点视为粘连点,需要进行删除。
本公开实施例中的这种处理方式会将包含粘连点的栅格中照射于前面障碍物的点和照射于在后面障碍物的点保留,粘连点被删除。而对于未包含粘连点的正常栅格,正常栅格中的点通常都处在一定距离范围内;即最远点的距离和最近点的距离差值小于上述 第一设定阈值,不执行上述粘连点删除操作,从而可以保留正常的点云。图9示出了删除了粘连点后的点云示意图,如图9所示。
步骤6,删除栅格中的粘连点。将删除粘连点的激光雷达的点云数据作为有效点云数据。
图10为本公开实施例的用于激光雷达的点云处理装置的组成结构示意图,如图10所示,本公开实施例的用于激光雷达的点云处理装置包括:
划分单元80,用于根据预设的角度范围以及分辨率,将激光雷达的点云划分为不同的栅格;
获取单元81,用于遍历每个栅格内的所有点云,获取栅格内最近点以及最远点;
粘连点处理单元82,用于计算所述最远点和所述最近点的距离差值,在所述距离差值大于设定阈值的情况下,进行粘连点处理;在栅格内的点来自于不同对象反射形成的情况下,所述设定阈值为第一设定阈值;在栅格内的点来自于同一对象反射形成的情况下,所述设定阈值为第二设定阈值;所述第二设定阈值大于第一设定阈值;所述粘连点处理包括:保留最近点以及最远点的一定距离范围内的点云,将其余的点云确定为粘连点云;
删除单元83,用于将所述粘连点云删除。
本公开实施例中,作为一种实现方式,所述粘连点处理单元82,还用于:
在所述距离差值小于或等于所述设定阈值的情况下,不进行粘连点处理。
在一些示例性的实施例中,所述粘连点处理单元82,还用于:
保留最近点的第一设定距离范围内的点,以及最远点的第二设定距离范围内的点,将栅格内的其余的点确定为粘连点。
在一些示例性的实施例中,所述划分单元80,还用于:
将所述激光雷达的点云转换为以球坐标系表征;
将所述球坐标系表征的点云进行栅格化处理;其中,每个栅格中的点云的数量大于或等于设定值。
在一些示例性的实施例中,所述划分单元80,还用于:
根据所述激光雷达的点云的坐标信息,将所述激光雷达的点云划分至不同的栅格。
在图8所示的用于激光雷达的点云处理装置的基础上,本公开实施例的用于激光雷达的点云处理装置还包括:
确定单元(图8中未示出),用于将距离小于第三设定阈值的区域确定为点云的ROI;
对应地,所述划分单元80,还用于:将ROI包含的点云划分为不同的栅格。
在示例性实施例中,划分单元80、获取单元81、粘连点处理单元82、删除单元83、确定单元等可以被一个或多个中央处理器(CPU,Central Processing Unit)、图形处理器(GPU,Graphics Processing Unit)、应用专用集成电路(ASIC,Application Specific Integrated Circuit)、DSP、可编程逻辑器件(PLD,Programmable Logic Device)、复杂可编程逻辑器件(CPLD,Complex Programmable Logic Device)、现场可编程门阵列(FPGA,Field-Programmable Gate Array)、通用处理器、控制器、微控制器(MCU,Micro Controller Unit)、微处理器(Microprocessor)、或其他电子元件实现。
在本公开实施例中,图10示出的用于激光雷达的点云处理装置中各个单元执行操作的具体方式已经在有关该方法的实施例中进行了详细描述,此处将不做详细阐述说明。
本公开实施例还记载了一种计算机可读存储介质,所述计算机可读存储介质内存储有计算机程序,所述计算机程序被处理器执行时实现所述实施例的用于激光雷达的点云处理方法的步骤。
本公开实施例还记载了一种电子设备,所述电子设备包括:处理器和用于存储处理器可执行指令的存储器,其中,所述处理器被配置为在调用存储器中的可执行指令时,能够执行所述实施例的用于激光雷达的点云处理方法的步骤。
本公开实施例还记载了一种计算机程序,包括计算机可读代码,当所述计算机可读代码在计算处理设备上运行时,导致所述计算处理设备执行前述的用于激光雷达的点云处理方法。
图11示出根据本公开的实施例的电子设备1100的配置框图。电子设备1100可为任何类型的通用或专用计算设备,诸如台式计算机、膝上型计算机、服务器、大型计算机、基于云的计算机、平板计算机等。如图11所示,电子设备1100包括输入输出(Input/Output,I/O)接口1101、网络接口1102、存储器1104和处理器1103。
I/O接口1101是可以从用户接收输入和/或向用户提供输出的组件的集合。I/O接口1101可以包括但不限于按钮、键盘、小键盘、LCD显示器、LED显示器或其它类似的显示设备,包括具有触摸屏能力使得能够进行用户和电子设备之间的交互的显示设备。
网络接口1102可以包括各种适配器以及以软件和/或硬件实现的电路系统,以便能够使用有线或无线协议与激光雷达系统通信。有线协议例如是串口协议、并口协议、以太 网协议、USB协议或其它有线通信协议中的任何一种或多种。无线协议例如是任何IEEE802.11Wi-Fi协议、蜂窝网络通信协议等。
存储器1104包括单个存储器或一个或多个存储器或存储位置,包括但不限于随机存取存储器(RAM)、动态随机存取存储器(DRAM)、静态随机存取存储器(SRAM)、只读存储器(ROM)、EPROM、EEPROM、闪存、FPGA的逻辑块、硬盘或存储器层次结构的任何其他各层。存储器1104可以用于存储任何类型的指令、软件或算法,包括用于控制电子设备1100的一般功能和操作的指令1105。
处理器1103控制电子设备1100的一般操作。处理器1103可以包括但不限于CPU、硬件微处理器、硬件处理器、多核处理器、单核处理器、微控制器、专用集成电路(ASIC)、DSP或其他类似的处理设备,能够执行根据本公开中描述的实施例的用于控制电子设备1100的操作和功能的任何类型的指令、算法或软件。处理器1103可以是在计算系统中执行功能的数字电路系统、模拟电路系统或混合信号(模拟和数字的组合)电路系统的各种实现。处理器1103可以包括例如诸如集成电路(IC)、单独处理器核心的部分或电路、整个处理器核心、单独的处理器、诸如现场可编程门阵列(FPGA)的可编程硬件设备、和/或包括多个处理器的系统。
可以使用内部总线1106来建立电子设备1100的组件之间的通信。
电子设备1100通信耦接到包括激光雷达系统的自动驾驶车辆,以控制自动驾驶车辆对障碍物进行避让。例如,可以将本公开的用于激光雷达的点云处理方法以计算机可读指令的形式存储在电子设备1100的存储器1104上。处理器1103通过读取所存储的计算机可读指令来实施用于激光雷达的点云处理方法。
尽管使用特定组件来描述电子设备1100,但是在替选实施例中,电子设备1100中可以存在不同的组件。例如,电子设备1100可以包括一个或多个附加处理器、存储器、网络接口和/或I/O接口。另外,电子设备1100中可能不存在组件的一个或多个。另外,尽管在图11中示出单独的组件,但是在一些实施例中,给定组件的一些或全部可以集成到电子设备1100中的其他组件中的一个或多个中。
应理解,说明书通篇中提到的“一个实施例”或“一实施例”意味着与实施例有关的特定特征、结构或特性包括在本公开的至少一个实施例中。因此,在整个说明书各处出现的“在一个实施例中”或“在实施例中”未必一定指相同的实施例。此外,这些特定的特征、结构或特性可以任意适合的方式结合在一个或多个实施例中。应理解,在本 公开的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本公开实施例的实施过程构成任何限定。上述本公开实施例序号仅仅为了描述,不代表实施例的优劣。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
在本公开所提供的几个实施例中,应该理解到,所揭露的设备和方法,可以通过其它的方式实现。以上所描述的设备实施例仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,如:多个单元或组件可以结合,或可以集成到另一个系统,或一些特征可以忽略,或不存在。
上述作为分离部件说明的单元可以是、或也可以不是物理上分开的,作为单元显示的部件可以是、或也可以不是物理单元;可以根据实际的需要选择其中的部分或全部单元来实现本实施例方案的目的。
以上所述,仅为本公开的实施方式,但本公开的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本公开揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本公开的保护范围之内。因此,本公开的保护范围应以所述权利要求的保护范围为准。

Claims (15)

  1. A point cloud processing method for lidar, characterized in that the method comprises:
    dividing the lidar point cloud into different grids according to a preset angle range and resolution;
    traversing all points in each grid, obtaining the nearest point and the farthest point in the grid, calculating the distance difference between the farthest point and the nearest point, and performing sticking point processing when the distance difference is greater than a set threshold; wherein, when the points in the grid are formed by reflections from different objects, the set threshold is a first set threshold; when the points in the grid are formed by reflections from the same object, the set threshold is a second set threshold; the second set threshold is greater than the first set threshold;
    the sticking point processing comprising: retaining points within a certain distance range of the nearest point and of the farthest point in the grid, and determining the remaining points as sticking points; and
    deleting the sticking points.
  2. The method according to claim 1, characterized in that the method further comprises:
    not performing sticking point processing when the distance difference is less than or equal to the set threshold.
  3. The method according to claim 1 or 2, characterized in that retaining points within a certain distance range of the nearest point and of the farthest point and determining the remaining points as sticking points comprises:
    retaining points within a first set distance range of the nearest point and points within a second set distance range of the farthest point, and determining the remaining points in the grid as sticking points.
  4. The method according to claim 1 or 2, characterized in that dividing the lidar point cloud into different grids comprises:
    converting the lidar point cloud into a representation in a spherical coordinate system; and
    dividing the point cloud represented in the spherical coordinate system into different grids, wherein the number of points in each grid is greater than or equal to a set value.
  5. The method according to claim 1 or 2, characterized in that the method further comprises:
    dividing the lidar point cloud into different grids according to the coordinate information of the lidar point cloud.
  6. The method according to claim 1 or 2, characterized in that the method further comprises:
    determining the area of the lidar point cloud whose distance is smaller than a third set threshold as a region of interest (ROI);
    correspondingly, dividing the lidar point cloud into different grids comprises:
    dividing the point cloud contained in the ROI into different grids.
  7. A point cloud processing apparatus for lidar, characterized in that the apparatus comprises:
    a dividing unit, configured to divide the lidar point cloud into different grids according to a preset angle range and resolution;
    an acquisition unit, configured to traverse all points in each grid and obtain the nearest point and the farthest point in the grid;
    a sticking point processing unit, configured to calculate the distance difference between the farthest point and the nearest point and to perform sticking point processing when the distance difference is greater than a set threshold; wherein, when the points in the grid are formed by reflections from different objects, the set threshold is a first set threshold; when the points in the grid are formed by reflections from the same object, the set threshold is a second set threshold; the second set threshold is greater than the first set threshold; the sticking point processing comprising: retaining points within a certain distance range of the nearest point and of the farthest point, and determining the remaining points as sticking points; and
    a deletion unit, configured to delete the sticking points.
  8. The apparatus according to claim 7, characterized in that the sticking point processing unit is further configured to:
    not perform sticking point processing when the distance difference is less than or equal to the set threshold.
  9. The apparatus according to claim 7 or 8, characterized in that the sticking point processing unit is further configured to:
    retain the points within a first set distance range of the nearest point and the points within a second set distance range of the farthest point, and determine the remaining points in the grid as sticking points.
  10. The apparatus according to claim 7 or 8, characterized in that the dividing unit is further configured to:
    convert the lidar point cloud into a representation in a spherical coordinate system; and
    rasterize the point cloud represented in the spherical coordinate system, wherein the number of points in each grid is greater than or equal to a set value.
  11. The apparatus according to claim 7 or 8, characterized in that the dividing unit is further configured to:
    divide the lidar point cloud into different grids according to the coordinate information of the lidar point cloud.
  12. The apparatus according to claim 7 or 8, characterized in that the apparatus further comprises:
    a determination unit, configured to determine the area whose distance is smaller than a third set threshold as the ROI of the point cloud;
    correspondingly, the dividing unit is further configured to: divide the point cloud contained in the ROI into different grids.
  13. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the point cloud processing method for lidar according to any one of claims 1 to 6 are implemented.
  14. An electronic device, characterized in that the electronic device comprises:
    a processor, and
    a memory for storing instructions executable by the processor, wherein the processor is configured to execute, when calling the executable instructions in the memory, the point cloud processing method for lidar according to any one of claims 1 to 6.
  15. A computer program, comprising computer-readable code which, when run on a computing processing device, causes the computing processing device to execute the point cloud processing method for lidar according to any one of claims 1 to 6.
PCT/CN2023/083414 2022-03-24 2023-03-23 用于激光雷达的点云处理方法、装置、设备及存储介质 WO2023179718A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210293601.0 2022-03-24
CN202210293601.0A CN114384492B (zh) 2022-03-24 2022-03-24 用于激光雷达的点云处理方法及装置、存储介质

Publications (1)

Publication Number Publication Date
WO2023179718A1 true WO2023179718A1 (zh) 2023-09-28

Family

ID=81204938

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/083414 WO2023179718A1 (zh) 2022-03-24 2023-03-23 用于激光雷达的点云处理方法、装置、设备及存储介质

Country Status (2)

Country Link
CN (1) CN114384492B (zh)
WO (1) WO2023179718A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114384492B (zh) * 2022-03-24 2022-06-24 北京一径科技有限公司 用于激光雷达的点云处理方法及装置、存储介质
CN114384491B (zh) * 2022-03-24 2022-07-12 北京一径科技有限公司 用于激光雷达的点云处理方法及装置、存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110705543A (zh) * 2019-08-23 2020-01-17 芜湖酷哇机器人产业技术研究院有限公司 基于激光点云进行车道线识别的方法和系统
CN111079801A (zh) * 2019-11-29 2020-04-28 上海有个机器人有限公司 基于点云匹配快速搜索最近点的方法、介质、终端和装置
CN111337941A (zh) * 2020-03-18 2020-06-26 中国科学技术大学 一种基于稀疏激光雷达数据的动态障碍物追踪方法
CN112183393A (zh) * 2020-09-30 2021-01-05 深兰人工智能(深圳)有限公司 激光雷达点云目标检测方法、系统及装置
US20210327128A1 (en) * 2019-01-30 2021-10-21 Baidu Usa Llc A point clouds ghosting effects detection system for autonomous driving vehicles
CN113569958A (zh) * 2021-07-29 2021-10-29 清华大学苏州汽车研究院(吴江) 激光点云数据聚类方法、装置、设备及介质
CN114384492A (zh) * 2022-03-24 2022-04-22 北京一径科技有限公司 用于激光雷达的点云处理方法及装置、存储介质

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102663237B (zh) * 2012-03-21 2014-12-17 武汉大学 基于网格分块与移动最小二乘的点云数据全自动滤波方法
CN104484648B (zh) * 2014-11-27 2017-07-25 浙江工业大学 基于轮廓识别的机器人可变视角障碍物检测方法
CN106204705B (zh) * 2016-07-05 2018-12-07 长安大学 一种基于多线激光雷达的3d点云分割方法
JP6826023B2 (ja) * 2017-12-13 2021-02-03 Kddi株式会社 点群から対象を特定する対象識別装置、プログラム及び方法
JP7402608B2 (ja) * 2018-12-27 2023-12-21 ヤンマーパワーテクノロジー株式会社 作業車両用の衝突回避システム
CN110208819A (zh) * 2019-05-14 2019-09-06 江苏大学 一种多个障碍物三维激光雷达数据的处理方法
CN110879399B (zh) * 2019-10-08 2023-04-11 驭势科技(浙江)有限公司 处理点云数据的方法、装置、车辆、电子设备和介质
CN111310663A (zh) * 2020-02-17 2020-06-19 北京三快在线科技有限公司 道路栅栏检测方法、装置、设备及存储介质
WO2021207954A1 (zh) * 2020-04-15 2021-10-21 华为技术有限公司 一种目标识别的方法和装置
CN111652060B (zh) * 2020-04-27 2024-04-19 宁波吉利汽车研究开发有限公司 一种基于激光雷达的限高预警方法、装置、电子设备及存储介质
CN112162297B (zh) * 2020-09-24 2022-07-19 燕山大学 一种剔除激光点云地图中动态障碍伪迹的方法
CN112816993B (zh) * 2020-12-25 2022-11-08 北京一径科技有限公司 激光雷达点云处理方法和装置
CN113640779B (zh) * 2021-10-15 2022-05-03 北京一径科技有限公司 雷达失效判定方法及装置、存储介质
CN114089377A (zh) * 2021-10-21 2022-02-25 江苏大学 一种基于激光雷达的点云处理和物体识别系统及方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210327128A1 (en) * 2019-01-30 2021-10-21 Baidu Usa Llc A point clouds ghosting effects detection system for autonomous driving vehicles
CN110705543A (zh) * 2019-08-23 2020-01-17 芜湖酷哇机器人产业技术研究院有限公司 基于激光点云进行车道线识别的方法和系统
CN111079801A (zh) * 2019-11-29 2020-04-28 上海有个机器人有限公司 基于点云匹配快速搜索最近点的方法、介质、终端和装置
CN111337941A (zh) * 2020-03-18 2020-06-26 中国科学技术大学 一种基于稀疏激光雷达数据的动态障碍物追踪方法
CN112183393A (zh) * 2020-09-30 2021-01-05 深兰人工智能(深圳)有限公司 激光雷达点云目标检测方法、系统及装置
CN113569958A (zh) * 2021-07-29 2021-10-29 清华大学苏州汽车研究院(吴江) 激光点云数据聚类方法、装置、设备及介质
CN114384492A (zh) * 2022-03-24 2022-04-22 北京一径科技有限公司 用于激光雷达的点云处理方法及装置、存储介质

Also Published As

Publication number Publication date
CN114384492A (zh) 2022-04-22
CN114384492B (zh) 2022-06-24

Similar Documents

Publication Publication Date Title
WO2023179718A1 (zh) 用于激光雷达的点云处理方法、装置、设备及存储介质
WO2023179717A1 (zh) 用于激光雷达的点云处理方法、装置、设备及存储介质
WO2020134082A1 (zh) 一种路径规划方法、装置和移动设备
WO2020034820A1 (zh) 障碍物或地面识别及飞行控制方法、装置、设备及存储介质
CN108629231B (zh) 障碍物检测方法、装置、设备及存储介质
WO2022142628A1 (zh) 一种点云数据处理方法及装置
US20150317037A1 (en) Image processing device and image processing method
EP4130798A1 (en) Target identification method and device
CN113761999B (zh) 一种目标检测方法、装置、电子设备和存储介质
US20230386076A1 (en) Target detection method, storage medium, electronic device, and vehicle
CN112904369B (zh) 机器人重定位方法、装置、机器人和计算机可读存储介质
WO2023155387A1 (zh) 多传感器目标检测方法、装置、电子设备以及存储介质
CN114494075A (zh) 基于三维点云的障碍物识别方法、电子设备和存储介质
WO2021056516A1 (zh) 目标检测方法、设备及可移动平台
WO2022206517A1 (zh) 一种目标检测方法及装置
CN112560800A (zh) 路沿检测方法、装置及存储介质
WO2023169337A1 (zh) 目标物速度的估计方法及装置、车辆和存储介质
US20220277595A1 (en) Hand gesture detection method and apparatus, and computer storage medium
CN113325388A (zh) 一种自动驾驶中激光雷达泛光噪点的过滤方法和装置
US20210003698A1 (en) Radar image processing device, radar image processing method, and storage medium
US11734850B2 (en) On-floor obstacle detection method and mobile machine using the same
WO2024007869A1 (zh) 水平度校验方法、终端和计算机可读存储介质
JP7451628B2 (ja) 車両姿勢推定方法、装置、電子デバイス、記憶媒体、及びプログラム
KR101392222B1 (ko) 표적 윤곽을 추출하는 레이저 레이더, 그것의 표적 윤곽 추출 방법
CN115511944A (zh) 基于单相机的尺寸估计方法、装置、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23773964

Country of ref document: EP

Kind code of ref document: A1