CN114812435B - Vehicle three-dimensional point cloud data filtering method - Google Patents


Info

Publication number: CN114812435B (granted); other version: CN114812435A
Application number: CN202210473084.5A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Active
Inventors: 范立, 徐锦锦, 李启达
Applicant and assignee: Suzhou Seecar Information System Co ltd
Prior art keywords: vehicle, point cloud, height, dimensional point, midpoint

Classifications

    • G01B 11/2433: measuring contours or curvatures by optical means, outlines by shadow casting
    • G01B 11/043: optically measuring length of objects while moving
    • G01B 11/046: optically measuring width of objects while moving
    • G01B 11/0608: height gauges
    • G01B 11/0691: optically measuring thickness of objects while moving
    • G06F 18/23: pattern recognition, clustering techniques
    • G06T 5/70: image enhancement, denoising and smoothing
    • G06T 7/13: image analysis, edge detection
    • G06T 7/73: determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10028: range image, depth image, 3D point clouds
    • G06T 2207/20024: filtering details
    • G06T 2207/20081: training, learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a vehicle three-dimensional point cloud data filtering method that uses geometric-feature filtering to remove noise point clouds such as water-spray noise, exhaust noise, dust noise and laser-reflection noise, yielding accurate vehicle three-dimensional point cloud data. The method comprises the following steps: (a) obtaining three-dimensional point cloud data of the field of view through a laser radar installed at the roadside; (b) removing the environment point cloud from the three-dimensional point cloud data of step (a) and initially extracting vehicle position and contour information to obtain the vehicle's original three-dimensional point cloud; (c) projecting all points of the vehicle's original three-dimensional point cloud onto several plane views and filtering the noise of each plane view with an interval filtering method to obtain accurate vehicle contour information.

Description

Vehicle three-dimensional point cloud data filtering method
Technical Field
The application relates to the technical field of point cloud data processing, in particular to a vehicle three-dimensional point cloud data filtering method.
Background
Laser radar data of a vehicle are stored as point clouds: a vehicle point cloud consists of a number of points in a specific space that carry spatial relationships. Processing vehicle three-dimensional point cloud data requires filtering out the various noise point clouds in the data, in addition to keeping the data invariant to point permutations and certain spatial transformations.
In the prior art, laser radar is widely used for roadside vehicle detection; its imaging is unaffected by illumination, day or night, and under normal conditions the vehicle point cloud image contains very little noise and is clear. However, when water accumulates on the road due to weather, a large amount of spray is thrown up as a vehicle passes, which severely degrades contour detection and greatly reduces the accuracy of vehicle contour measurement.
In view of the above, the application provides a vehicle three-dimensional point cloud data filtering method that filters the various noise point clouds well and obtains accurate vehicle three-dimensional point cloud data.
Disclosure of Invention
The application aims to provide a vehicle three-dimensional point cloud data filtering method which, using a geometric-feature filtering approach, filters out various noise point clouds such as water-spray noise, exhaust noise, dust noise and laser-reflection noise, obtaining accurate vehicle three-dimensional point cloud data.
A vehicle three-dimensional point cloud data filtering method comprises the following steps:
(a) obtaining three-dimensional point cloud data of the field of view through a laser radar installed at the roadside;
(b) removing the environment point cloud from the three-dimensional point cloud data of step (a) and initially extracting vehicle position and contour information to obtain the vehicle's original three-dimensional point cloud;
(c) projecting all points of the vehicle's original three-dimensional point cloud onto several plane views and filtering the noise of each plane view with an interval filtering method to obtain accurate vehicle contour information.
In some embodiments, in step (a), the laser radar is a multi-beam laser radar whose beams cover the vertical direction, so that the height information of the vehicle can be identified.
Furthermore, the multi-beam laser radar is arranged at the roadside or within the road at an installation height greater than the vehicle height, and can scan the head or tail, the side and the roof of the vehicle.
Furthermore, the multi-beam laser radar is installed on a roadside device or a gantry, at an installation angle that allows it to scan the front or rear of a detected vehicle.
In some embodiments, in step (b), the environment point cloud is removed by three-dimensional point cloud background removal, and the vehicle position and contour information is initially extracted.
Further, the three-dimensional point cloud background removal comprises dynamic point cloud extraction and ground background point removal; the vehicle position and contour information is initially extracted by clustering or by 3D deep learning detection.
Further, dynamic point cloud extraction draws a 3D cuboid frame whose bottom face lies at the ground boundary, the other faces delimiting the point cloud extraction range; each point is tested for membership in the cuboid frame, and all interior points form the dynamic point cloud (ground vehicles, pedestrians and the like) with the background point cloud removed. Ground background point removal takes all points in the region of interest, fits a ground surface equation, and substitutes every point of interest into the ground surface function; the function value tells the relationship between a point and the ground: if the value is greater than 0 the point lies above the ground and is kept, otherwise it is deleted, and the points that remain form the dynamic point cloud of vehicles, pedestrians and the like on the ground.
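The ground background point removal described above can be sketched with a least-squares plane fit. This is a minimal illustration, not the patent's implementation: the function name, the plane model z = a*x + b*y + c solved with `np.linalg.lstsq`, the synthetic cloud and the 0.5 m height threshold are all assumptions.

```python
import numpy as np

def remove_ground(points, height_thresh=0.5):
    """Fit a ground plane z = a*x + b*y + c by least squares over the
    region of interest, then keep only points whose signed height above
    the plane exceeds height_thresh (the dynamic point cloud)."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residual = points[:, 2] - A @ coeffs  # signed height above fitted plane
    return points[residual > height_thresh]

# Toy scene: flat ground at z = 0 plus a small "vehicle" slab at z = 1.5
rng = np.random.default_rng(0)
ground = np.c_[rng.uniform(0, 10, (100, 2)), np.zeros(100)]
vehicle = np.c_[rng.uniform(4, 6, (20, 2)), np.full(20, 1.5)]
dynamic = remove_ground(np.vstack([ground, vehicle]))
print(len(dynamic))  # → 20: only the vehicle points survive
```

In practice the fit would be run over points already known to be near the ground; fitting over the whole region of interest, as here, works when ground points dominate the cloud.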
Further, the clustering approach analyses a separation measure between the points of the original point cloud (e.g. the Euclidean distance), merges points closer than a threshold (e.g. 1 metre) into one category, and classifies each cluster as vehicle or pedestrian from the length, width, height and position of its point cloud; 3D deep learning detection requires training on a large amount of manually annotated data, after which the learned weight parameters can directly identify the positions and sizes of vehicles and pedestrians on the ground.
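The clustering approach can be sketched as a single-linkage flood fill over Euclidean distances. The brute-force O(n²) neighbour search and the toy coordinates are assumptions (a real implementation would use a k-d tree); the 1 metre merge threshold follows the text.

```python
import numpy as np
from collections import deque

def euclidean_cluster(points, dist_thresh=1.0):
    """Group points whose Euclidean distance is below dist_thresh into
    clusters (single-linkage, breadth-first flood fill)."""
    n = len(points)
    labels = np.full(n, -1)
    current = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        queue = deque([seed])
        while queue:
            i = queue.popleft()
            dists = np.linalg.norm(points - points[i], axis=1)
            for j in np.flatnonzero((dists < dist_thresh) & (labels == -1)):
                labels[j] = current
                queue.append(j)
        current += 1
    return labels

# Two well-separated groups of points
pts = np.array([[0, 0, 0], [0.5, 0, 0], [1.0, 0.2, 0],
                [8, 8, 0], [8.4, 8.1, 0]], float)
labels = euclidean_cluster(pts, dist_thresh=1.0)
print(labels)  # → [0 0 0 1 1]
```

Each resulting cluster would then be classified as vehicle or pedestrian from its bounding-box length, width, height and position, as the text describes.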
Further, the vehicle's original three-dimensional point cloud is affected by noise and cannot clearly reflect the vehicle contour, where the noise includes water-spray noise, exhaust noise, dust noise and laser-reflection noise.
In some embodiments, in step (c), the plane views are a vehicle front view and a vehicle side view, and statistics are collected over both views with a fixed-resolution interval filtering method; the statistics are the total number of points and the maximum point cloud height in each interval.
Further preferably, every A cm forms a vertical interval, and the total number of points and the maximum point cloud height in each vertical interval are counted; the maximum point cloud height is taken as the average of the top 5%-30% of point heights, A is 5-40 cm, and the vertical interval width (A) of the vehicle front view is smaller than that of the vehicle side view.
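The fixed-resolution interval statistics can be sketched as follows. The helper name `interval_stats`, the bin edges built with `np.arange` and the toy profile are assumptions; the two statistics per vertical interval (total point count, and a robust "maximum height" averaged over a top fraction of heights) follow the text.

```python
import numpy as np

def interval_stats(coords, heights, bin_width=0.1, top_frac=0.2):
    """For each vertical interval of width bin_width (metres) along
    `coords`, return the point count and a robust 'maximum height'
    taken as the mean of the top `top_frac` fraction of heights."""
    edges = np.arange(coords.min(), coords.max() + bin_width, bin_width)
    idx = np.digitize(coords, edges) - 1
    counts, max_heights = [], []
    for b in range(len(edges) - 1):
        h = np.sort(heights[idx == b])[::-1]  # heights in bin, descending
        counts.append(len(h))
        k = max(1, int(np.ceil(top_frac * len(h)))) if len(h) else 0
        max_heights.append(h[:k].mean() if k else 0.0)
    return np.array(counts), np.array(max_heights)

x = np.array([0.0, 0.1, 0.15, 0.25, 0.3, 0.45])
z = np.array([1.0, 2.0, 3.0, 2.0, 2.0, 0.5])
counts, max_h = interval_stats(x, z, bin_width=0.2)
print(counts.tolist(), max_h.tolist())  # → [3, 2, 1] [3.0, 2.0, 0.5]
```

With `top_frac` at 20% this matches Example 1, which averages the top 20% of heights per interval so that a single outlier point cannot inflate an interval's height.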
Further, in the vehicle side view, the search proceeds backwards from the vehicle head; the vehicle's true tail interval (denoted tmp_index) is found from the statistics, and all point clouds behind the true tail interval are deleted. The true tail interval must satisfy: (1) the point cloud height is below the set height threshold and the point count is below the set count threshold; (2) it lies where the per-interval point count drops suddenly; (3) searching further backwards, no point cloud exceeds the height threshold and the count threshold again.
Further preferably, the height threshold is 0.5-0.65 times the maximum vehicle height; the count threshold is either the average point count of the bottom 20% of intervals when the intervals whose point counts exceed 10 are sorted in descending order, or the count 40, whichever of the two is larger. The junction between a truck or trailer cab and its cargo box satisfies conditions (1) and (2) but not condition (3).
Further preferably, to reduce the measurement error of a single interval, several intervals are judged jointly: if the several intervals following tmp_index also satisfy condition (1), tmp_index is confirmed as the true tail interval, and the point clouds of all intervals behind tmp_index can be filtered out directly.
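The tail search can be sketched directly from the three conditions. The function name `find_tail_index`, the 50% drop ratio used for the "sudden drop" test, the `confirm` window implementing the multi-interval judgment, and the toy statistics are all assumptions.

```python
def find_tail_index(counts, heights, height_thresh, count_thresh, confirm=3):
    """Search backwards from the head (index 0) for the first interval
    that (1) has height below height_thresh and count below count_thresh,
    (2) follows a sudden drop in the per-interval count, and (3) is not
    followed by an interval exceeding the thresholds again (checked here
    by requiring `confirm` subsequent intervals to also satisfy (1))."""
    for i in range(1, len(counts)):
        low = counts[i] < count_thresh and heights[i] < height_thresh
        drop = counts[i] < 0.5 * counts[i - 1]  # assumed "sudden drop" ratio
        tail = all(counts[j] < count_thresh and heights[j] < height_thresh
                   for j in range(i + 1, min(i + 1 + confirm, len(counts))))
        if low and drop and tail:
            return i           # tmp_index: first spray-only interval
    return len(counts)         # no noise tail found: keep everything

# Toy side-view statistics: solid vehicle body, then trailing water spray
counts = [80, 90, 85, 88, 12, 8, 5, 3]
heights = [2.5, 2.6, 2.6, 2.5, 0.9, 0.7, 0.5, 0.4]
print(find_tail_index(counts, heights, height_thresh=1.5, count_thresh=40))  # → 4
```

The `confirm` window is what lets a truck cab-box junction pass through: the junction satisfies (1) and (2) in isolation, but the high-count box intervals behind it fail the window, so the search continues.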
Further, in the vehicle front view, the centre of the vehicle point cloud in the width direction is taken as the geometric midpoint (denoted geometric_mid_idx), and the true midpoint (denoted true_mid_idx) is computed from a set deviation threshold and a set fluctuation threshold; searching left and right from the true midpoint, if the maximum point cloud height in a vertical interval is below the set front-view threshold height, all point clouds from that interval outwards to the left or right are filtered out.
Further preferably, the side not illuminated by the laser radar carries almost no noise; displacing the geometric midpoint by the set deviation threshold towards the other side of the vehicle (the side illuminated by the laser radar) gives the undisturbed midpoint (denoted no_noise_mid_idx), and the true midpoint is computed according to whether the distance between the geometric midpoint and the undisturbed midpoint is smaller than the fluctuation threshold.
Further preferably, the deviation threshold is 5-15 vertical intervals of the vehicle front view and the fluctuation threshold is 0.1-0.5 m; if the distance between the geometric midpoint and the undisturbed midpoint is smaller than the fluctuation threshold, the true midpoint is the average of the geometric midpoint and the undisturbed midpoint, and if it is greater, the true midpoint is the undisturbed midpoint.
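The decision rule between the two midpoints can be written down directly from the thresholds above. How the undisturbed midpoint is derived from the noise-free side is outside this sketch, so `no_noise_mid` is taken as an input; the function name and the example values (in metres) are assumptions.

```python
def true_midpoint(geometric_mid, no_noise_mid, fluct_thresh=0.5):
    """Fuse the geometric midpoint with the undisturbed midpoint: if
    they agree to within the fluctuation threshold, average them;
    otherwise spray noise has shifted the geometric midpoint too far,
    so trust the undisturbed midpoint alone."""
    if abs(geometric_mid - no_noise_mid) < fluct_thresh:
        return 0.5 * (geometric_mid + no_noise_mid)
    return no_noise_mid

print(true_midpoint(1.25, 1.0))  # → 1.125 (small deviation: average)
print(true_midpoint(2.0, 1.0))   # → 1.0 (heavy spray: undisturbed midpoint)
```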
Further preferably, the front-view threshold height is whichever is larger of 0.5-0.65 times the maximum vehicle height and 0.8 m.
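The outward search-and-filter over the front view's vertical intervals might look as follows; the helper `filter_front_view`, its returning of the kept index range rather than deleting points, and the toy height profile are assumptions.

```python
import numpy as np

def filter_front_view(bin_heights, mid_idx, thresh):
    """From the true midpoint, search outwards left and right; once a
    vertical interval's maximum point cloud height falls below the
    front-view threshold, that interval and everything beyond it on
    that side is filtered out.  Returns the kept index range."""
    left = mid_idx
    while left - 1 >= 0 and bin_heights[left - 1] >= thresh:
        left -= 1
    right = mid_idx
    while right + 1 < len(bin_heights) and bin_heights[right + 1] >= thresh:
        right += 1
    return left, right

# Toy front-view profile: low spray bins flanking the vehicle body
heights = np.array([0.2, 0.4, 1.9, 2.0, 2.1, 2.0, 1.8, 0.6, 0.3])
print(filter_front_view(heights, mid_idx=4, thresh=1.0))  # → (2, 6)
```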
In some embodiments, after the noise filtering of step (c), the various noises around the vehicle have been filtered out and accurate vehicle contour information is obtained; the vehicle length, width and height are then measured from the contour, facilitating subsequent analysis of the real vehicle point cloud.
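After filtering, measuring length, width and height reduces to the extents of the remaining cloud. This sketch assumes the cloud's axes are already aligned with the vehicle (x along the length), which holds once the points have been sorted along the vehicle's length direction as the method describes.

```python
import numpy as np

def vehicle_dimensions(points):
    """Length, width and height of the filtered vehicle point cloud as
    the extents of its axis-aligned bounding box."""
    extents = points.max(axis=0) - points.min(axis=0)
    return tuple(float(e) for e in extents)  # (length, width, height)

# Extreme corners of a filtered truck-sized cloud
corners = np.array([[0, 0, 0], [12.5, 0, 0], [12.5, 2.5, 0], [0, 2.5, 3.9]], float)
print(vehicle_dimensions(corners))  # → (12.5, 2.5, 3.9)
```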
Drawings
The foregoing and other features of the present disclosure will be more fully described when considered in conjunction with the following drawings. It is appreciated that these drawings depict only several embodiments of the present disclosure and are therefore not to be considered limiting of its scope. The present disclosure will be described more specifically and in detail by using the accompanying drawings.
Fig. 1 is a diagram showing the installation position and scanning range of a multi-beam lidar according to embodiment 1 of the present application.
Fig. 2 is an acquired original three-dimensional point cloud information map of a vehicle according to embodiment 1 of the present application.
Fig. 3 is a side view of the vehicle after the original three-dimensional point cloud information of the vehicle according to embodiment 1 of the present application is projected.
Fig. 4 is a front view of the vehicle after the original three-dimensional point cloud information of the vehicle according to embodiment 1 of the present application is projected.
Fig. 5a is the total number of point clouds in the vertical section of the side view of the vehicle of embodiment 1 of the present application.
Fig. 5b is the maximum point cloud height in the vertical section of the side view of the vehicle of embodiment 1 of the application.
Fig. 6 is a side view of a filtered point cloud projection vehicle of embodiment 1 of the present application.
Fig. 7 is a front view of a filtered point cloud projection vehicle according to embodiment 1 of the present application.
Fig. 8 is a filtered three-dimensional point cloud image according to embodiment 1 of the present application.
Detailed Description
The following examples are described to aid understanding of the application; they are not, and should not be construed as, limiting its scope in any way.
In the following description, components may be described as separate functional units (which may include sub-units); those skilled in the art will recognize that various components, or portions thereof, may be divided into separate components or integrated together (including within a single system or component).
Meanwhile, connections between components or systems are not intended to be limited to direct connections; rather, data passed between these components may be modified, reformatted or otherwise changed by intermediate components, and additional or fewer connections may be used. It should also be noted that the terms "coupled", "connected" or "input" are to be construed as including direct connection, and indirect connection or fixation through one or more intermediaries.
Example 1:
A laser radar is installed at the roadside to obtain three-dimensional point cloud data over its field of view. The laser radar is a multi-beam laser radar whose beams cover the vertical direction, so that the height information of the vehicle can be identified. It is mounted on a roadside device at an installation height greater than the vehicle height and at an installation angle that lets it scan the front of the vehicle, covering the vehicle head, side and roof; the installation position and scanning range of the multi-beam laser radar are shown in Fig. 1.
The raw data obtained by the multi-beam laser radar are the three-dimensional point cloud of the laser radar's field of view. Since only the vehicle information is of interest, the environment point cloud is removed with the three-dimensional point cloud background-removal technique, and the vehicle position and contour information is initially extracted by clustering. Background removal comprises dynamic point cloud extraction and ground background point removal: dynamic point cloud extraction draws a 3D cuboid frame whose bottom face lies at the ground boundary, the other faces delimiting the extraction range; each point is tested for membership in the cuboid frame, and all interior points form the dynamic point cloud (ground vehicles, pedestrians and the like) with the background point cloud removed. The clustering step analyses the separation between the points of the original point cloud (the Euclidean distance is used here), merges points closer than the threshold (1 metre) into one category, and classifies each cluster as vehicle or pedestrian from the length, width, height and position of its point cloud. The vehicle's original three-dimensional point cloud is shown in Fig. 2: the scene of Fig. 2 has standing water on the road surface, so spray is thrown up as the vehicle runs, and Fig. 2 also shows obvious outlier noise; under these various noises the vehicle contour cannot be clearly made out. Without standing water or rain there is no spray noise, but other noise is still present.
After the vehicle's original three-dimensional point cloud is obtained, the points are sorted along the vehicle's length direction; the points with the smallest coordinate in that direction belong to the vehicle front, which distinguishes the front of the vehicle from its side. All points are projected onto two planes, giving a vehicle side view and a vehicle front view: the side view of the left vehicle of Fig. 2 is shown in Fig. 3, and its front view in Fig. 4. Statistics are collected over the front and side views with the fixed-resolution interval filtering method, namely the total number of points and the maximum point cloud height per interval. Every A cm forms a vertical interval; the maximum point cloud height is taken as the average of the top 20% of point heights; the vertical interval width (A) is 10 cm for the front view and 20 cm for the side view. Fig. 5a shows the total number of points in the vertical intervals of the side view of Fig. 3, and Fig. 5b shows the maximum point cloud height in those intervals.
In the vehicle side view, the search proceeds backwards from the vehicle head; since most vehicles are longer than 3 metres, the search starts directly from the 3-metre coordinate. The true tail interval of the vehicle (denoted tmp_index) is found from the statistics, and all point clouds behind the true tail interval are deleted. The true tail interval must satisfy: (1) the point cloud height is below the set height threshold, here 0.6 times the maximum vehicle height, and the point count is below the set count threshold, here the average point count of the bottom 20% of intervals when the intervals whose counts exceed 10 are sorted in descending order; (2) it lies where the per-interval point count drops suddenly; (3) searching further backwards, no interval exceeds the height and count thresholds again. The junction between a truck or trailer cab and its cargo box satisfies conditions (1) and (2) but not condition (3); there the backward search continues until an interval satisfying all three conditions appears and becomes the true tail interval. To reduce the measurement error of a single interval, several intervals are judged jointly: if the several intervals following tmp_index also satisfy condition (1), tmp_index is confirmed as the true tail interval and all point clouds behind it are filtered out directly. The side view of Fig. 3 after filtering is shown in Fig. 6; as Fig. 6 shows, the water-spray noise at the vehicle tail has been almost completely removed.
In the vehicle front view, the centre of the vehicle point cloud in the width direction is taken as the geometric midpoint (denoted geometric_mid_idx). The side not illuminated by the laser radar carries almost no noise; displacing by the set deviation threshold towards the other side of the vehicle (the side illuminated by the laser radar) gives the undisturbed midpoint (denoted no_noise_mid_idx). The deviation threshold is 10 vertical intervals of the front view (i.e. 1 m) and the fluctuation threshold is 0.5 m (i.e. 5 interval positions). If the distance between the geometric midpoint and the undisturbed midpoint is smaller than the fluctuation threshold, the true midpoint is the average of the two; if it is greater, the influence of spray and similar noise is too strong and the true midpoint is the undisturbed midpoint. Searching left and right from the true midpoint, if the maximum point cloud height in a vertical interval falls below the set front-view threshold height, here 0.5 times the maximum vehicle height, all point clouds from that interval outwards are filtered out directly. Since most vehicles are wider than 1.2 metres, the two-sided search and filtering can start 6 interval positions to the left and right of the true midpoint. The result is shown in Fig. 7: the water-spray point cloud on the left, near the radar illumination direction, is filtered out, and a small amount of outlier noise on the right is filtered out as well.
After the vehicle's original three-dimensional point cloud passes through the above steps, the filtered three-dimensional point cloud of Fig. 2 is shown in Fig. 8: as Fig. 8 shows, the various surrounding noises have been filtered out, the contour measurement (length, width, height) is now more accurate, and subsequent analysis of the real vehicle point cloud is facilitated.
While the application has been disclosed in terms of various aspects and embodiments, other aspects and embodiments will be apparent to those skilled in the art in view of this disclosure, and many changes and modifications can be made without departing from the spirit of the application. The various aspects and embodiments of the present application are disclosed for illustrative purposes only and are not intended to limit the application, the true scope of which is set forth in the following claims.

Claims (9)

1. The vehicle three-dimensional point cloud data filtering method is characterized by comprising the following steps of:
(a) obtaining three-dimensional point cloud data of the field of view through a laser radar installed at the roadside;
(b) removing the environment point cloud from the three-dimensional point cloud data of step (a) and initially extracting vehicle position and contour information to obtain the vehicle's original three-dimensional point cloud;
(c) projecting all points of the vehicle's original three-dimensional point cloud onto several plane views and filtering the noise of each plane view with an interval filtering method to obtain accurate vehicle contour information, wherein the plane views are a vehicle front view and a vehicle side view, statistics being collected over both views with a fixed-resolution interval filtering method, the statistics being the total number of points and the maximum point cloud height in each interval.
2. The method for filtering three-dimensional point cloud data of a vehicle according to claim 1, wherein in step (a) the laser radar is a multi-beam laser radar whose beams cover the vertical direction so that the height information of the vehicle can be identified; the multi-beam laser radar is arranged at the roadside or within the road at an installation height greater than the vehicle height, and can scan the head or tail, the side and the roof of the vehicle.
3. The vehicle three-dimensional point cloud data filtering method according to claim 1, wherein in step (b) the environmental point clouds are removed by a three-dimensional point cloud background-removal technique, and the vehicle position and contour information are initially extracted; the method of initially extracting the vehicle position and contour information comprises: a clustering method or a 3D deep-learning detection method; the original three-dimensional point cloud information of the vehicle is affected by noise and cannot clearly reflect the vehicle contour, the noise comprising: water-spray noise, exhaust noise, dust noise, and laser-reflection noise.
4. The vehicle three-dimensional point cloud data filtering method according to claim 1, wherein every A cm forms one vertical interval, and the total number of point-cloud points and the maximum point-cloud height are counted in each vertical interval; the maximum point-cloud height is taken as the average of the highest 5%-30% of the point heights in the interval; A is 5-40 cm; and the vertical-interval width of the front view of the vehicle is smaller than the vertical-interval width of the side view of the vehicle.
5. The vehicle three-dimensional point cloud data filtering method according to claim 1, wherein in the side view of the vehicle, a search is made backward from the vehicle head, the real tail section of the vehicle is found according to the statistical information, and all point clouds behind the real tail section are deleted; the real tail section must satisfy the following conditions: (1) the maximum point-cloud height of the interval is lower than a set height threshold; (2) the point-cloud quantity of the interval is lower than a set quantity threshold; (3) searching backward past the position where the point-cloud quantity drops suddenly, no interval exceeding the height threshold and the quantity threshold appears again.
6. The vehicle three-dimensional point cloud data filtering method according to claim 5, wherein the height threshold is 0.5-0.65 times the maximum height of the vehicle; the quantity threshold is obtained by sorting, in descending order of point-cloud quantity, all intervals whose point-cloud quantity exceeds 10, taking the average point-cloud quantity of the last 20% of these intervals, or taking the quantity 40, whichever of the two is larger; the junction between the head and the carriage of a truck or trailer satisfies conditions (1) and (2) but does not satisfy condition (3).
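The tail search of claims 5 and 6 can be sketched as follows. The height factor 0.6 (within the claimed 0.5-0.65 range), the bottom-20% averaging, and the floor value 40 follow the claims, while reading condition (3) as "no interval behind the chosen cut again exceeds both thresholds" is one plausible interpretation of the translated text; function names are illustrative.

```python
import numpy as np

def quantity_threshold(counts, min_count=10, tail_frac=0.2, floor=40):
    """Quantity threshold per claim 6: among intervals with more than
    min_count points, sorted in descending order, average the last
    tail_frac fraction, then take the larger of that average and floor."""
    busy = np.sort(counts[counts > min_count])[::-1]
    if busy.size == 0:
        return float(floor)
    k = max(1, int(np.ceil(busy.size * tail_frac)))
    return max(float(busy[-k:].mean()), float(floor))

def find_tail_index(counts, max_heights, vehicle_height, height_factor=0.6):
    """Scan side-view intervals from the head (index 0) backward and
    return the index of the first interval behind the real vehicle tail;
    point clouds in intervals at or beyond this index are deleted."""
    h_thr = height_factor * vehicle_height
    q_thr = quantity_threshold(counts)
    # intervals that still look like solid vehicle body
    solid = (max_heights >= h_thr) & (counts >= q_thr)
    idx = np.where(solid)[0]
    # tail begins right after the last interval exceeding both thresholds,
    # so trailing spray/exhaust intervals (conditions 1-3) are cut off
    return int(idx[-1]) + 1 if idx.size else 0
```

A truck cab-to-carriage gap produces a local dip that satisfies conditions (1) and (2), but solid intervals reappear behind it, so condition (3) fails there and the gap is not mistaken for the tail.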
7. The vehicle three-dimensional point cloud data filtering method according to claim 1, wherein in the front view of the vehicle, the middle position of the vehicle point cloud in the width direction is taken as the geometric midpoint, and the true midpoint is calculated according to a set deviation threshold and a set fluctuation threshold; searching from the true midpoint toward the left and the right, if the maximum point-cloud height within a vertical interval is lower than the set front-view threshold height, all point clouds from that interval outward on that side are filtered out.
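A minimal sketch of the outward front-view search of claim 7, assuming the per-interval maximum heights and the index of the interval containing the true midpoint are already available; the keep-mask formulation and names are illustrative.

```python
import numpy as np

def filter_front_view(interval_heights, mid_idx, threshold_height):
    """Walk outward from the true-midpoint interval to the left and to
    the right; once an interval's maximum point-cloud height falls below
    the front-view threshold height, that interval and everything beyond
    it on that side is discarded.  Returns a boolean keep-mask."""
    keep = np.zeros(interval_heights.size, dtype=bool)
    # walk right from the midpoint interval
    for i in range(mid_idx, interval_heights.size):
        if interval_heights[i] < threshold_height:
            break
        keep[i] = True
    # walk left from the midpoint interval
    for i in range(mid_idx - 1, -1, -1):
        if interval_heights[i] < threshold_height:
            break
        keep[i] = True
    return keep
```

Starting at the midpoint rather than at the edges means low side-spray intervals outside the first below-threshold interval are dropped even if a stray tall noise return lies further out.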
8. The vehicle three-dimensional point cloud data filtering method according to claim 7, wherein the side of the vehicle not illuminated by the laser radar carries almost no noise; the geometric midpoint displaced by the set deviation threshold toward the side of the vehicle illuminated by the laser radar is taken as the undisturbed midpoint, and the true midpoint is calculated according to whether the distance between the geometric midpoint and the undisturbed midpoint is smaller than the fluctuation threshold; the deviation threshold is 5-15 vertical intervals of the front view of the vehicle and the fluctuation threshold is 0.1-0.5 m; if the distance between the geometric midpoint and the undisturbed midpoint is smaller than the fluctuation threshold, the true midpoint is the average of the geometric midpoint and the undisturbed midpoint, and if the distance is greater than the fluctuation threshold, the true midpoint is the undisturbed midpoint.
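The midpoint-selection rule at the end of claim 8 reduces to a small function. The default fluctuation threshold of 0.3 m is an assumption within the claimed 0.1-0.5 m range, and computing the undisturbed midpoint itself (the geometric midpoint displaced by the deviation threshold, per the claim) is left to the caller.

```python
def true_midpoint(geometric_mid, undisturbed_mid, fluctuation=0.3):
    """Midpoint selection per claim 8: if the geometric and undisturbed
    midpoints agree to within the fluctuation threshold, average them;
    otherwise the noise-biased geometric midpoint is distrusted and the
    undisturbed midpoint is used as the true midpoint."""
    if abs(geometric_mid - undisturbed_mid) < fluctuation:
        return 0.5 * (geometric_mid + undisturbed_mid)
    return undisturbed_mid
```

The rationale is that spray and exhaust widen the point cloud on the illuminated side only, so a large disagreement between the two midpoints signals that the geometric midpoint has been dragged off-center by noise.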
9. The vehicle three-dimensional point cloud data filtering method according to claim 8, wherein the front-view threshold height is 0.5-0.65 times the maximum height of the vehicle or 0.8 m, whichever of the two is larger.
CN202210473084.5A 2022-04-29 2022-04-29 Vehicle three-dimensional point cloud data filtering method Active CN114812435B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210473084.5A CN114812435B (en) 2022-04-29 2022-04-29 Vehicle three-dimensional point cloud data filtering method

Publications (2)

Publication Number Publication Date
CN114812435A CN114812435A (en) 2022-07-29
CN114812435B true CN114812435B (en) 2023-10-20

Family

ID=82510535

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210473084.5A Active CN114812435B (en) 2022-04-29 2022-04-29 Vehicle three-dimensional point cloud data filtering method

Country Status (1)

Country Link
CN (1) CN114812435B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115876098B (en) * 2022-12-12 2023-10-24 苏州思卡信息系统有限公司 Vehicle size measurement method of multi-beam laser radar

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226833A (en) * 2013-05-08 2013-07-31 清华大学 Point cloud data partitioning method based on three-dimensional laser radar
CN111291775A (en) * 2018-12-07 2020-06-16 北京万集科技股份有限公司 Vehicle positioning method, device and system
CN111540201A (en) * 2020-04-23 2020-08-14 山东大学 Vehicle queuing length real-time estimation method and system based on roadside laser radar
CN111580131A (en) * 2020-04-08 2020-08-25 西安邮电大学 Method for identifying vehicles on expressway by three-dimensional laser radar intelligent vehicle
CN112199991A (en) * 2020-08-27 2021-01-08 广州中国科学院软件应用技术研究所 Simulation point cloud filtering method and system applied to vehicle-road cooperative roadside sensing
CN112432647A (en) * 2020-11-09 2021-03-02 深圳市汇川技术股份有限公司 Positioning method, device and system of carriage and computer readable storage medium
CN112859088A (en) * 2021-01-04 2021-05-28 北京科技大学 Vehicle position information acquisition method and system based on three-dimensional radar
WO2021142996A1 (en) * 2020-01-13 2021-07-22 五邑大学 Point cloud denoising method, system, and device employing image segmentation, and storage medium
CN113156455A (en) * 2021-03-16 2021-07-23 武汉理工大学 Vehicle positioning system, method, device and medium based on roadside multi-laser radar perception
CN113191459A (en) * 2021-05-27 2021-07-30 山东高速建设管理集团有限公司 Road-side laser radar-based in-transit target classification method
CN113514847A (en) * 2020-04-10 2021-10-19 深圳市镭神智能系统有限公司 Vehicle outer contour dimension detection method and system and storage medium
CN114155720A (en) * 2021-11-29 2022-03-08 上海交通大学 Vehicle detection and track prediction method for roadside laser radar
CN114399744A (en) * 2021-12-24 2022-04-26 深圳市镭神智能系统有限公司 Vehicle type recognition method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114812435A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
CN110487562B (en) Driveway keeping capacity detection system and method for unmanned driving
US20210350149A1 (en) Lane detection method and apparatus,lane detection device,and movable platform
CN104778444B (en) The appearance features analysis method of vehicle image under road scene
CN110705543A (en) Method and system for recognizing lane lines based on laser point cloud
CN106679633B (en) A kind of vehicle-mounted distance-finding system base and method
CN110415544B (en) Disaster weather early warning method and automobile AR-HUD system
CN111880196A (en) Unmanned mine car anti-interference method, system and computer equipment
CN110659552B (en) Tramcar obstacle detection and alarm method
CN114812435B (en) Vehicle three-dimensional point cloud data filtering method
CN112597839B (en) Road boundary detection method based on vehicle-mounted millimeter wave radar
CN114879160B (en) Rail foreign matter invasion real-time monitoring method and system based on three-dimensional point cloud data
JP2018055597A (en) Vehicle type discrimination device and vehicle type discrimination method
CN113034378A (en) Method for distinguishing electric automobile from fuel automobile
CN114863064A (en) Method and system for constructing automobile contour curved surface model
Yamazaki et al. Vehicle extraction and speed detection from digital aerial images
CN114155720B (en) Vehicle detection and track prediction method for roadside laser radar
CN114235679B (en) Pavement adhesion coefficient estimation method and system based on laser radar
Vaibhav et al. Real-time fog visibility range estimation for autonomous driving applications
CN113962301B (en) Multi-source input signal fused pavement quality detection method and system
CN114821499A (en) Object classification method
CN116165635B (en) Denoising method for photon cloud data of different beams under daytime condition of multistage filtering algorithm
CN115656962B (en) Method for identifying height-limited object based on millimeter wave radar
CN116794650A (en) Millimeter wave radar and camera data fusion target detection method and device
CN116587978A (en) Collision early warning method and system based on vehicle-mounted display screen
CN115116034A (en) Method, device and system for detecting pedestrians at night

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant