CN114994635A - Intelligent driving travelable area detection method and device - Google Patents

Intelligent driving travelable area detection method and device

Info

Publication number
CN114994635A
CN114994635A
Authority
CN
China
Prior art keywords
point cloud
cloud data
grid
vehicle
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210539241.8A
Other languages
Chinese (zh)
Inventor
黄文锦 (Huang Wenjin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Hanrun Automotive Electronics Co., Ltd.
Original Assignee
Shanghai Hanrun Automotive Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Hanrun Automotive Electronics Co., Ltd.
Priority to CN202210539241.8A
Publication of CN114994635A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00, of systems according to group G01S 17/00
    • G01S 7/4802: Details of systems according to group G01S 17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/08: Systems determining position data of a target for measuring distance only
    • G01S 17/10: Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application provides a method and a device for detecting the travelable area of an intelligently driven vehicle. The method comprises the following steps: acquiring point cloud data of a plurality of laser radars under the same timestamp; fusing the multi-frame point cloud data to obtain a fused point cloud set; determining non-ground point cloud data from the fused point cloud set according to the z-direction coordinate values of the fused point cloud data contained in it, so as to obtain a non-ground point cloud set consisting of the determined non-ground point cloud data; mapping the non-ground point cloud set to a pre-established target polar coordinate system, and determining, according to the non-ground point cloud set in the target polar coordinate system, the closest distance between the vehicle and the obstacles around it at every angle; and determining the travelable area of the vehicle according to these closest distances. Because a plurality of laser radars collect the point cloud data, the detection distance is greatly increased and the environmental adaptability is improved; and because no assistance from a high-precision map, a structured road or similar conditions is needed, the autonomous detection performance is high.

Description

Intelligent driving travelable area detection method and device
Technical Field
The application relates to the technical field of intelligent driving, and in particular to a method and a device for detecting the travelable area of an intelligently driven vehicle.
Background
Travelable-area detection provides a safety boundary for intelligent driving: it determines the area in which a vehicle can travel from the vehicle's surrounding environment.
Existing detection methods fall mainly into two categories: one uses a camera to detect road-type targets and identify the road in order to determine the travelable area; the other uses a laser radar together with a high-precision map to obtain boundary points and solve for the travelable area.
However, camera-based road-target detection has a limited detection distance, demands good lighting, is generally suited only to structured scenes, and adapts poorly to the environment; detection based on a laser radar and a high-precision map requires the map to be maintained and updated regularly.
Disclosure of Invention
In view of the above, the present application provides a method and an apparatus for detecting the travelable area of an intelligently driven vehicle. They address the poor environmental adaptability of camera-based road-target detection and the need for regular maintenance and updating of high-precision maps in lidar-plus-map detection. The technical scheme is as follows:
an intelligent driving travelable area detection method includes:
acquiring point cloud data of a plurality of laser radars under the same timestamp, wherein the point cloud data are acquired by the laser radars installed on a vehicle, and the point cloud data are all point cloud data transformed to a vehicle coordinate system; the point cloud data comprises multi-frame point cloud data;
performing fusion processing on multi-frame point cloud data to obtain a fusion point cloud set;
determining non-ground point cloud data from the fused point cloud set according to a z-direction coordinate value of fused point cloud data contained in the fused point cloud set so as to obtain a non-ground point cloud set consisting of the determined non-ground point cloud data;
mapping the non-ground point cloud sets to a pre-established target polar coordinate system, and determining the nearest distances between obstacles around the vehicle under the full angle and the vehicle according to the non-ground point cloud sets under the target polar coordinate system, wherein the target polar coordinate system is a target polar coordinate system established by taking the origin of a vehicle coordinate system as a pole;
and determining the travelable area of the vehicle according to the closest distances between the obstacles around the vehicle at all angles and the vehicle.
Optionally, the fusion processing is performed on the multi-frame point cloud data to obtain a fusion point cloud set, including:
splicing multi-frame point cloud data into one-frame point cloud data according to a preset laser radar sequence to obtain a fusion point cloud set;
or, alternatively:
for each point cloud datum contained in the multi-frame point cloud data, calculating the horizontal angle and the vertical angle corresponding to that datum, and filling the datum into the corresponding point cloud grid according to those two angles, so that every point cloud datum contained in the multi-frame point cloud data is filled into its corresponding point cloud grid;
for each point cloud grid, if the point cloud grid contains a plurality of point cloud data, taking their centroid as the fused point cloud datum of that grid, so as to obtain a fused point cloud set consisting of the fused point cloud data of all point cloud grids.
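The structured fusion just described can be sketched as a minimal pure-Python illustration, assuming points are (x, y, z) tuples and using 1°-wide angular cells; the function name and bin sizes are illustrative, not from the patent:

```python
import math
from collections import defaultdict

def structured_fuse(frames, w_bins=360, h_bins=180):
    """Bin points from several frames into an angular grid and replace
    each bin's contents with their centroid (sketch)."""
    grid = defaultdict(list)
    for frame in frames:
        for (x, y, z) in frame:
            # horizontal angle in [0, 360), vertical (polar) angle in [0, 180]
            h = math.degrees(math.atan2(y, x)) % 360.0
            v = math.degrees(math.acos(z / math.sqrt(x * x + y * y + z * z)))
            col = min(int(h / (360.0 / w_bins)), w_bins - 1)
            row = min(int(v / (180.0 / h_bins)), h_bins - 1)
            grid[(row, col)].append((x, y, z))
    fused = []
    for pts in grid.values():
        n = len(pts)
        fused.append((sum(p[0] for p in pts) / n,
                      sum(p[1] for p in pts) / n,
                      sum(p[2] for p in pts) / n))
    return fused
```

Two points from different radars that fall into the same angular cell are thus merged into one fused point, which is what distinguishes this mode from plain concatenation.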
Optionally, determining non-ground point cloud data from the fused point cloud set according to a z-direction coordinate value of the fused point cloud data included in the fused point cloud set, including:
removing from the fused point cloud set the fused point cloud data whose z-direction coordinate values are abnormal, and taking the fused point cloud set with the abnormal data removed as the target fused point cloud set;
determining ground point cloud data from the target fusion point cloud set according to a z-direction coordinate value of fusion point cloud data contained in the target fusion point cloud set so as to obtain a to-be-updated ground point cloud set consisting of the determined ground point cloud data;
determining a fitting plane equation according to the ground point cloud set to be updated;
calculating the distance between each fused point cloud data in the target fused point cloud set and the fitting plane equation, determining the fused point cloud data with the calculated distance being smaller than a preset first distance threshold value as ground point cloud data, and taking the determined ground point cloud data as an updated ground point cloud set;
and adding 1 to the update iteration count and judging whether the incremented count has reached a preset total iteration count; if not, taking the updated ground point cloud set as the ground point cloud set to be updated and returning to the step of determining a fitting plane equation according to the ground point cloud set to be updated; when the update iteration count reaches the total iteration count, taking the point cloud data in the target fused point cloud set other than the updated ground point cloud set as the non-ground point cloud data, wherein the initial update iteration count is 0.
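The iterative refinement loop above might look like the following sketch, where `fit_plane` stands for the plane-fitting step (a hypothetical callable returning coefficients (a, b, c, d) of ax + by + cz + d = 0 with unit normal) and `seed` is the initial ground candidate set; names and defaults are illustrative:

```python
def refine_ground(points, fit_plane, seed, dist_threshold=0.2, iterations=3):
    """Iteratively re-fit the ground plane and re-select ground inliers
    (sketch of the loop described above)."""
    ground = seed
    for _ in range(iterations):
        a, b, c, d = fit_plane(ground)
        # points closer to the fitted plane than the first distance
        # threshold become the updated ground set
        ground = [p for p in points
                  if abs(a * p[0] + b * p[1] + c * p[2] + d) < dist_threshold]
    non_ground = [p for p in points if p not in ground]
    return ground, non_ground
```

With a unit normal, |ax + by + cz + d| is exactly the point-to-plane distance, so no extra normalization is needed inside the loop.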
Optionally, determining the ground point cloud data from the target fusion point cloud set according to a z-direction coordinate value of the fusion point cloud data included in the target fusion point cloud set, including:
selecting a preset number of fused point cloud data from the target fused point cloud set according to a z-direction coordinate value of the fused point cloud data contained in the target fused point cloud set, and calculating a z-direction average value of the preset number of fused point cloud data;
and for each fused point cloud datum in the target fused point cloud set, if the difference between the z-direction coordinate value of the datum and the z-direction average value is smaller than a preset difference threshold, determining the datum as ground point cloud data.
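The seed selection just described, averaging the z values of a preset number of low points and thresholding against that average, could be sketched as follows (function name and threshold values are illustrative):

```python
def select_seed_ground(points, num_seeds=100, diff_threshold=0.3):
    """Pick initial ground candidates: average the z values of the lowest
    points, then keep points whose z lies within a threshold of that
    average (sketch; points are (x, y, z) tuples)."""
    lowest = sorted(points, key=lambda p: p[2])[:num_seeds]
    z_mean = sum(p[2] for p in lowest) / len(lowest)
    return [p for p in points if p[2] - z_mean < diff_threshold]
```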
Optionally, determining a fitting plane equation according to the cloud set of ground points to be updated includes:
calculating an x-direction average value, a y-direction average value and a z-direction average value of a cloud set of ground points to be updated;
determining a covariance matrix according to an x-direction average value, a y-direction average value and a z-direction average value of a ground point cloud set to be updated, and solving a plurality of eigenvalues and eigenvectors of the covariance matrix;
determining a minimum characteristic value from the plurality of characteristic values, and taking a characteristic vector corresponding to the minimum characteristic value as a normal vector of the fitting plane;
and solving a fitting plane equation according to the ground point cloud set to be updated and the normal vector of the fitting plane.
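The covariance and eigenvector computation above corresponds to a standard PCA plane fit; a sketch using NumPy (the library choice is an assumption, the patent does not name one):

```python
import numpy as np

def fit_plane_pca(points):
    """Fit a plane by PCA: the eigenvector of the covariance matrix with
    the smallest eigenvalue is the plane normal (sketch)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)             # x, y and z average values
    cov = np.cov((pts - centroid).T)        # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]                  # smallest-eigenvalue eigenvector
    a, b, c = normal
    d = -normal.dot(centroid)               # plane: ax + by + cz + d = 0
    return a, b, c, d
```

`np.linalg.eigh` returns eigenvalues in ascending order for a symmetric matrix, so column 0 of the eigenvector matrix is the normal of the fitted plane.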
Optionally, the target polar coordinate system includes a plurality of angle grids, and each angle grid includes a plurality of radial grids;
mapping the non-ground point cloud sets to a pre-established target polar coordinate system, and determining the respective closest distances between the obstacles around the vehicle under the full angle and the vehicle according to the non-ground point cloud sets under the target polar coordinate system, wherein the method comprises the following steps:
according to the x-direction coordinate value and the y-direction coordinate value of each piece of non-ground point cloud data in the non-ground point cloud set, solving a horizontal angle and a two-dimensional projection distance corresponding to each piece of non-ground point cloud data;
mapping each non-ground point cloud data in the non-ground point cloud set into a target polar coordinate system according to a horizontal angle and a two-dimensional projection distance corresponding to each non-ground point cloud data in the non-ground point cloud set to obtain a non-ground point cloud set under the target polar coordinate system;
for each angle grid included in the target polar coordinate system, determining a non-empty radial grid closest to a pole from the angle grids, and calculating a distance average value between non-ground point cloud data and the pole included in the non-empty radial grids to be used as a closest distance between an obstacle and a vehicle at an angle corresponding to the angle grid; so as to obtain the nearest distance between the obstacles around the vehicle at all angles and the vehicle respectively.
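The angle-grid/radial-grid lookup above might be sketched as follows, assuming 1° angle bins, a 0.2 m radial resolution and a 100 m range (all illustrative values):

```python
import math

def closest_distances(non_ground, num_angle_bins=360, radial_res=0.2,
                      max_range=100.0):
    """Map non-ground points into angle bins; for each bin, average the
    distances of the points in the nearest non-empty radial cell (sketch)."""
    num_radial = int(max_range / radial_res)
    # bins[angle][radial] holds the 2-D projection distances of its points
    bins = [[[] for _ in range(num_radial)] for _ in range(num_angle_bins)]
    for (x, y, z) in non_ground:
        r = math.hypot(x, y)                  # 2-D projection distance
        if r >= max_range:
            continue
        a = int((math.degrees(math.atan2(y, x)) % 360.0)
                / (360.0 / num_angle_bins))
        bins[a][int(r / radial_res)].append(r)
    nearest = [None] * num_angle_bins         # None: no obstacle at this angle
    for a in range(num_angle_bins):
        for cell in bins[a]:
            if cell:                          # first non-empty radial cell
                nearest[a] = sum(cell) / len(cell)
                break
    return nearest
```

Averaging within the nearest non-empty radial cell, rather than taking the single nearest point, smooths out individual noisy returns, which matches the claim's "distance average value".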
Optionally, determining a travelable region of the vehicle according to the respective closest distances between the obstacles around the vehicle at all angles and the vehicle, includes:
determining the probability that each grid in a preset grid map contains the obstacle according to the non-ground point cloud set and the nearest distance;
and determining the travelable area of the vehicle according to the probability that each grid in the grid map contains the obstacle.
Optionally, determining, according to the non-ground point cloud set and the closest distance, a probability that each grid in the preset grid map contains an obstacle, includes:
mapping each non-ground point cloud data to a corresponding grid contained in a grid map according to an x-direction coordinate value and a y-direction coordinate value of each non-ground point cloud data in a non-ground point cloud set and a preset grid resolution, wherein the grid resolution is used for dividing the grid map into a plurality of grids;
determining non-ground point cloud data contained in the grid map as low point data or high point data according to a target fitting plane equation and a preset second distance threshold, wherein the target fitting plane equation is a fitting plane equation determined by the last iteration;
determining the mixed occupation probability corresponding to each grid according to the low-point data and the high-point data contained in each grid in the grid map, the low-point data contained in the adjacent grid, the preset grid occupation probability and a target fitting plane equation;
calculating the distance between each grid contained in the grid map and the center of the rear axle of the vehicle based on the coordinate transformation relation between the center of the rear axle of the vehicle and the origin of the grid map, and determining the initial occupation probability corresponding to each grid according to the calculated distance and the closest distance between the obstacle and the vehicle at the corresponding angle of each grid;
and determining the probability that each grid contained in the grid map contains the obstacle according to the mixed occupation probability and the initial occupation probability corresponding to each grid contained in the grid map.
Optionally, determining a mixed occupation probability corresponding to each grid according to low-point data and high-point data included in each grid in the grid map, and low-point data, a preset grid occupation probability and a target fitting plane equation included in adjacent grids, including:
calculating the number of high point data contained in each grid in the grid map, and determining the high point occupation probability corresponding to each grid in the grid map according to the number and the preset grid occupation probability;
determining the low point occupation probability corresponding to each grid in the grid map according to the low point data contained in each grid in the grid map, the low point data contained in adjacent grids and a target fitting plane equation;
and determining the mixed occupation probability corresponding to each grid according to the high point occupation probability and the low point occupation probability corresponding to each grid in the grid map.
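The patent does not give the formula for combining the high-point and low-point occupation probabilities into the mixed occupation probability; one plausible reading, treating the two as independent sources of evidence, is:

```python
def combine_occupancy(p_high, p_low):
    """Fuse high-point and low-point occupation probabilities as if they
    were independent evidence sources (an assumption; the patent does not
    specify the combination formula)."""
    return 1.0 - (1.0 - p_high) * (1.0 - p_low)
```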
An intelligent driving travelable area detection device, comprising:
a point cloud data acquisition module, configured to acquire point cloud data under the same timestamp, the point cloud data being collected by laser radars installed on a vehicle;
a non-ground point cloud set determining module, configured to determine non-ground point cloud data according to the z-direction coordinate values of the fused point cloud data contained in the fused point cloud set, so as to obtain a non-ground point cloud set consisting of the determined non-ground point cloud data;
a closest distance determining module, configured to map the non-ground point cloud set to a pre-established target polar coordinate system and to determine, according to the non-ground point cloud set in the target polar coordinate system, the closest distance between the vehicle and the obstacles around it at every angle, the target polar coordinate system being a polar coordinate system whose pole is the origin of the vehicle coordinate system; and
a travelable area determining module, configured to determine the travelable area of the vehicle according to the closest distances between the vehicle and the obstacles around it at all angles.
According to the technical scheme, the intelligent driving travelable area detection method first acquires point cloud data of a plurality of laser radars under the same timestamp and fuses the multi-frame point cloud data into a fused point cloud set. It then determines non-ground point cloud data from the fused point cloud set according to the z-direction coordinate values of the fused point cloud data, obtaining a non-ground point cloud set, maps that set to a pre-established target polar coordinate system to determine the closest distance between the vehicle and the obstacles around it at every angle, and finally determines the travelable area of the vehicle from these closest distances. Because a plurality of laser radars collect the point cloud data, the detection distance is greatly increased and the environmental adaptability is improved. The method determines the closest obstacle at every angle around the vehicle and can therefore accurately obtain the boundary of the travelable area in every direction; the whole process relies only on the point cloud data acquired by the laser radars, with no assistance from a high-precision map, a structured road or other conditions, so the autonomous detection performance is high.
Drawings
In order to illustrate the embodiments of the present application or the prior-art technical solutions more clearly, the drawings needed for describing them are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a method for detecting an intelligent driving drivable area according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a point cloud grid provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of a target polar coordinate system provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an intelligent driving travelable area detection device provided by an embodiment of the application;
fig. 5 is a block diagram of a hardware structure of the intelligent driving travelable area detection device according to the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments that a person skilled in the art can derive from these embodiments without creative effort fall within the protection scope of the present application.
In view of the problems in the prior art, the inventors conducted intensive study and finally arrived at an intelligent driving travelable area detection method, which can optionally be used in unmanned-driving tasks such as route planning and route control. The method is described in detail through the following embodiments.
Referring to fig. 1, a schematic flow chart of an intelligent driving drivable area detection method provided in an embodiment of the present application is shown, where the intelligent driving drivable area detection method may include:
and S101, point cloud data of a plurality of laser radars under the same timestamp are obtained.
The point cloud data is collected by laser radars mounted on the vehicle and has all been transformed to the vehicle coordinate system; it comprises a plurality of frames of point cloud data.
In this step, point cloud data may be collected by a plurality of lidars mounted on the vehicle, for example a main lidar together with left and right side lidars.
After the laser radars collect point cloud data, an upper computer can assign a timestamp to the data collected by each radar, recording the moment of collection. A plurality of laser radars collect many frames of point cloud data at many moments; the application matches the frames collected by the different radars according to their timestamps (i.e. time-synchronizes them) to obtain the multi-frame point cloud data under each timestamp. The point cloud data under the same timestamp acquired in this step is the multi-frame data under any one such timestamp.
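The time synchronization described here, matching frames from several lidars by timestamp, could be sketched as follows (the structure of the frame records and the tolerance value are assumptions):

```python
def group_by_timestamp(frames, tolerance=0.05):
    """Group per-lidar frames whose timestamps lie within a tolerance of
    each other (sketch; frames is a list of (lidar_id, timestamp, points))."""
    frames = sorted(frames, key=lambda f: f[1])
    groups, current = [], []
    for f in frames:
        # start a new group when the gap to the group's first frame is too big
        if current and f[1] - current[0][1] > tolerance:
            groups.append(current)
            current = []
        current.append(f)
    if current:
        groups.append(current)
    return groups
```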
Different lidars are mounted at different positions, so before the point cloud data collected by different lidars can be processed jointly, the multi-frame point cloud data must be spatially synchronized, i.e. uniformly transformed into one coordinate system such as the vehicle coordinate system, whose origin is the center of the vehicle's rear axle. That is to say, the point cloud data under the same timestamp acquired in this step has already been transformed to the vehicle coordinate system.
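The spatial synchronization amounts to a rigid transform of each lidar's points into the vehicle (rear-axle-center) frame. A sketch with NumPy, where the per-lidar rotation and mount translation would come from extrinsic calibration (not specified in the patent):

```python
import numpy as np

def to_vehicle_frame(points, rotation, translation):
    """Transform lidar-frame points into the vehicle coordinate system via
    a rigid transform (sketch). rotation is a 3x3 matrix; translation is
    the lidar's mounting position in the vehicle frame."""
    pts = np.asarray(points, dtype=float)           # shape (N, 3)
    # row-vector convention: p_vehicle = R @ p_lidar + t
    return pts @ np.asarray(rotation, dtype=float).T + np.asarray(translation, dtype=float)
```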
Step S102: fusing the multi-frame point cloud data to obtain a fused point cloud set.
The multi-frame point cloud data is obtained by multiple laser radars observing the obstacles around the vehicle from multiple angles. For convenience of subsequent processing, this step fuses the multiple frames into one frame of point cloud data; the fused point cloud set of this step consists of the point cloud data contained in that fused frame.
Step S103: determining non-ground point cloud data from the fused point cloud set according to the z-direction coordinate values of the fused point cloud data contained in it, so as to obtain a non-ground point cloud set consisting of the determined non-ground point cloud data.
Specifically, a frame of point cloud data collected by a lidar may include at least one point cloud data, which typically includes some ground point cloud data and some non-ground point cloud data. In this embodiment, the z-coordinate value of the fused point cloud data is positively correlated with the ground clearance of the fused point cloud data, so that the non-ground point cloud data can be determined from the fused point cloud set obtained in the above steps.
Step S104: mapping the non-ground point cloud set to a pre-established target polar coordinate system, and determining, according to the non-ground point cloud set in the target polar coordinate system, the closest distance between the vehicle and each obstacle around it at every angle.
The target polar coordinate system is a polar coordinate system established by taking the origin of the vehicle coordinate system as a pole.
Because a lidar return is produced by the surface it strikes, non-ground point cloud data is very likely to belong to an obstacle. In this step, a target polar coordinate system is established with the origin of the vehicle coordinate system (i.e. the center of the vehicle's rear axle) as its pole, and each non-ground point cloud datum in the non-ground point cloud set is mapped into it; the distance of a datum from the pole then gives its distance from the vehicle, so the closest distance between an obstacle and the vehicle can be determined for every angle around the vehicle. For example, suppose there are three obstacles a, b and c in the 90° direction from the vehicle, and obstacle b is the closest; then the distance between b and the vehicle is taken as the closest distance between an obstacle and the vehicle at 90°.
The closest distance, i.e. the distance of the nearest obstacle to the center of the vehicle's rear axle in each angular direction, is boundary information: computing it yields the boundary of the vehicle's travelable area and directly narrows the range within which the vehicle may drive.
Step S105: determining the travelable area of the vehicle according to the closest distances between the obstacles around the vehicle at all angles and the vehicle.
The closest distance represents the boundary of the travelable area: at a given angle, the vehicle may drive only as far as the corresponding closest distance and no farther. Therefore, once the closest distances between the vehicle and the surrounding obstacles have been determined at all angles, the travelable area of the vehicle is obtained.
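Converting the per-angle closest distances into boundary points of the travelable area in the vehicle frame might look like the following sketch (the fallback to a sensing-range limit for angles with no detected obstacle is an assumption):

```python
import math

def boundary_points(nearest, max_range=50.0):
    """Turn per-angle closest distances into (x, y) boundary points of the
    travelable area (sketch); entries of None mean no obstacle was seen at
    that angle, so the sensing range limit is used instead."""
    num = len(nearest)
    pts = []
    for i, d in enumerate(nearest):
        ang = math.radians(i * 360.0 / num)
        r = d if d is not None else max_range
        pts.append((r * math.cos(ang), r * math.sin(ang)))
    return pts
```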
The intelligent driving travelable area detection method thus first acquires point cloud data of a plurality of laser radars under the same timestamp and fuses the multi-frame point cloud data into a fused point cloud set; it then determines non-ground point cloud data from the fused point cloud set according to the z-direction coordinate values of the fused point cloud data, obtaining a non-ground point cloud set, maps that set to a pre-established target polar coordinate system to determine the closest distance between the vehicle and the obstacles around it at every angle, and finally determines the travelable area of the vehicle from these closest distances. Because a plurality of laser radars collect the point cloud data, the detection distance is greatly increased and the environmental adaptability is improved. Since the closest obstacle is determined at every angle around the vehicle, the boundary of the travelable area can be obtained accurately in every direction; the whole process relies only on the point cloud data acquired by the laser radars, without assistance from a high-precision map, a structured road or other conditions, so the autonomous detection performance is high.
In an embodiment of the application, a process of performing fusion processing on the multi-frame point cloud data to obtain a fusion point cloud set in step S102 is introduced.
In this embodiment, the fused point cloud set includes a plurality of fused point cloud data.
Optionally, the fusion processing of this embodiment may be performed either with GPU-based parallel multithreading or with a single CPU thread.
In this embodiment, there are various implementation manners for the process of "performing fusion processing on multi-frame point cloud data", and the present application provides, but is not limited to, the following two implementation manners.
The first implementation mode comprises the following steps: and performing fusion processing on the multi-frame point cloud data in an unstructured fusion mode.
Here, the unstructured fusion mode process includes: and splicing the multi-frame point cloud data into one frame of point cloud data according to a preset laser radar sequence to obtain a fusion point cloud set.
In this implementation, one frame of point cloud data collected by each laser radar is directly spliced according to the preset laser radar sequence; point cloud data collected by different laser radars for the same obstacle are not merged during splicing, i.e., the total number of point clouds contained in the multi-frame point cloud data is the same as the total number of point clouds contained in the fused point cloud set.
For example, assuming that the number of laser radars is N, that the laser radars are recorded as 0 to N-1 according to the preset laser radar sequence, and that the number of point cloud data included in one frame of point cloud data acquired by the i-th laser radar is n_i, where i ∈ [0, N-1], then the number of point clouds included in the fused point cloud set is n_0 + n_1 + … + n_{N-1}, i.e., the sum of n_i over all N laser radars.
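As a minimal sketch of this unstructured splicing, assuming each lidar frame has already been transformed into the common vehicle coordinate system, the fusion is a plain concatenation in the preset lidar order (function and variable names here are illustrative, not from the application):

```python
import numpy as np

def fuse_unstructured(frames):
    """frames: list of (n_i, 3) point arrays, one per lidar, already in the
    vehicle frame and ordered by the preset lidar sequence."""
    # Plain concatenation: nothing is merged, so the fused set has
    # exactly n_0 + n_1 + ... + n_{N-1} points.
    return np.vstack(frames)

frames = [np.zeros((100, 3)), np.ones((80, 3)), np.full((60, 3), 2.0)]
fused = fuse_unstructured(frames)   # shape (240, 3)
```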
The second implementation mode comprises the following steps: and performing fusion processing on the multi-frame point cloud data in a structured fusion mode.
In this implementation, the process of structured fusion includes the following steps a 1-a 2:
step a1, calculating a horizontal angle and a vertical angle corresponding to each point cloud data contained in the multi-frame point cloud data, and filling the point cloud data into a corresponding point cloud grid according to the horizontal angle and the vertical angle corresponding to the point cloud data, so as to fill each point cloud data contained in the multi-frame point cloud data into the corresponding point cloud grid.
Specifically, the three-dimensional space is divided into W columns over 0-360° in the horizontal direction and H rows over 0-180° in the vertical direction, obtaining the point cloud grid shown in fig. 2. For each point cloud data contained in each frame of point cloud data (one frame of point cloud data usually comprises a plurality of point cloud data), the corresponding horizontal angle is calculated from its x-direction and y-direction coordinate values, the corresponding vertical angle is calculated from its x-direction, y-direction, and z-direction coordinate values, and the point cloud data is filled into the corresponding point cloud grid according to the calculated horizontal and vertical angles.
Optionally, let the three-dimensional coordinate of a point cloud data be (x, y, z). It should be noted that in the present application this is the three-dimensional coordinate returned when a laser beam of the laser radar hits an object, expressed in a coordinate system whose origin is the center of the vehicle's rear axle: x represents the distance from the object point to the transverse plane of the rear axle, y represents the distance from the object point to the longitudinal plane of the rear axle, and z represents the height of the object point above the rear axle center. Thus the horizontal angle corresponding to the point cloud data can be calculated using the formula arctan(y/x), and the vertical angle using the formula arctan(z / sqrt(x^2 + y^2)). Based on this, when the point cloud data is filled into its point cloud grid, the column index of the corresponding grid is arctan(y/x) / 360 × W, and the row index is obtained by querying a lookup table with the calculated vertical angle (each mechanical laser radar has a vertical-angle lookup table of its laser beams, i.e., the vertical angle of each beam is fixed when the radar leaves the factory).
Step a2: for each point cloud grid, if the point cloud grid contains a plurality of point cloud data, take their barycenter as the fused point cloud data under that point cloud grid, so as to obtain a fused point cloud set consisting of the fused point cloud data under all point cloud grids.
Because the detection angles of the plurality of laser radars may overlap (for example, the main laser radar and the left laser radar overlap at the upper-left corner of the vehicle, and the main laser radar and the right laser radar overlap at the upper-right corner), an obstacle lying exactly in an overlap region can be detected by multiple laser radars; fusion processing is then performed, since the multiple point cloud data corresponding to the same obstacle may be filled into the same point cloud grid.
In this step, if one point cloud grid contains a plurality of point cloud data after the point cloud data of all the laser radars are filled into the corresponding point cloud grids, the barycenter (i.e., the x-direction average value, the y-direction average value, and the z-direction average value of the plurality of point cloud data) of the plurality of point cloud data is used as the fused point cloud data under the point cloud grid.
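A compact sketch of steps a1-a2, assuming W columns over 0-360° and H rows over 0-180°, and using an arctan-based row index in place of the factory beam-angle lookup table (the W/H values and all names are illustrative assumptions):

```python
import numpy as np

W, H = 360, 64  # horizontal / vertical grid counts (illustrative)

def grid_index(p):
    x, y, z = p
    col = int((np.degrees(np.arctan2(y, x)) % 360.0) / 360.0 * W)
    # Vertical angle from arctan(z / sqrt(x^2 + y^2)), shifted into [0, 180);
    # a real mechanical lidar would use its per-beam angle lookup table here.
    row = int((np.degrees(np.arctan2(z, np.hypot(x, y))) + 90.0) / 180.0 * H)
    return row % H, col % W

def fuse_structured(points):
    grids = {}
    for p in points:
        grids.setdefault(grid_index(p), []).append(p)
    # Barycenter (x/y/z means) of each occupied grid is its fused point.
    return {k: np.mean(v, axis=0) for k, v in grids.items()}

# Two returns from the same obstacle, seen by two lidars, land in one grid:
pts = [np.array([1.0, 0.0, 0.0]), np.array([1.2, 0.0, 0.0])]
fused = fuse_structured(pts)
fused_point = next(iter(fused.values()))
```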
Through the fusion processing of this implementation, a plurality of point cloud data can be merged into one fused point cloud data, so the total number of point clouds contained in the multi-frame point cloud data is greater than or equal to the number of point clouds contained in the fused point cloud set.
Comparing the two implementations: the first has the advantages of low time and space complexity of the algorithm and good preservation of environment information (the environmental features detected by each laser radar are retained as far as possible), but has the defects that the spatial neighborhood relation of each point in the point cloud frame cannot be obtained and that a large memory is consumed to store the fused point cloud. The second has the advantage of preserving the three-dimensional spatial neighborhood information of each point cloud in the frame (from which it can be known whether any two fused point cloud data are adjacent, and in which direction they are adjacent), but has the defects that it cannot balance space complexity against environmental fidelity well and that some features detected by a given laser radar may be lost.
In summary, the first implementation is more suitable for scenarios in which the point cloud data processed for the travelable area is directly used by subsequent perception tasks; the second implementation is more suitable for scenarios in which it is not directly used by subsequent perception tasks, offering better performance and avoiding unnecessary performance waste.
It should be noted that the above two fusion processing methods are merely examples, and other fusion methods may be adopted, which is not limited in the present application.
In an alternative embodiment, the process of determining non-ground point cloud data from the fused point cloud set in step S103 according to the z-coordinate value of the fused point cloud data included in the fused point cloud set is described.
This embodiment may segment the fused point cloud set into non-ground point cloud data and ground point cloud data.
Optionally, the process of determining non-ground point cloud data from the fused point cloud set according to the z-coordinate value of the fused point cloud data included in the fused point cloud set in this embodiment includes the following steps b1 to b 5:
Step b1: eliminate from the fused point cloud set the fused point cloud data whose z-direction coordinate values are abnormal, and take the fused point cloud set with the abnormal data removed as the target fused point cloud set.
Due to factors such as uneven ground (e.g., raised in the middle and low on both sides) and radar-to-vehicle mounting accuracy that cannot be guaranteed to be perfectly level, environmental noise (such as specular-reflection noise points) exists in the fused point cloud set, and the z-direction coordinate value of such noise usually differs greatly from those of the non-noise fused point cloud data. For example, if the z-direction coordinate values of the fused point cloud data in the set are -20, -1, 2, and 3, the fused point cloud data with z-direction coordinate value -20 is abnormal data and can be removed from the fused point cloud set to obtain the target fused point cloud set.
And b2, determining ground point cloud data from the target fusion point cloud set according to the z-direction coordinate value of the fusion point cloud data contained in the target fusion point cloud set, so as to obtain a ground point cloud set to be updated, wherein the ground point cloud set consists of the determined ground point cloud data.
In this step, a z-direction coordinate value of the fused point cloud data is related to the ground clearance of the fused point cloud data, and according to the z-direction coordinate value of each fused point cloud data included in the target fused point cloud set, whether each fused point cloud data is ground point cloud data or non-ground point cloud data can be determined, so that a ground point cloud set to be updated can be obtained.
Optionally, the process of "determining ground point cloud data from the target fused point cloud set according to the z-coordinate value of the fused point cloud data included in the target fused point cloud set" in this step includes the following steps b21 to b 22:
b21, selecting a preset number of fused point cloud data from the target fused point cloud set according to the z-direction coordinate value of the fused point cloud data contained in the target fused point cloud set, and calculating the z-direction average value of the preset number of fused point cloud data.
In the step, the z-direction average value of the fused point cloud data with the preset number reflects the height difference between the ground and the center of the rear axle of the vehicle to a certain extent.
Optionally, in this step, the fused point cloud data in the target fused point cloud set may be sorted according to the sequence of the z-direction coordinate values of the fused point cloud data from small to large, then the preset number of fused point cloud data with the smallest z-direction coordinate value is selected, and the average value of the z-direction coordinate values of the preset number of fused point cloud data is calculated to obtain the z-direction average value.
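The seed selection of step b21 can be sketched as follows; sorting ascending by z and averaging the lowest points gives a rough estimate of the ground height relative to the rear-axle center (the point values and names are illustrative):

```python
import numpy as np

def seed_z_mean(points, n_seed):
    """Mean z of the n_seed fused points with the smallest z-direction
    coordinate values (step b21 sketch)."""
    z_sorted = np.sort(points[:, 2])        # ascending z-direction values
    return float(np.mean(z_sorted[:n_seed]))

pts = np.array([[0.0, 0.0, -1.6], [1.0, 0.0, -1.5], [2.0, 0.0, -1.55],
                [0.0, 1.0, 0.5], [1.0, 1.0, 1.2]])
z_mean = seed_z_mean(pts, 3)   # mean of -1.6, -1.55, -1.5
```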
Step b22, aiming at each fused point cloud data in the target fused point cloud set, if the difference value between the z-direction coordinate value and the z-direction average value of the fused point cloud data is smaller than a preset difference value threshold, determining that the fused point cloud data is ground point cloud data.
Optionally, on the premise that the fused point cloud data in the target fused point cloud set have been sorted in step b21 in ascending order of z-direction coordinate value, this step may sequentially calculate, in that sorted order, the difference between the z-direction coordinate value of each fused point cloud data and the z-direction average value, and compare the calculated difference with a preset difference threshold (optionally, the difference threshold may be set according to the ground clearance of the center of the vehicle's rear axle), so that the fused point cloud data whose difference is smaller than the difference threshold are determined as ground point cloud data.
Because this step performs the difference calculation and comparison in the sorted order, once the difference between the z-direction coordinate value of some fused point cloud data and the z-direction average value is greater than the difference threshold, the differences for all subsequent fused point cloud data are necessarily also greater than the difference threshold. Therefore, as soon as a fused point cloud data whose difference exceeds the threshold is found, no subsequent calculation is needed, which saves calculation time and improves calculation efficiency.
And b3, determining a fitting plane equation according to the ground point cloud set to be updated.
In this step, plane fitting can be realized by principal component analysis, solving the plane model (i.e., the fitting plane equation) of the ground point cloud set to be updated; that is, the ground point cloud set to be updated is fitted to a plane, and the fitting plane is expressed in the form ax + by + cz + d = 0.
Optionally, the process of determining a fitting plane equation according to the cloud set of ground points to be updated in this step may include steps b31 to b 34:
and b31, calculating the average value in the x direction, the average value in the y direction and the average value in the z direction of the cloud set of the ground points to be updated.
Specifically, each ground point cloud data included in the ground point cloud set to be updated is represented by a three-dimensional coordinate (x, y, z) (i.e., an x-direction, a y-direction, and a z-direction coordinate value), so the three average values can be computed component-wise.
And b32, determining a covariance matrix according to the x-direction average value, the y-direction average value and the z-direction average value of the ground point cloud set to be updated, and solving a plurality of eigenvalues and eigenvectors of the covariance matrix.
In this step, a covariance matrix (3 × 3 matrix) of the cloud set of ground points to be updated with respect to the x, y, and z directions may be obtained according to the x-direction average value, the y-direction average value, and the z-direction average value of the cloud set of ground points to be updated, and a plurality of eigenvalues and eigenvectors of the covariance matrix may be solved. Here, 3 eigenvalues and corresponding 3 eigenvectors can generally be solved.
And b33, determining the minimum characteristic value from the plurality of characteristic values, and taking the characteristic vector corresponding to the minimum characteristic value as the normal vector of the fitting plane.
It can be understood that the smaller an eigenvalue is, the smaller the dispersion of the point cloud along the corresponding dimension direction; therefore, the eigenvector corresponding to the smallest eigenvalue can be taken as the normal vector of the fitting plane.
And b34, solving a fitting plane equation according to the ground point cloud set to be updated and the normal vector of the fitting plane.
Specifically, in this step, the fitting plane equation in the form ax + by + cz + d = 0 may be solved from the barycenter of all ground point cloud data contained in the ground point cloud set to be updated (i.e., the x-direction, y-direction, and z-direction average values), combined with the normal vector of the fitting plane: substituting the barycenter into the plane equation yields d.
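Steps b31-b34 can be sketched with NumPy as follows; the eigenvector of the 3×3 covariance matrix with the smallest eigenvalue is taken as the plane normal, and d follows from requiring the barycenter to lie on the plane (a sketch, not necessarily the application's exact implementation):

```python
import numpy as np

def fit_plane_pca(points):
    """Fit ax + by + cz + d = 0 to an (n, 3) point set via PCA."""
    centroid = points.mean(axis=0)              # x/y/z average values (b31)
    cov = np.cov((points - centroid).T)         # 3x3 covariance matrix (b32)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues ascending
    normal = eigvecs[:, 0]                      # smallest-eigenvalue vector (b33)
    d = -float(normal @ centroid)               # centroid lies on the plane (b34)
    return normal, d

# Four points on the plane z = -1.5:
pts = np.array([[0.0, 0.0, -1.5], [1.0, 0.0, -1.5],
                [0.0, 1.0, -1.5], [1.0, 1.0, -1.5]])
normal, d = fit_plane_pca(pts)
```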
Step b4: calculate the distance between each fused point cloud data in the target fused point cloud set and the fitting plane, determine the fused point cloud data whose calculated distance is smaller than a preset first distance threshold as ground point cloud data, and take the set of the determined ground point cloud data as the updated ground point cloud set.
Optionally, this step may determine the first distance threshold according to the ground point cloud segmentation requirement. After the first distance threshold is determined, any fused point cloud data whose distance to the fitting plane is smaller than the first distance threshold is, with high probability, ground point cloud data; the updated ground point cloud set can thus be obtained.
The method for calculating the distance between the fused point cloud data and the fitting plane equation in the step is the prior art and is not described in detail herein.
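Step b4 reads as a vectorized point-to-plane distance test; a minimal sketch (the threshold value and names are illustrative):

```python
import numpy as np

def plane_inliers(points, normal, d, dist_thresh):
    """Keep points whose distance to the plane ax+by+cz+d=0 is below the
    first distance threshold (the updated ground point cloud set)."""
    dist = np.abs(points @ normal + d) / np.linalg.norm(normal)
    return points[dist < dist_thresh]

normal, d = np.array([0.0, 0.0, 1.0]), 1.5       # plane z = -1.5
pts = np.array([[0.0, 0.0, -1.45],               # 0.05 m from plane -> ground
                [1.0, 2.0, -1.60],               # 0.10 m from plane -> ground
                [0.0, 0.0,  0.80]])              # 2.30 m from plane -> not ground
ground = plane_inliers(pts, normal, d, dist_thresh=0.2)
```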
Step b5: add 1 to the update iteration count and judge whether the incremented count reaches the preset total iteration count; if not, take the updated ground point cloud set as the ground point cloud set to be updated and return to the step of determining the fitting plane equation according to the ground point cloud set to be updated; when the update iteration count reaches the total iteration count, take the point cloud data in the target fused point cloud set other than the updated ground point cloud set as the non-ground point cloud data.
Wherein the initial number of update iterations is 0.
In order to make the determined fitting plane equation more accurate, this embodiment may perform several iterations, for example 2 to 3 rounds of the iterative update process. Based on this, the initial update iteration count is set to 0; each time the ground point cloud set to be updated is updated, the update iteration count is incremented by 1, and it is judged whether the incremented count reaches the preset total iteration count. (Because the process of determining the fitting plane equation and step b4 are relatively time-consuming and the method is limited by real-time requirements, the iteration count can hardly be set very large; in this embodiment the total iteration count is generally set to 2 or 3.) If the incremented update iteration count has not yet reached the total iteration count, the iteration process is performed again, i.e., the updated ground point cloud set is taken as the ground point cloud set to be updated and the fitting plane equation is re-determined accordingly.
When the incremented update iteration count reaches the total iteration count, no further iteration is needed: the updated ground point cloud set determined by the last iteration can be taken as the final ground point cloud set, and the point cloud data in the target fused point cloud set other than the final ground point cloud set are determined to be non-ground point cloud data.
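Putting steps b2-b5 together, the iterative refinement might look like the following sketch (the seed size, thresholds, and iteration count are illustrative assumptions, not values from the application):

```python
import numpy as np

def segment_ground(points, n_seed=20, z_thresh=0.3, dist_thresh=0.2, n_iter=3):
    """Steps b2-b5 sketch: seed a ground set from the lowest points, then
    alternate PCA plane fitting and inlier re-selection for a small fixed
    number of iterations (2-3 in this embodiment)."""
    z_mean = float(np.mean(np.sort(points[:, 2])[:n_seed]))
    ground = points[points[:, 2] - z_mean < z_thresh]        # steps b21-b22
    for _ in range(n_iter):
        centroid = ground.mean(axis=0)                       # step b31
        _, eigvecs = np.linalg.eigh(np.cov((ground - centroid).T))  # step b32
        normal = eigvecs[:, 0]                               # step b33, unit length
        d = -float(normal @ centroid)                        # step b34
        ground = points[np.abs(points @ normal + d) < dist_thresh]  # step b4
    # Step b5: the complement of the final ground set is non-ground.
    non_ground = points[np.abs(points @ normal + d) >= dist_thresh]
    return ground, non_ground

rng = np.random.default_rng(0)
ground_pts = np.column_stack([rng.uniform(-10, 10, 200),
                              rng.uniform(-10, 10, 200),
                              rng.normal(-1.5, 0.02, 200)])   # noisy ground plane
obstacle_pts = np.column_stack([rng.uniform(-10, 10, 50),
                                rng.uniform(-10, 10, 50),
                                rng.uniform(0.0, 2.0, 50)])   # points above ground
pts = np.vstack([ground_pts, obstacle_pts])
ground, non_ground = segment_ground(pts)
```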
The process of determining non-ground point cloud data shown in steps b1 to b5 is merely an example and does not limit the present application; for example, the non-ground point cloud data may also be determined directly according to steps b1 to b2 alone.
The embodiment provides the method for determining the non-ground point cloud data under the condition that the point cloud data under the same timestamp comprise multi-frame point cloud data, the detection blind area of a vehicle can be reduced as much as possible by adopting the multi-frame point cloud data, the determined non-ground point cloud data are more accurate, and the subsequently obtained travelable area is more accurate.
In the following alternative embodiment, with reference to the foregoing embodiment, a process of mapping the non-ground point cloud sets to the pre-established target polar coordinate system in step S104, and determining the respective closest distances between the obstacles around the vehicle and the vehicle at the full angle according to the non-ground point cloud sets in the target polar coordinate system will be described.
The target polar coordinate system established in this step is shown in fig. 3. The target polar coordinate system includes a plurality of angle grids, and each angle grid includes a plurality of radial grids; the angle grids are divided based on a preset angle resolution res_θ, the radial grids are divided based on a preset radial distance resolution res_dist, and the maximum distance of the target polar coordinate system is R_max.
Here, the maximum distance is determined by the hardware parameters of the laser radar itself and is generally 80 m; that is, taking the center of the vehicle's rear axle as the circle center, everything within a radius of 80 m is the detection range. Generally speaking, the farther a location is from the laser radar, the more likely the returned point cloud is noise. The maximum detection distance of a mechanical laser radar is generally 100-150 m; in this embodiment it is generally set to 80-100 m, which, taking a highway maximum speed of 120 km/h as an example, allows the environmental information about 3 s ahead to be perceived in advance.
The optional embodiment can map the non-ground point cloud set to a pre-established target polar coordinate system, and then determine the nearest distance from the obstacle to the vehicle within the range of 0-360 degrees according to the non-ground point cloud data in the target polar coordinate system.
Specifically, the process of mapping the non-ground point cloud sets to a pre-established target polar coordinate system to determine the respective closest distances between the vehicle and the obstacles at the full angle around the vehicle according to the non-ground point cloud sets in the target polar coordinate system includes the following steps c1 to c 3:
and c1, solving a horizontal angle and a two-dimensional projection distance corresponding to each non-ground point cloud data according to the x-direction coordinate value and the y-direction coordinate value of each non-ground point cloud data in the non-ground point cloud set.
Optionally, in this step, according to the x-direction and y-direction coordinate values of each non-ground point cloud data in the non-ground point cloud set, the horizontal angle θ corresponding to each non-ground point cloud data may be calculated using the formula θ = arctan(y/x), and the two-dimensional projection distance d corresponding to each non-ground point cloud data may be calculated using the formula d = sqrt(x^2 + y^2).
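A sketch of the mapping in steps c1-c2 with illustrative grid resolutions (arctan2 is used so that all four quadrants map into 0-360°):

```python
import numpy as np

res_theta, res_dist = 1.0, 0.25   # angle (deg) and radial (m) resolutions, illustrative

def polar_cell(x, y):
    """Angle-grid and radial-grid indices of a non-ground point, from
    theta = arctan(y/x) and d = sqrt(x^2 + y^2)."""
    theta = np.degrees(np.arctan2(y, x)) % 360.0
    d = float(np.hypot(x, y))
    return int(theta / res_theta), int(d / res_dist)

a, r = polar_cell(3.0, 4.0)   # theta ~ 53.13 deg, d = 5.0 m
```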
And c2, mapping each non-ground point cloud data to a target polar coordinate system according to the horizontal angle and the two-dimensional projection distance corresponding to each non-ground point cloud data in the non-ground point cloud set so as to obtain the non-ground point cloud set in the target polar coordinate system.
As shown in fig. 3, the target polar coordinate system may include a plurality of radial grids; this step may map each non-ground point cloud data contained in the non-ground point cloud set into a radial grid in the target polar coordinate system according to the horizontal angle θ and the two-dimensional projection distance d.
Step c3, for each angle grid included in the target polar coordinate system, determining a non-empty radial grid closest to the pole from the angle grid, and calculating a distance average value between non-ground point cloud data included in the non-empty radial grid and the pole as the closest distance between the obstacle and the vehicle at the angle corresponding to the angle grid; so as to obtain the nearest distance between the obstacles around the vehicle at all angles and the vehicle respectively.
In this step, the pole of the target polar coordinate system is located at the center of the rear axle of the vehicle, and after all the non-ground point cloud data are mapped to the target polar coordinate system in the aforementioned step c2, the non-empty radial grids closest to the pole (i.e., the center of the rear axle of the vehicle) in each angular grid can be determined respectively. Since the determined non-empty radial grids may contain a plurality of non-ground point cloud data, the step may calculate a distance average value between the non-ground point cloud data contained in the non-empty radial grids and the poles, and use the determined distance average value as a closest distance between the obstacle and the vehicle at the corresponding angle.
For example, the angle grids within the range of 0 to 30 ° include radial grids 1, 2, 3, 4, and 5 in order from near to far from the pole, where the radial grids 1 and 2 do not include non-ground point cloud data, and the radial grids 3 to 5 include non-ground point cloud data, and then the non-empty radial grid closest to the pole in the angle grids within the range of 0 to 30 ° is the radial grid 3. Assuming that the radial grid 3 includes non-ground point cloud data 1-10, the average of the distances between the non-ground point cloud data 1-10 and the pole can be used as the closest distance between the obstacle and the vehicle at the angle of 0-30 °.
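Step c3 and the example above can be sketched as follows; for each angle grid, the non-empty radial grid nearest the pole (the rear-axle center) is found and the distances of its points are averaged (grid resolutions and names are illustrative):

```python
import numpy as np

def closest_distances(points, res_theta=30.0, res_dist=0.25):
    """Per angle grid: mean point-to-pole distance of the non-empty radial
    grid nearest the pole (step c3 sketch)."""
    cells = {}
    for x, y, _ in points:
        a = int((np.degrees(np.arctan2(y, x)) % 360.0) / res_theta)
        d = float(np.hypot(x, y))
        cells.setdefault(a, {}).setdefault(int(d / res_dist), []).append(d)
    out = {}
    for a, radial in cells.items():
        _, dists = min(radial.items())       # nearest non-empty radial grid
        out[a] = sum(dists) / len(dists)     # mean distance = closest distance
    return out

pts = [(3.0, 0.0, 0.2), (3.1, 0.0, 0.3),    # same angle grid, same radial grid
       (10.0, 0.0, 0.5),                    # same angle grid, farther radial grid
       (0.0, 5.0, 1.0)]                     # angle grid covering 90 deg
cd = closest_distances(pts)
```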
The closest distance in this embodiment is essentially boundary information of a travelable region, that is, obstacle distance information in which a certain angular direction is closest to the center of the rear axle of the vehicle. The embodiment adopts a polar coordinate nearest distance expression mode, and the measurement precision of the nearest distance is improved to a centimeter level, so that the calculation of the nearest distance is more accurate. By calculating the nearest distance, on one hand, the boundary of the travelable area can be determined, and on the other hand, a basis is provided for subsequently calculating the probability that each grid in the grid map contains the obstacle, so that the subsequently calculated probability is more accurate.
In an alternative embodiment, a process of determining a probability that each grid in the preset grid map includes an obstacle according to the respective closest distances between the obstacles around the vehicle and the vehicle at all angles in step S105 will be described with reference to the foregoing embodiments.
Optionally, the process of determining the probability that each grid in the preset grid map contains the obstacle according to the respective closest distances between the obstacles around the vehicle at all angles and the vehicle in the step "may specifically include the following steps d1 to d 2:
and d1, determining the probability that each grid in the preset grid map contains the obstacle according to the non-ground point cloud sets and the nearest distance.
In this step, a grid map may be set over the area around the vehicle that the laser radars can cover, and the map is divided into several grids according to a specified single-frame map size (i.e., the area size covered by the laser radars) and a grid resolution. Then each non-ground point cloud data contained in the non-ground point cloud set may be mapped into a grid to obtain the probability that each grid is occupied by an obstacle.
Considering that the closest distance can represent the boundary of the travelable region, the probability of the grid inside the travelable region being occupied by an obstacle is low and the probability of the grid outside the travelable region being occupied by an obstacle is high.
Based on this, when the probability that each grid is occupied by the obstacle is comprehensively considered, the nearest distance needs to be considered in addition to the non-ground point cloud set, so that the probability that each grid in the grid map contains the obstacle is obtained. Here, the probability that a grid contains an obstacle can characterize the probability that a vehicle can pass through the grid.
And d2, determining the drivable area of the vehicle according to the probability that each grid in the grid map contains the obstacle.
This step determines the probability that each grid in the grid map contains an obstacle by determining the probability that non-ground point cloud data occupies the grid in the grid map.
Optionally, with reference to the foregoing embodiment, this step may determine the probability that each grid in the grid map contains an obstacle according to the non-ground point cloud set, the target fitting plane equation, and the 0-360° closest distances. Here, the target fitting plane equation refers to the fitting plane equation determined in the last iteration.

Optionally, the specific process of determining the probability that each grid in the grid map contains an obstacle according to the non-ground point cloud set, the target fitting plane equation, and the 0-360° closest distances includes the following steps d21 to d25:
and d21, mapping each non-ground point cloud data to a corresponding grid contained in the grid map according to the x-direction coordinate value and the y-direction coordinate value of each non-ground point cloud data in the non-ground point cloud set and the preset grid resolution.
The grid resolution is used for dividing the grid map into a plurality of grids.
In this step, a coordinate transformation relationship Γ between the rear axle center and the origin of the grid map may be set, and each non-ground point cloud data in the non-ground point cloud set is mapped into a corresponding grid included in the grid map according to coordinates (x, y) (note that the non-ground point cloud data is data represented by three-dimensional coordinates, and only x-direction coordinate values and y-direction coordinate values in the three-dimensional coordinates are considered here).
Specifically, using cell_{i,j} to represent the grid in the i-th row and j-th column of the grid map (i.e., the grid with coordinates (i, j)) and res_cell to represent the grid resolution, a non-ground point cloud data p_k(x_k, y_k) is considered to satisfy p_k ∈ cell_{i,j} when the following relation holds: i·res_cell ≤ x_k < (i+1)·res_cell and j·res_cell ≤ y_k < (j+1)·res_cell.
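The cell assignment of step d21 reduces to a floor division per axis; a minimal sketch (assuming the rear-axle-to-map-origin transform Γ has already been applied to the coordinates):

```python
import math

def point_to_cell(x, y, res_cell):
    """A point p_k(x_k, y_k) falls in cell (i, j) when
    i*res_cell <= x_k < (i+1)*res_cell and j*res_cell <= y_k < (j+1)*res_cell,
    i.e. i = floor(x_k / res_cell), j = floor(y_k / res_cell)."""
    return math.floor(x / res_cell), math.floor(y / res_cell)

cell = point_to_cell(1.3, -0.4, 0.5)   # -> cell (2, -1)
```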
And d22, determining the non-ground point cloud data contained in the grid map as low point data or high point data according to the target fitting plane equation and a preset second distance threshold.
Optionally, the specific implementation process of this step includes: for each non-ground point cloud data contained in the grid map (i.e., each non-ground point cloud data in the non-ground point cloud set), calculate the distance between the non-ground point cloud data and the target fitting plane; if the calculated distance is smaller than the second distance threshold, determine the non-ground point cloud data as low point data, otherwise determine it as high point data.
It should be noted that the second distance threshold in this step may be the same as the first distance threshold, or may be different from the first distance threshold, and specifically needs to be determined according to actual situations, which is not limited in this application.
And d23, determining the mixed occupation probability corresponding to each grid according to the low-point data and the high-point data contained in each grid in the grid map, the low-point data contained in the adjacent grid, the preset grid occupation probability and the target fitting plane equation.
Here, the preset grid occupation probability refers to the probability that the mapping of one high point datum into a grid causes the grid to be occupied. For convenience of the following description, the preset grid occupation probability may be denoted as α.
In an optional embodiment, the process in step d23 of determining the mixed occupation probability corresponding to each grid according to the low point data and high point data contained in each grid in the grid map, the low point data contained in the adjacent grids, the preset grid occupation probability and the target fitting plane equation may include the following steps d231 to d233:
and d231, calculating the number of high point data contained in each grid in the grid map, and determining the high point occupation probability corresponding to each grid in the grid map according to the number and the preset grid occupation probability.
Optionally, for any grid included in the grid map, if the grid contains n high point data, the high point occupation probability corresponding to the grid is P_high = 1 − (1 − α)^n, where P_high is the high point occupation probability corresponding to grid cell_{i,j}.
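The relation P_high = 1 − (1 − α)^n treats the n high points as independent occupancy observations; a one-function sketch (the helper name is an assumption):

```python
def high_point_occupancy(n, alpha):
    """P_high = 1 - (1 - alpha)**n: probability that at least one of n
    independent high-point observations marks the grid as occupied."""
    return 1.0 - (1.0 - alpha) ** n
```

With α = 0.5, three high points already give P_high = 0.875, so a grid's occupancy confidence saturates quickly as evidence accumulates.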
And d232, determining the low point occupation probability corresponding to each grid in the grid map according to the low point data contained in each grid in the grid map, the low point data contained in the adjacent grids and the target fitting plane equation.
Optionally, owing to the limited precision of point cloud segmentation, the determined non-ground point cloud set may not be fully accurate, so a low point datum may be either non-ground point cloud data or ground point cloud data (that is, data that is actually ground point cloud data but was determined as non-ground point cloud data in the present application). Therefore, to determine the low point occupation probability of a grid more accurately, the occupation probability contributed by low point data may be represented by the gradient change of the grid height.
Based on this, the low point occupation probability corresponding to each grid in the grid map can be determined from the gradient of the grid height between neighboring grids, where P_low is the low point occupation probability corresponding to grid cell_{i,j}, d̄_{i,j} represents the mean distance from all low point data contained in grid cell_{i,j} to the target fitting plane, d̄_{k,l} represents the mean distance from all low point data contained in the neighboring grid cell_{k,l} to the target fitting plane, and (k, l) ∈ {(i−1, j−1), (i−1, j), (i−1, j+1), (i, j−1), (i, j+1), (i+1, j−1), (i+1, j), (i+1, j+1)}.
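A sketch of a gradient-based low-point occupancy, under the assumption that P_low is taken as the largest difference between the grid's mean low-point plane distance and those of its up-to-eight neighbors, normalized by the grid resolution and clipped to [0, 1]; the exact closed form and the helper name are assumptions, not the patent's verbatim formula:

```python
def low_point_occupancy(mean_dist, neighbor_means, res_cell):
    """Hypothetical gradient form: the largest difference between this
    grid's mean low-point distance to the fitted plane (mean_dist) and
    the neighbors' means, normalized by res_cell and clipped to [0, 1]."""
    if not neighbor_means:
        return 0.0  # no neighbor evidence -> treat as flat
    gradient = max(abs(mean_dist - m) for m in neighbor_means)
    return min(1.0, gradient / res_cell)
```

A flat patch of slightly mis-segmented ground produces near-zero gradients and thus a low occupancy, while a curb edge produces a large neighbor-to-neighbor height step and a high one.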
And d233, determining the mixed occupation probability corresponding to each grid according to the high point occupation probability and the low point occupation probability corresponding to each grid in the grid map.
It is understood that there may be a grid containing both high-point data and low-point data, and the mixed occupation probability corresponding to the grid can be calculated through the present step.
Alternatively, the formula P_mix = 1 − (1 − P_high)(1 − P_low) can be used to determine the mixed occupation probability corresponding to grid cell_{i,j}.
And d24, calculating the distance between each grid contained in the grid map and the center of the rear axle of the vehicle based on the coordinate transformation relation between the center of the rear axle of the vehicle and the origin of the grid map, and determining the initial occupation probability corresponding to each grid according to the calculated distance and the closest distance between the obstacle and the vehicle at the corresponding angle of each grid.
In this step, as described in the foregoing embodiment, the nearest distance forms the boundary of the travelable area; that is, for an angle grid, the map grids lying beyond the nearest distance between the obstacle and the vehicle at the corresponding angle belong to the impassable area.
Therefore, in this step, the distance d_tf from each grid contained in the grid map to the vehicle rear axle center is first solved based on the coordinate transformation relationship Γ between the vehicle rear axle center and the origin of the grid map, and the horizontal angle θ corresponding to each grid (e.g. cell_{i,j}) is calculated according to the aforementioned calculation formula of the horizontal angle θ, thereby obtaining the nearest distance d_min^θ corresponding to each grid. After that, the initial occupation probability corresponding to each grid contained in the grid map can be calculated from the distance d_tf of the grid from the vehicle rear axle center and the nearest distance d_min^θ corresponding to the grid, where P_init denotes the initial occupation probability corresponding to grid cell_{i,j}.
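A minimal sketch of the initial occupation rule, assuming the binary form implied by the boundary description (grids farther from the rear axle center than the nearest obstacle at their angle are treated as occupied); the helper name is illustrative:

```python
def initial_occupancy(d_tf, d_min_theta):
    """Assumed binary P_init: a grid whose distance to the rear-axle
    center exceeds the nearest obstacle distance at its angle lies
    behind the travelable-area boundary and is treated as occupied."""
    return 1.0 if d_tf > d_min_theta else 0.0
```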
And d25, determining the probability that each grid contained in the grid map contains the obstacle according to the mixed occupation probability and the initial occupation probability corresponding to each grid contained in the grid map.
The step can synthesize the two types of probabilities of mixed occupation probability and initial occupation probability to obtain the final occupation probability, and the final occupation probability is the probability that the grid contains the obstacles.
Alternatively, the formula P_final = 1 − (1 − P_mix)(1 − P_init) can be used to determine the probability that grid cell_{i,j} contains an obstacle.
It should be noted that the higher the probability value determined in this step, the higher the probability that the grid is occupied by an obstacle and the lower the probability that it is passable.
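Both combinations above follow the same noisy-OR pattern, 1 − Π(1 − p); a small sketch (the function name is an assumption):

```python
def fuse(*probs):
    """Noisy-OR combination shared by P_mix = 1-(1-P_high)(1-P_low)
    and P_final = 1-(1-P_mix)(1-P_init)."""
    miss = 1.0
    for p in probs:
        miss *= (1.0 - p)
    return 1.0 - miss

p_mix = fuse(0.875, 0.1)    # combine high- and low-point evidence
p_final = fuse(p_mix, 0.0)  # no initial-occupancy evidence at this grid
```

Any single piece of strong evidence dominates: if either input probability is 1, the fused probability is 1 regardless of the other.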
In this embodiment, the area inside the travelable area boundary is quantitatively evaluated through the grid map: the whole detection range is rasterized and the passable probability of each grid contained in the grid map is represented by a probability value, so that the calculated probability is more accurate.
In summary, the present application provides a multi-lidar-based method for detecting the drivable area for intelligent driving. Compared with camera-based target detection methods, it greatly increases the detection distance and has good environmental adaptability; compared with detection methods based on a laser radar plus a high-precision map, it does not depend on road information. In other words, it relies only on lidar perception information, needs no additional assistance such as a high-precision map or a structured road, and has high autonomous detection performance.
Further, the left and right radars supplement the detection blind areas on the two sides of the vehicle. In the technical scheme, both the nearest distance and the occupation probability map benefit from this supplement: if only the main laser radar is used, especially in scenes such as automatic parking, the main radar has blind areas, and the nearest distance calculation on the left and right sides and the occupation probability calculation may ignore obstacles in those blind areas. On this basis, the left and right radars reduce the perception blind areas; intuitively, their supplement provides more point cloud information and a higher point cloud density on the left and right sides of the vehicle.
From the calculation result of the drivable area, the calculation of the nearest distance and of the occupied grid map is therefore more accurate. The method adopts a polar-coordinate expression of the nearest distance, improving its measurement precision to the centimeter level; finally, rasterization of the single-frame point cloud is achieved, which can serve the path planning and control of intelligent driving.
The embodiment of the present application further provides an intelligent driving travelable area detection device, which is described below, and the intelligent driving travelable area detection device described below and the intelligent driving travelable area detection method described above may be referred to in correspondence with each other.
Referring to fig. 4, a schematic structural diagram of the intelligent driving feasible region detection apparatus provided by the embodiment of the present application is shown, and as shown in fig. 4, the intelligent driving feasible region detection apparatus may include: the system comprises a point cloud data acquisition module 401, a point cloud data fusion module 402, a non-ground point cloud set determination module 403, a closest distance determination module 404 and a travelable region determination module 405.
A point cloud data acquisition module 401, configured to acquire point cloud data of multiple lidar at the same time stamp, where the point cloud data is acquired by the lidar mounted on the vehicle, and the point cloud data is point cloud data transformed to a vehicle coordinate system; the point cloud data comprises a plurality of frames of point cloud data.
And the point cloud data fusion module 402 is configured to perform fusion processing on the multi-frame point cloud data to obtain a fusion point cloud set.
And a non-ground point cloud set determining module 403, configured to determine non-ground point cloud data from the fused point cloud set according to a z-coordinate value of the fused point cloud data included in the fused point cloud set, so as to obtain a non-ground point cloud set composed of the determined non-ground point cloud data.
The nearest distance determining module 404 is configured to map the non-ground point cloud sets to a target polar coordinate system established in advance, and determine nearest distances between the vehicle and the obstacles around the vehicle at all angles according to the non-ground point cloud sets in the target polar coordinate system, where the target polar coordinate system is a polar coordinate system established with an origin of a vehicle coordinate system as a pole.
The travelable region determining module 405 is configured to determine a travelable region of the vehicle according to respective closest distances between obstacles around the vehicle at all angles and the vehicle.
The device for detecting the intelligent driving travelable area comprises the steps of firstly obtaining point cloud data of a plurality of laser radars under the same timestamp, then carrying out fusion processing on multi-frame point cloud data to obtain a fusion point cloud set, then determining non-ground point cloud data from the fusion point cloud set according to a z-direction coordinate value of the fusion point cloud data contained in the fusion point cloud set to obtain a non-ground point cloud set formed by the determined non-ground point cloud data, then mapping the non-ground point cloud set to a pre-established target polar coordinate system to determine the nearest distances between obstacles around the vehicle under the full angle and the vehicle according to the non-ground point cloud set under the target polar coordinate system, and finally determining the travelable area of the vehicle according to the nearest distances between the obstacles around the vehicle under the full angle and the vehicle. According to the method, a plurality of laser radars are adopted to collect point cloud data, so that the detection distance is greatly increased, and the environmental adaptability is better; the method and the device can determine the nearest distance between each barrier and the vehicle at all angles around the vehicle, can accurately obtain the boundary of the drivable area at all angles around the vehicle based on the nearest distance, only rely on the point cloud data acquired by the laser radar in the whole process, do not need additional condition assistance such as a high-precision map and a structured road, and have high autonomous detection performance.
In a possible implementation manner, the point cloud data fusion module 402 may be specifically configured to splice multiple frames of point cloud data into one frame of point cloud data according to a preset laser radar sequence to obtain a fusion point cloud set, or calculate a horizontal angle and a vertical angle corresponding to the point cloud data for each point cloud data included in the multiple frames of point cloud data, and fill the point cloud data into a corresponding point cloud grid according to the horizontal angle and the vertical angle corresponding to the point cloud data; filling each point cloud data contained in the multi-frame point cloud data into the corresponding point cloud grids; for each point cloud grid, if the point cloud grid contains a plurality of point cloud data, taking the gravity centers of the point cloud data as fused point cloud data under the point cloud grid; so as to obtain a fused point cloud set consisting of fused point cloud data under all point cloud grids.
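The second fusion branch described above (angle-grid binning followed by taking the gravity center of each bin) can be sketched as follows; the 0.2° horizontal and 1° vertical resolutions are illustrative assumptions:

```python
import math
from collections import defaultdict

def fuse_point_clouds(frames, h_res_deg=0.2, v_res_deg=1.0):
    """Angle-grid fusion sketch: bin every point of every frame by its
    horizontal and vertical angles, then replace each bin's points by
    their centroid ("gravity center")."""
    bins = defaultdict(list)
    for frame in frames:
        for x, y, z in frame:
            h = math.degrees(math.atan2(y, x))                 # horizontal angle
            v = math.degrees(math.atan2(z, math.hypot(x, y)))  # vertical angle
            bins[(int(h // h_res_deg), int(v // v_res_deg))].append((x, y, z))
    fused = []
    for pts in bins.values():
        n = len(pts)
        fused.append(tuple(sum(c) / n for c in zip(*pts)))
    return fused

frames = [
    [(10.0, 0.0, 0.0)],    # return from the main lidar
    [(10.0, 0.001, 0.0)],  # near-duplicate return from a side lidar
]
fused = fuse_point_clouds(frames)  # the two returns merge into one centroid
```

Near-duplicate returns from overlapping lidars collapse into a single fused point, which keeps the fused set compact where fields of view overlap.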
In a possible implementation manner, the non-ground point cloud set determining module 403 may include: the system comprises an abnormal point cloud removing module, a ground point cloud set determining module to be updated, a fitting plane equation determining module, an updated ground point cloud set determining module and an updating circulation module.
And the abnormal point cloud eliminating module is used for eliminating the fused point cloud data where the abnormal z-direction coordinate values in the fused point cloud set are located, and taking the fused point cloud set with the abnormal data eliminated as a target fused point cloud set.
And the ground point cloud set to be updated determining module is used for determining ground point cloud data from the target fusion point cloud set according to the z-direction coordinate value of the fusion point cloud data contained in the target fusion point cloud set so as to obtain the ground point cloud set to be updated consisting of the determined ground point cloud data.
And the fitting plane equation determining module is used for determining a fitting plane equation according to the ground point cloud set to be updated.
And the updated ground point cloud set determining module is used for calculating the distance between each fused point cloud data in the target fused point cloud set and the fitting plane equation, determining the fused point cloud data of which the calculated distance is smaller than a preset first distance threshold value as the ground point cloud data, and taking the determined ground point cloud data as the updated ground point cloud set.
And the updating circulation module is used for adding 1 to the updating iteration frequency and judging whether the updating iteration frequency added with 1 reaches the preset total iteration frequency, if not, the updated ground point cloud set is used as the ground point cloud data to be updated, the fitting plane equation determining module is returned to execute, and other point cloud data except the updated ground point cloud set in the target fusion point cloud set are used as non-ground point cloud data until the updating iteration frequency reaches the total iteration frequency, wherein the initial updating iteration frequency is 0.
In a possible implementation manner, the ground point cloud set to be updated determining module may include: the device comprises a z-direction average value calculation module and a difference value comparison module.
The z-direction average value calculating module is used for selecting a preset number of fused point cloud data from the target fused point cloud set according to the z-direction coordinate value of the fused point cloud data contained in the target fused point cloud set and calculating the z-direction average value of the preset number of fused point cloud data.
And the difference comparison module is used for determining that the fused point cloud data is ground point cloud data if the difference between the z-direction coordinate value and the z-direction average value of the fused point cloud data is less than a preset difference threshold value aiming at each fused point cloud data in the target fused point cloud set.
In a possible implementation manner, the fitting plane equation determining module may include: the device comprises a three-way average value calculation module, a characteristic information solving module, a normal vector determination module and a fitting plane equation solving module.
The three-way average value calculation module is used for calculating the x-direction average value, the y-direction average value and the z-direction average value of the ground point cloud set to be updated.
And the characteristic information solving module is used for determining a covariance matrix according to the x-direction average value, the y-direction average value and the z-direction average value of the ground point cloud set to be updated and solving a plurality of characteristic values and characteristic vectors of the covariance matrix.
And the normal vector determining module is used for determining the minimum characteristic value from the plurality of characteristic values and taking the characteristic vector corresponding to the minimum characteristic value as the normal vector of the fitting plane.
And the fitting plane equation solving module is used for solving a fitting plane equation according to the ground point cloud set to be updated and the normal vector of the fitting plane.
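The module chain above amounts to a PCA plane fit; a sketch using NumPy (the helper name is an assumption):

```python
import numpy as np

def fit_plane(points):
    """PCA plane fit following the module chain: the fitting-plane normal
    is the eigenvector of the covariance matrix with the smallest
    eigenvalue, and the plane n.p + d = 0 passes through the centroid."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)             # x/y/z direction averages
    cov = np.cov((pts - centroid).T)        # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]                  # smallest-eigenvalue eigenvector
    d = -normal.dot(centroid)
    return normal, d
```

For points lying exactly on z = 0, the recovered normal is (0, 0, ±1) with d = 0, as expected for a ground plane.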
In one possible implementation, the target polar coordinate system includes a plurality of angle grids, and each angle grid includes a plurality of radial grids.
Based on this, the foregoing closest distance determining module 404 may include: an angle and distance solving module, a first point cloud mapping module and a distance mean value calculating module.
The angle and distance solving module is used for solving a horizontal angle and a two-dimensional projection distance corresponding to each non-ground point cloud data according to an x-direction coordinate value and a y-direction coordinate value of each non-ground point cloud data in the non-ground point cloud set.
And the first point cloud mapping module is used for mapping each non-ground point cloud data in the non-ground point cloud set to a target polar coordinate system according to the horizontal angle and the two-dimensional projection distance corresponding to each non-ground point cloud data in the non-ground point cloud set so as to obtain the non-ground point cloud set in the target polar coordinate system.
The distance mean value calculation module is used for determining a non-empty radial grid closest to a pole from the angle grids for each angle grid contained in the target polar coordinate system, and calculating the distance mean value between non-ground point cloud data contained in the non-empty radial grid and the pole as the closest distance between an obstacle and a vehicle at an angle corresponding to the angle grid; so as to obtain the nearest distance between the obstacles around the vehicle at all angles and the vehicle respectively.
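The three modules above can be sketched together as one polar-grid pass; the 1° angle grid and 0.1 m radial bin are illustrative assumptions:

```python
import math

def nearest_distances(non_ground_points, angle_res_deg=1.0, radial_res=0.1):
    """Polar-grid sketch: project each non-ground point to (angle, range),
    bin it into angle and radial grids, and for each angle grid return the
    mean range of the non-empty radial grid closest to the pole."""
    grids = {}
    for x, y, _z in non_ground_points:
        theta = math.degrees(math.atan2(y, x)) % 360.0  # horizontal angle
        r = math.hypot(x, y)                            # 2D projection distance
        a = int(theta / angle_res_deg)
        k = int(r / radial_res)
        grids.setdefault(a, {}).setdefault(k, []).append(r)
    nearest = {}
    for a, radial in grids.items():
        ranges = radial[min(radial)]  # non-empty radial grid nearest the pole
        nearest[a] = sum(ranges) / len(ranges)
    return nearest
```

Averaging within the nearest radial bin, rather than taking a single minimum return, damps range noise from individual lidar points.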
In one possible implementation, the travelable region determining module 405 may include: an occupation probability determination module and an occupation probability application module.
The occupation probability determining module is used for determining the probability that each grid in the preset grid map contains the obstacle according to the non-ground point cloud set and the nearest distance.
And the occupation probability application module is used for determining the drivable area of the vehicle according to the probability that each grid in the grid map contains the obstacle.
In a possible implementation manner, the occupancy probability determination module may include: the device comprises a second point cloud mapping module, a high-low point data determining module, a mixed occupation probability determining module, an initial occupation probability determining module and a final occupation probability determining module.
And the second point cloud mapping module is used for mapping each non-ground point cloud data into a corresponding grid contained in the grid map according to the x-direction coordinate value and the y-direction coordinate value of each non-ground point cloud data in the non-ground point cloud set and a preset grid resolution, wherein the grid resolution is used for dividing the grid map into a plurality of grids.
And the high-low point data determining module is used for determining non-ground point cloud data contained in the grid map as low point data or high point data according to a target fitting plane equation and a preset second distance threshold, wherein the target fitting plane equation is a fitting plane equation determined by the last iteration.
And the mixed occupation probability determining module is used for determining the mixed occupation probability corresponding to each grid according to the low-point data and the high-point data contained in each grid in the grid map, the low-point data contained in the adjacent grids, the preset grid occupation probability and the target fitting plane equation.
And the initial occupation probability determining module is used for calculating the distance between each grid contained in the grid map and the center of the rear axle of the vehicle based on the coordinate transformation relation between the center of the rear axle of the vehicle and the origin of the grid map, and determining the initial occupation probability corresponding to each grid according to the calculated distance and the closest distance between the obstacle and the vehicle at the corresponding angle of each grid.
And the final occupation probability determining module is used for determining the probability that each grid contained in the grid map contains the obstacles according to the mixed occupation probability and the initial occupation probability corresponding to each grid contained in the grid map.
In a possible implementation manner, the mixed occupation probability determining module may include: the device comprises a high point occupation probability determination module, a low point occupation probability determination module and a mixed occupation probability calculation module.
The high point occupation probability determining module is used for calculating the number of high point data contained in each grid in the grid map and determining the high point occupation probability corresponding to each grid in the grid map according to the number and the preset grid occupation probability.
And the low-point occupation probability determining module is used for determining the low-point occupation probability corresponding to each grid in the grid map according to the low-point data contained in each grid in the grid map, the low-point data contained in the adjacent grids and the target fitting plane equation.
And the mixed occupation probability calculation module is used for determining the mixed occupation probability corresponding to each grid according to the high point occupation probability and the low point occupation probability corresponding to each grid in the grid map.
The embodiment of the application also provides an intelligent driving travelable area detection device. Alternatively, fig. 5 shows a block diagram of the hardware configuration of the intelligent driving travelable area detection device. Referring to fig. 5, the hardware configuration may include: at least one processor 501, at least one communication interface 502, at least one memory 503, and at least one communication bus 504;
in the embodiment of the present application, the number of the processor 501, the communication interface 502, the memory 503 and the communication bus 504 is at least one, and the processor 501, the communication interface 502 and the memory 503 complete the communication with each other through the communication bus 504;
the processor 501 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention;
the memory 503 may include a high-speed RAM memory, and may further include a non-volatile memory, such as at least one disk memory;
wherein the memory 503 stores a program and the processor 501 may call the program stored in the memory 503 for:
acquiring point cloud data of a plurality of laser radars under the same timestamp, wherein the point cloud data are acquired by the laser radars installed on a vehicle, and the point cloud data are all point cloud data transformed to a vehicle coordinate system; the point cloud data comprises multi-frame point cloud data;
performing fusion processing on multi-frame point cloud data to obtain a fusion point cloud set;
determining non-ground point cloud data from the fused point cloud set according to a z-direction coordinate value of fused point cloud data contained in the fused point cloud set so as to obtain a non-ground point cloud set consisting of the determined non-ground point cloud data;
mapping the non-ground point cloud sets to a pre-established target polar coordinate system, and determining the closest distances between the obstacles around the vehicle under the full angle and the vehicle respectively according to the non-ground point cloud sets under the target polar coordinate system, wherein the target polar coordinate system is a polar coordinate system established by taking the origin of a vehicle coordinate system as a pole;
and determining the travelable area of the vehicle according to the closest distances between the obstacles around the vehicle at all angles and the vehicle.
Alternatively, the detailed function and the extended function of the program may refer to the above description.
The embodiment of the application also provides a readable storage medium, wherein a computer program is stored on the readable storage medium, and when the computer program is executed by a processor, the intelligent driving travelable area detection method is realized.
Alternatively, the detailed function and the extended function of the program may be as described above.
Finally, it is further noted that, herein, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a(n) …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An intelligent driving travelable area detection method, comprising:
acquiring point cloud data of a plurality of laser radars under the same timestamp, wherein the point cloud data are acquired by the laser radars installed on a vehicle, and the point cloud data are all point cloud data transformed to a vehicle coordinate system; the point cloud data comprises multi-frame point cloud data;
performing fusion processing on the multi-frame point cloud data to obtain a fusion point cloud set;
determining non-ground point cloud data from the fused point cloud set according to a z-direction coordinate value of fused point cloud data contained in the fused point cloud set so as to obtain a non-ground point cloud set consisting of the determined non-ground point cloud data;
mapping the non-ground point cloud sets to a pre-established target polar coordinate system, and determining the nearest distances between the obstacles around the vehicle under the full angle and the vehicle respectively according to the non-ground point cloud sets under the target polar coordinate system, wherein the target polar coordinate system is a polar coordinate system established by taking the origin of the vehicle coordinate system as a pole;
and determining the travelable area of the vehicle according to the closest distance between the obstacles around the vehicle at all angles and the vehicle.
2. The method for detecting the intelligent driving drivable area as claimed in claim 1, wherein the step of performing fusion processing on the multi-frame point cloud data to obtain a fused point cloud set comprises the following steps:
splicing the multi-frame point cloud data into one frame of point cloud data according to a preset laser radar sequence to obtain a fusion point cloud set;
or the like, or a combination thereof,
aiming at each point cloud data contained in the multi-frame point cloud data, calculating a horizontal angle and a vertical angle corresponding to the point cloud data, and filling the point cloud data into a corresponding point cloud grid according to the horizontal angle and the vertical angle corresponding to the point cloud data; filling each point cloud data contained in the multi-frame point cloud data into the corresponding point cloud grids;
for each point cloud grid, if a plurality of point cloud data are contained in the point cloud grid, taking the gravity centers of the plurality of point cloud data as fused point cloud data under the point cloud grid; so as to obtain a fusion point cloud set consisting of fusion point cloud data under all point cloud grids.
3. The intelligent driving travelable area detection method according to claim 1, wherein the determining of non-ground point cloud data from the fused point cloud set according to the z-direction coordinate values of the fused point cloud data contained in the fused point cloud set comprises:
removing, from the fused point cloud set, the fused point cloud data whose z-direction coordinate values are abnormal, and taking the fused point cloud set with the abnormal data removed as a target fused point cloud set;
determining ground point cloud data from the target fused point cloud set according to the z-direction coordinate values of the fused point cloud data contained in the target fused point cloud set, so as to obtain a ground point cloud set to be updated, consisting of the determined ground point cloud data;
determining a fitting plane equation according to the ground point cloud set to be updated;
calculating the distance between each fused point cloud data in the target fused point cloud set and the plane given by the fitting plane equation, determining the fused point cloud data whose calculated distance is smaller than a preset first distance threshold as ground point cloud data, and taking the determined ground point cloud data as an updated ground point cloud set;
and incrementing the update iteration count by 1 and determining whether the incremented count has reached a preset total iteration count; if not, taking the updated ground point cloud set as the ground point cloud set to be updated and returning to the step of determining a fitting plane equation according to the ground point cloud set to be updated; and when the update iteration count reaches the total iteration count, taking the point cloud data in the target fused point cloud set other than the updated ground point cloud set as non-ground point cloud data, wherein the initial update iteration count is 0.
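The iterative refinement of claims 3 and 5 can be sketched together as follows; the function name, the distance threshold, and the iteration count are illustrative assumptions, and NumPy is used for the covariance and eigenvector computation:

```python
import numpy as np

def segment_ground(points, seed_mask, dist_thresh=0.2, iters=3):
    """Iteratively refine the ground set: fit a plane to the current
    ground points (the eigenvector of the covariance matrix with the
    smallest eigenvalue is the plane normal), then re-label every point
    whose distance to that plane is below dist_thresh as ground."""
    pts = np.asarray(points, dtype=float)
    ground = pts[np.asarray(seed_mask)]        # initial ground set to be updated
    for _ in range(iters):
        centroid = ground.mean(axis=0)
        cov = np.cov((ground - centroid).T)    # 3x3 covariance matrix
        eigvals, eigvecs = np.linalg.eigh(cov) # eigenvalues in ascending order
        normal = eigvecs[:, 0]                 # normal vector of the fitting plane
        d = -normal @ centroid                 # plane equation: normal . p + d = 0
        dist = np.abs(pts @ normal + d)        # point-to-plane distances
        ground = pts[dist < dist_thresh]       # updated ground point cloud set
    non_ground = pts[np.abs(pts @ normal + d) >= dist_thresh]
    return ground, non_ground
```

Each pass re-estimates the plane from the previous pass's inliers, so a few iterations suffice once the seed set is roughly on the ground.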
4. The intelligent driving travelable area detection method according to claim 1, wherein the determining of ground point cloud data from the target fused point cloud set according to the z-direction coordinate values of the fused point cloud data contained in the target fused point cloud set comprises:
selecting a preset number of fused point cloud data from the target fused point cloud set according to the z-direction coordinate values of the fused point cloud data contained in the target fused point cloud set, and calculating the average z-direction value of the selected fused point cloud data;
and for each fused point cloud data in the target fused point cloud set, determining that the fused point cloud data is ground point cloud data if the difference between its z-direction coordinate value and the average z-direction value is smaller than a preset difference threshold.
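The claim does not state which points the preset number is drawn from; a common reading is the points with the lowest z values. A minimal sketch under that assumption (the function name and default parameters are illustrative):

```python
def select_ground_seeds(points, num_seeds=100, diff_thresh=0.3):
    """Average the z values of a preset number of candidate points
    (here assumed to be the lowest ones), then keep every point whose
    z-direction coordinate lies within diff_thresh of that average."""
    lowest = sorted(points, key=lambda p: p[2])[:num_seeds]   # lowest-z candidates
    z_avg = sum(p[2] for p in lowest) / len(lowest)           # average z value
    return [p for p in points if abs(p[2] - z_avg) < diff_thresh]
```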
5. The intelligent driving travelable area detection method according to claim 3, wherein the determining of a fitting plane equation according to the ground point cloud set to be updated comprises:
calculating the x-direction, y-direction and z-direction average values of the ground point cloud set to be updated;
determining a covariance matrix according to the x-direction, y-direction and z-direction average values of the ground point cloud set to be updated, and solving for the eigenvalues and eigenvectors of the covariance matrix;
determining the minimum eigenvalue among the eigenvalues, and taking the eigenvector corresponding to the minimum eigenvalue as the normal vector of the fitting plane;
and solving for the fitting plane equation according to the ground point cloud set to be updated and the normal vector of the fitting plane.
6. The intelligent driving travelable area detection method according to claim 3, wherein the target polar coordinate system comprises a plurality of angular grids, each angular grid comprising a plurality of radial grids;
and the mapping of the non-ground point cloud set to a pre-established target polar coordinate system, so as to determine the closest distance between the vehicle and each obstacle around the vehicle over the full angular range according to the non-ground point cloud set in the target polar coordinate system, comprises:
solving for the horizontal angle and the two-dimensional projection distance corresponding to each non-ground point cloud data according to the x-direction and y-direction coordinate values of each non-ground point cloud data in the non-ground point cloud set;
mapping each non-ground point cloud data into the target polar coordinate system according to its horizontal angle and two-dimensional projection distance, so as to obtain the non-ground point cloud set in the target polar coordinate system;
and for each angular grid contained in the target polar coordinate system, determining the non-empty radial grid closest to the pole within that angular grid, and calculating the average distance between the non-ground point cloud data contained in that radial grid and the pole as the closest distance between an obstacle and the vehicle at the angle corresponding to that angular grid, so as to obtain the closest distance between the vehicle and each obstacle around the vehicle over the full angular range.
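The polar-grid steps above can be sketched as follows; the angular and radial resolutions and the function name are illustrative assumptions:

```python
import math
from collections import defaultdict

ANGLE_RES = 1.0    # assumed angular grid width in degrees
RADIAL_RES = 0.5   # assumed radial grid depth in meters

def closest_obstacle_per_angle(non_ground_points):
    """Map each non-ground point to an (angle bin, radial bin) cell of a
    polar grid whose pole is the vehicle coordinate origin, then for each
    angle bin take the mean range of the nearest non-empty radial bin."""
    cells = defaultdict(list)
    for x, y, _z in non_ground_points:
        h_angle = math.degrees(math.atan2(y, x)) % 360.0  # horizontal angle
        rng = math.hypot(x, y)                            # 2-D projection distance
        a_bin = int(h_angle // ANGLE_RES)
        r_bin = int(rng // RADIAL_RES)
        cells[(a_bin, r_bin)].append(rng)
    closest = {}
    for (a_bin, r_bin), ranges in cells.items():
        # keep only the radial bin nearest the pole for each angle bin
        if a_bin not in closest or r_bin < closest[a_bin][0]:
            closest[a_bin] = (r_bin, sum(ranges) / len(ranges))
    return {a: mean_rng for a, (_r, mean_rng) in closest.items()}
```

Averaging within the nearest non-empty radial bin, rather than taking the single nearest point, damps the effect of isolated noise returns.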
7. The intelligent driving travelable area detection method according to claim 6, wherein the determining of the travelable area of the vehicle according to the closest distances between the vehicle and the obstacles around the vehicle at all angles comprises:
determining the probability that each grid in a preset grid map contains an obstacle according to the non-ground point cloud set and the closest distances;
and determining the travelable area of the vehicle according to the probability that each grid in the grid map contains an obstacle.
8. The method according to claim 7, wherein the determining of the probability that each grid in a preset grid map contains an obstacle according to the non-ground point cloud set and the closest distances comprises:
mapping each non-ground point cloud data to the corresponding grid of the grid map according to its x-direction and y-direction coordinate values and a preset grid resolution, wherein the grid resolution is used for dividing the grid map into a plurality of grids;
determining whether each non-ground point cloud data contained in the grid map is low-point data or high-point data according to a target fitting plane equation and a preset second distance threshold, wherein the target fitting plane equation is the fitting plane equation determined in the last iteration;
determining the mixed occupation probability corresponding to each grid according to the low-point data and high-point data contained in that grid of the grid map, the low-point data contained in its adjacent grids, a preset grid occupation probability, and the target fitting plane equation;
calculating the distance between each grid contained in the grid map and the rear-axle center of the vehicle based on the coordinate transformation relation between the rear-axle center of the vehicle and the origin of the grid map, and determining the initial occupation probability corresponding to each grid according to the calculated distance and the closest distance between the vehicle and the obstacle at the angle corresponding to that grid;
and determining the probability that each grid contained in the grid map contains an obstacle according to the mixed occupation probability and the initial occupation probability corresponding to that grid.
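The claims do not disclose the formulas for the occupation probabilities themselves, so only the two geometric steps of claim 8 are sketched here: mapping a point to a grid cell by resolution, and computing a cell's distance to the rear-axle center. `GRID_RES`, `ORIGIN`, and the function names are illustrative assumptions:

```python
import math

GRID_RES = 0.25          # assumed grid resolution in meters per cell
ORIGIN = (-20.0, -20.0)  # assumed world coordinates of grid cell (0, 0)

def point_to_cell(x, y):
    """Map a point's x/y coordinates to a grid cell index using the
    preset grid resolution."""
    return (int((x - ORIGIN[0]) // GRID_RES), int((y - ORIGIN[1]) // GRID_RES))

def cell_to_rear_axle_distance(cell, rear_axle_xy):
    """Distance from a cell's center to the vehicle rear-axle center,
    using the transform between the grid origin and the rear axle."""
    cx = ORIGIN[0] + (cell[0] + 0.5) * GRID_RES   # cell center x
    cy = ORIGIN[1] + (cell[1] + 0.5) * GRID_RES   # cell center y
    return math.hypot(cx - rear_axle_xy[0], cy - rear_axle_xy[1])
```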
9. The intelligent driving travelable area detection method according to claim 8, wherein the determining of the mixed occupation probability corresponding to each grid according to the low-point data and high-point data contained in each grid of the grid map, the low-point data contained in the adjacent grids, the preset grid occupation probability and the target fitting plane equation comprises:
calculating the number of high-point data contained in each grid of the grid map, and determining the high-point occupation probability corresponding to each grid according to that number and the preset grid occupation probability;
determining the low-point occupation probability corresponding to each grid of the grid map according to the low-point data contained in that grid, the low-point data contained in its adjacent grids, and the target fitting plane equation;
and determining the mixed occupation probability corresponding to each grid according to the high-point occupation probability and the low-point occupation probability corresponding to that grid.
10. An intelligent driving travelable area detection device, comprising:
a point cloud data acquisition module, configured to acquire point cloud data under the same timestamp from a plurality of laser radars, wherein the point cloud data are collected by the laser radars installed on a vehicle and have all been transformed into a vehicle coordinate system, the point cloud data comprising multi-frame point cloud data;
a point cloud data fusion module, configured to fuse the multi-frame point cloud data to obtain a fused point cloud set;
a non-ground point cloud set determining module, configured to determine non-ground point cloud data from the fused point cloud set according to the z-direction coordinate values of the fused point cloud data contained in the fused point cloud set, so as to obtain a non-ground point cloud set consisting of the determined non-ground point cloud data;
a closest distance determining module, configured to map the non-ground point cloud set to a pre-established target polar coordinate system, so as to determine the closest distance between the vehicle and each obstacle around the vehicle over the full angular range according to the non-ground point cloud set in the target polar coordinate system, wherein the target polar coordinate system is a polar coordinate system established with the origin of the vehicle coordinate system as its pole;
and a travelable area determining module, configured to determine the travelable area of the vehicle according to the closest distances between the vehicle and the obstacles around it at all angles.
CN202210539241.8A 2022-05-18 2022-05-18 Intelligent driving travelable area detection method and device Pending CN114994635A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210539241.8A CN114994635A (en) 2022-05-18 2022-05-18 Intelligent driving travelable area detection method and device


Publications (1)

Publication Number Publication Date
CN114994635A true CN114994635A (en) 2022-09-02

Family

ID=83027527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210539241.8A Pending CN114994635A (en) 2022-05-18 2022-05-18 Intelligent driving travelable area detection method and device

Country Status (1)

Country Link
CN (1) CN114994635A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116197910A (en) * 2023-03-16 2023-06-02 江苏集萃清联智控科技有限公司 Environment sensing method and device for wind power blade wheel type mobile polishing robot
CN116197910B (en) * 2023-03-16 2024-01-23 江苏集萃清联智控科技有限公司 Environment sensing method and device for wind power blade wheel type mobile polishing robot
CN116533998A (en) * 2023-07-04 2023-08-04 深圳海星智驾科技有限公司 Automatic driving method, device, equipment, storage medium and vehicle of vehicle
CN116533998B (en) * 2023-07-04 2023-09-29 深圳海星智驾科技有限公司 Automatic driving method, device, equipment, storage medium and vehicle of vehicle
CN117671648A (en) * 2024-02-02 2024-03-08 深圳市其域创新科技有限公司 Obstacle point detection method, obstacle point detection device and storage medium
CN117671648B (en) * 2024-02-02 2024-04-26 深圳市其域创新科技有限公司 Obstacle point detection method, obstacle point detection device and storage medium

Similar Documents

Publication Publication Date Title
CN110988912B (en) Road target and distance detection method, system and device for automatic driving vehicle
CN114994635A (en) Intelligent driving travelable area detection method and device
CN108802785B (en) Vehicle self-positioning method based on high-precision vector map and monocular vision sensor
CN109074490B (en) Path detection method, related device and computer readable storage medium
US9361696B2 (en) Method of determining a ground plane on the basis of a depth image
CN110807806B (en) Obstacle detection method and device, storage medium and terminal equipment
CN110674705A (en) Small-sized obstacle detection method and device based on multi-line laser radar
CN115032651A (en) Target detection method based on fusion of laser radar and machine vision
KR101030317B1 (en) Apparatus for tracking obstacle using stereo vision and method thereof
CN113989758A (en) Anchor guide 3D target detection method and device for automatic driving
CN114882701B (en) Parking space detection method and device, electronic equipment and machine readable storage medium
CN110989619B (en) Method, apparatus, device and storage medium for locating objects
CN112581543A (en) Visual navigation method for moving robot
CN114170188A (en) Target counting method and system for overlook image and storage medium
CN112699748A (en) Human-vehicle distance estimation method based on YOLO and RGB image
Gao et al. Design and implementation of autonomous mapping system for ugv based on lidar
US20220404506A1 (en) Online validation of lidar-to-lidar alignment and lidar-to-vehicle alignment
CN111462321B (en) Point cloud map processing method, processing device, electronic device and vehicle
CN116385997A (en) Vehicle-mounted obstacle accurate sensing method, system and storage medium
CN117170354A (en) Wheel type robot positioning method and system
CN116863325A (en) Method for multiple target detection and related product
CN114359891A (en) Three-dimensional vehicle detection method, system, device and medium
CN114280583A (en) Laser radar positioning precision verification method and system under condition of no GPS signal
CN114140659A (en) Social distance monitoring method based on human body detection under view angle of unmanned aerial vehicle
CN112508970A (en) Point cloud data segmentation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination