CN109238221B - Method and device for detecting surrounding environment of vehicle - Google Patents

Method and device for detecting surrounding environment of vehicle

Info

Publication number
CN109238221B
Authority
CN
China
Prior art keywords
grid
processed
vehicle
points
marking
Prior art date
Legal status
Active
Application number
CN201710556525.7A
Other languages
Chinese (zh)
Other versions
CN109238221A (en)
Inventor
李星河
孙银建
张显宏
徐向敏
刘奋
Current Assignee
SAIC Motor Corp Ltd
Original Assignee
SAIC Motor Corp Ltd
Priority date
Filing date
Publication date
Application filed by SAIC Motor Corp Ltd filed Critical SAIC Motor Corp Ltd
Priority to CN201710556525.7A priority Critical patent/CN109238221B/en
Publication of CN109238221A publication Critical patent/CN109238221A/en
Application granted granted Critical
Publication of CN109238221B publication Critical patent/CN109238221B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention discloses a method for detecting the surrounding environment of a vehicle, which comprises the following steps: obtaining a grid map of the surrounding environment of the vehicle according to the point cloud data returned by the sensor; judging whether the height difference of the recorded points in the grid to be processed is smaller than a first judgment threshold; when the height difference is smaller than the first judgment threshold and the distance between the grid to be processed and the vehicle is smaller than a preset distance threshold, further judging whether the lowest height of the recorded points in the grid is smaller than a second judgment threshold; and when the lowest height is smaller than the second judgment threshold, marking the grid to be processed as a ground grid. The first and second judgment thresholds are proportional to the distance between the grid to be processed and the vehicle. The embodiment takes the height of the point cloud data as the basis, combines it with the distance between the point cloud and the vehicle, and accurately screens out the point cloud data representing the ground using the two judgment thresholds, thereby providing an accurate decision basis for automatic driving control.

Description

Method and device for detecting surrounding environment of vehicle
Technical Field
The invention relates to the technical field of computers, and in particular to a method and a device for detecting the surrounding environment of a vehicle.
Background
An Advanced Driver Assistance System (ADAS) is an active safety technology that helps the driver detect possible dangers as early as possible, drawing attention to them and improving safety. It is one of the basic functional requirements for automatic driving: various sensors mounted on the vehicle collect environmental data inside and outside the vehicle, which is then processed for identification, detection, and tracking of static and dynamic objects. ADAS typically includes functions such as adaptive cruise, lane keeping, and automatic braking.
When identifying the surroundings of the vehicle, an ADAS driving assistance function may simply assume that, as long as the vehicle is in a structured road area, lane lines are present and the driving area is flat, so that safe driving of the vehicle can be controlled directly. However, when the vehicle is located in an area lacking auxiliary facilities such as lane lines and road markings, or even in an unpaved road area, safe driving must rely on recognizing the ground from the sensor data.
Disclosure of Invention
In view of the above, the present invention provides a method and an apparatus for detecting a surrounding environment of a vehicle, which can identify a ground surface according to data returned by a sensor.
The embodiment of the invention provides a method for detecting the surrounding environment of a vehicle, which comprises the following steps:
obtaining a grid map of the surrounding environment of the vehicle according to the point cloud data returned by the sensor;
judging whether the height difference of recording points in a grid to be processed is smaller than a first judgment threshold value or not, wherein the grid map comprises the grid to be processed;
when the height difference of the recording points in the grid to be processed is smaller than the first judgment threshold, if the distance between the grid to be processed and the vehicle is smaller than a preset distance threshold, continuously judging whether the lowest height of the recording points in the grid to be processed is smaller than a second judgment threshold;
when the lowest height of the recording points in the grid to be processed is smaller than the second judgment threshold, marking the grid to be processed as a ground grid;
wherein the first and second determination thresholds are proportional to a distance between the grid to be processed and the vehicle.
Optionally, the method further includes:
when the height difference of the recording points in the grid to be processed is greater than or equal to the first judgment threshold, continuously judging whether the lowest height of the recording points in the grid to be processed is less than the second judgment threshold;
when the lowest height of the recording points in the grid to be processed is smaller than the second judgment threshold, marking the grid to be processed as a ground obstacle grid;
when the lowest height of the recording points in the grid to be processed is larger than or equal to the second judgment threshold, marking the grid to be processed as an obstacle grid.
Optionally, the method further includes:
judging whether the difference between the height of the recording point to be processed in the ground obstacle grid and the lowest height of the recording point in the grid is smaller than a third judgment threshold value or not;
if yes, marking the recording points to be processed as ground obstacle reflection points;
and if not, marking the recording points to be processed as barrier reflection points.
Optionally, the method further includes:
if the distance between the grid to be processed and the vehicle is larger than or equal to the preset distance threshold, marking the grid to be processed as a ground grid.
Optionally, the method further includes:
marking the recorded points in the ground grid as ground reflection points;
marking the recorded points within the barrier grid as barrier reflection points.
The embodiment of the invention provides a detection device for the surrounding environment of a vehicle, which comprises: the map acquisition module, the first judgment module, the second judgment module and the grid marking module;
the map acquisition module is used for acquiring a grid map of the surrounding environment of the vehicle according to the point cloud data returned by the sensor;
the first judging module is used for judging whether the height difference of the recording points in the grid to be processed is smaller than a first judging threshold value or not, and the grid map comprises the grid to be processed;
the second judging module is configured to, when the first judging module judges that the height difference between the recording points in the grid to be processed is smaller than the first judging threshold, if the distance between the grid to be processed and the vehicle is smaller than a preset distance threshold, continue to judge whether the lowest height of the recording points in the grid to be processed is smaller than a second judging threshold;
the grid marking module is configured to mark the grid to be processed as a ground grid when the second determination module determines that the minimum height of the recording point in the grid to be processed is smaller than the second determination threshold;
wherein the first and second determination thresholds are proportional to a distance between the grid to be processed and the vehicle.
Optionally,
the second judging module is further configured to, when the first judging module judges that the height difference between the recording points in the grid to be processed is greater than or equal to the first judging threshold, continue to judge whether the lowest height of the recording points in the grid to be processed is less than the second judging threshold;
the grid marking module is further configured to mark the grid to be processed as a ground obstacle grid when the second judging module judges that the lowest height of the recording points in the grid to be processed is smaller than the second judgment threshold, and to mark the grid to be processed as an obstacle grid when the second judging module judges that the lowest height of the recording points in the grid to be processed is greater than or equal to the second judgment threshold.
Optionally, the apparatus further includes: a third judging module and a point marking module;
the third judging module is used for judging whether the difference between the height of the recording point to be processed in the ground obstacle grid and the lowest height of the recording point in the grid is smaller than a third judging threshold value;
the point marking module is used for marking the recording points to be processed as ground obstacle reflection points when the judgment result of the third judgment module is yes; and the recording point to be processed is marked as an obstacle reflection point when the judgment result of the third judgment module is negative.
Optionally, the apparatus further includes:
the grid marking module is further configured to mark the grid to be processed as a ground grid if the distance between the grid to be processed and the vehicle is greater than or equal to the preset distance threshold.
Optionally, the apparatus further includes:
the point marking module is also used for marking the recording points in the ground grid as ground reflection points; and for marking the recorded points within the barrier grid as barrier reflection points.
Compared with the prior art, the invention has at least the following advantages:
in the embodiment of the invention, firstly, a grid map of the surrounding environment of the vehicle with the ground projection point of the center of the rear axle of the vehicle as the origin is drawn according to the point cloud data which is returned by the sensor and takes the center of the sensor as the origin. Then, according to the height of the recorded point in each grid, and a first judgment threshold and a second judgment threshold which are in direct proportion to the distance between the grid and the vehicle, judging whether the point cloud data recorded in the grid is the ground around the vehicle, identifying a ground area around the vehicle which can pass safely, and providing a basis for decision judgment of the driving assistance function of the ADAS. According to the embodiment of the invention, the height of the point cloud data is taken as a basis, the distance between the point cloud data and the vehicle is integrated, and the point cloud data representing the ground is accurately screened out by utilizing two judgment thresholds, so that an accurate decision basis is provided for automatic driving control of the vehicle.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments described in the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of a method for detecting a vehicle surroundings according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the working principle of the three-dimensional laser range finder in the embodiment of the present invention;
fig. 3 is a schematic flow chart of drawing a grid map according to an embodiment of the present invention;
FIG. 4 is a top view of a grid map provided by an embodiment of the present invention;
FIG. 5 is a schematic flow chart illustrating another method for detecting the environment around a vehicle according to an embodiment of the present invention;
fig. 6 is a schematic flowchart of a method for detecting a vehicle surroundings according to another embodiment of the present invention;
fig. 7 is a schematic structural diagram of a detection device for detecting an environment around a vehicle according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of another vehicle surroundings detection apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Method embodiment:
referring to fig. 1, the figure is a schematic flow chart of a method for detecting a vehicle surroundings according to an embodiment of the present invention.
The method for detecting the environment around the vehicle provided by the embodiment comprises the steps S101-S104.
S101: and obtaining a grid map of the surrounding environment of the vehicle according to the point cloud data returned by the sensor.
In existing automatic driving systems, the vehicle-mounted three-dimensional rangefinder is widely used as the environment perception input for automatic driving. Therefore, in the embodiment of the present invention, the sensor may be any existing environment perception sensor, such as a three-dimensional laser rangefinder; the possibilities are not enumerated here.
Taking a three-dimensional laser rangefinder as an example, as shown in fig. 2, the rangefinder is installed on the upper part of the vehicle (such as the roof). Multiple groups of laser emitting and receiving devices are arranged in the vertical direction at fixed angular intervals, and each group measures the distance to an obstacle along its laser path using the time-of-flight ranging principle; the dotted lines in fig. 2 illustrate the laser irradiation paths. The laser emitter/receiver assembly is fixed to a structure that a motor rotates about the vertical axis. As the motor rotates, the assembly rotates and measures together, completing a three-dimensional scan of the surroundings of the vehicle and returning the result as a three-dimensional point cloud (i.e., the point cloud data). In practical application, the rangefinder may have 64 laser emitter/receiver groups, a rotation frequency of 10 Hz, an effective detection radius of 60 meters, and a point cloud scale of about 130000 points/frame (pts/frame).
It can be understood that the point cloud data returned by the sensor is recorded in the sensor's own coordinate system. If the ground around the vehicle were identified directly from point cloud data in the sensor coordinate system, the accuracy and precision of the identification would be affected by the installation position and installation angle of the sensor. Therefore, it is necessary to convert the point cloud data from the sensor coordinate system into the vehicle body coordinate system of the vehicle itself, and to divide the data in the vehicle body coordinate system into a number of grids, yielding a grid map around the vehicle for data processing.
With continued reference to FIG. 2, the sensor coordinate system of the three-dimensional laser rangefinder is o1x1y1z1, with origin o1 at the optical center of the laser rangefinder; the x1, y1, and z1 axes follow the x, y, and z directions of the rangefinder base. In the present embodiment, a vehicle body coordinate system o2x2y2z2 is defined, in which the origin o2 is the projection point on the ground of the center of the rear axle of the vehicle, and the x2, y2, and z2 axes point respectively to the front of the vehicle, to its left side, and vertically up from the ground. In practical operation, a rigid-body rotation-translation matrix can be used to convert the point cloud data from the sensor coordinate system o1x1y1z1 to the vehicle body coordinate system o2x2y2z2.
In practical operation, the process of drawing the grid map may be as shown in fig. 3, and includes the following steps:
S301: in the x2y2 plane of the vehicle body coordinate system o2x2y2z2, initialize the grid map required for output.
Referring to fig. 4, the grid map is a circular area with radius R that divides the space around the vehicle into grid elements. The grid division rule may be: first, divide the space into rings of equal width according to the radial resolution Δr, and index the rings from the smallest radius outward as the first-dimension index value of the grid; then, starting from the positive x2 direction and proceeding counterclockwise, draw a number of rays from the origin o2 (such as the ray o2p in the figure) that divide each ring into small sectors according to the angular resolution Δθ, and index the sectors counterclockwise starting from the positive x2 axis as the second-dimension index of the grid. The shaded sector in fig. 4 is one grid of the grid map. The grid is the minimum unit of the grid map; the two-dimensional index values correspond one-to-one to the grids, and any specified grid can be accessed through its unique two-dimensional index value. In practical applications, if the map radius R is 60 meters, the radial resolution Δr may be set to 0.2 meters and the angular resolution Δθ to 0.5°.
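As an illustration, the (ring, sector) indexing described above can be sketched as follows. This is a minimal sketch in Python; the function name, the use of integer truncation, and the constants (the example values R = 60 m, Δr = 0.2 m, Δθ = 0.5°) are illustrative assumptions, not taken from the patent.

```python
import math

RADIUS = 60.0       # map radius R in meters (example value from the text)
DELTA_R = 0.2       # radial resolution Δr in meters (example value)
DELTA_THETA = 0.5   # angular resolution Δθ in degrees (example value)

def grid_index(x2, y2):
    """Map a point (x2, y2) in the vehicle body frame to its
    (ring, sector) two-dimensional grid index; None if outside the map."""
    r = math.hypot(x2, y2)
    if r >= RADIUS:
        return None
    ring = int(r / DELTA_R)                    # first-dimension index: ring number
    theta = math.degrees(math.atan2(y2, x2))   # angle from the positive x2 axis
    if theta < 0.0:
        theta += 360.0                         # counterclockwise, in [0, 360)
    sector = int(theta / DELTA_THETA)          # second-dimension index: sector number
    return ring, sector
```

For example, a point about one meter ahead of the rear-axle ground projection falls into ring 5, and the sector number grows counterclockwise from the vehicle's heading direction.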
S302: and obtaining a rigid body rotation translation matrix from the distance meter coordinate system to the vehicle body coordinate system according to the external parameters of the sensor.
It can be understood that the difference between the sensor coordinate system and the vehicle body coordinate system is determined by the installation position and installation attitude angle of the sensor in the vehicle body coordinate system, both of which can be obtained by direct measurement or by parameter calibration; the details are not repeated here. From the installation position and installation attitude angle, the rigid-body rotation-translation matrix between the sensor coordinate system and the vehicle body coordinate system can be obtained in the form

M = | R  T |
    | 0  1 |

where R is a 3 × 3 rotation matrix and T is a 3 × 1 translation matrix.
Suppose the external parameters of the sensor are (xL, yL, zL, yawL, pitchL, rollL), where xL, yL, and zL represent the three-dimensional mounting position of the sensor in the vehicle body coordinate system, and yawL, pitchL, and rollL represent the yaw, pitch, and roll angles of the sensor.
Then, the rotation matrix

R = | R00  R01  R02 |
    | R10  R11  R12 |
    | R20  R21  R22 |

has the entries:
R00=cos(yawL)*cos(pitchL),
R01=cos(yawL)*sin(pitchL)*sin(rollL)-sin(yawL)*cos(rollL),
R02=sin(yawL)*sin(rollL)+cos(yawL)*sin(pitchL)*cos(rollL);
R10=sin(yawL)*cos(pitchL),
R11=cos(yawL)*cos(rollL)+sin(yawL)*sin(pitchL)*sin(rollL),
R12=sin(yawL)*sin(pitchL)*cos(rollL)-cos(yawL)*sin(rollL);
R20=-sin(pitchL),
R21=cos(pitchL)*sin(rollL),
R22=cos(pitchL)*cos(rollL)。
The translation matrix is

T = (xL, yL, zL)^T,

i.e., the mounting position of the sensor in the vehicle body coordinate system.
S303: inputting three-dimensional point cloud data obtained by measuring by a three-dimensional laser range finder, traversing the unordered input point cloud, and performing steps S304-S306 in each access.
S304: it is determined for each input point whether it is a noise point.
In this embodiment, a point is determined to be noise according to the following criteria: it falls within the cone whose apex is the sensor and whose base covers the ground projection of the vehicle (i.e., it is a reflection from the vehicle itself), or it is higher than the safe passing height of the vehicle.
S305: and performing rotation translation transformation on the non-noise points, and transforming the non-noise points in the input point cloud to a vehicle body coordinate system.
Let the coordinates of a non-noise point in the sensor coordinate system be (px1, py1, pz1) and its coordinates in the vehicle body coordinate system be (px2, py2, pz2). Its rotation-translation is given by the following equation (1):

(px2, py2, pz2)^T = R · (px1, py1, pz1)^T + T    (1)
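A minimal sketch of the rotation matrix with the entries R00 through R22 listed above, together with the rotation-translation of equation (1), is given below. Angles are assumed to be in radians, NumPy is used for the matrix algebra, and the function names are illustrative assumptions.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Rotation matrix R with the entries R00..R22 listed above (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    return np.array([
        [cy * cp, cy * sp * sr - sy * cr, sy * sr + cy * sp * cr],
        [sy * cp, cy * cr + sy * sp * sr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])

def sensor_to_body(p1, extrinsics):
    """Equation (1): p2 = R · p1 + T, with T taken as the mounting position."""
    x_l, y_l, z_l, yaw_l, pitch_l, roll_l = extrinsics
    R = rotation_matrix(yaw_l, pitch_l, roll_l)
    T = np.array([x_l, y_l, z_l])
    return R @ np.asarray(p1, dtype=float) + T
```

With zero attitude angles the rotation reduces to the identity, so a sensor point is simply shifted by the mounting offset (xL, yL, zL).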
s306: and determining grid index numbers corresponding to the non-noise points in the grid map according to the coordinates of the non-noise points in the vehicle body coordinate system. The point is placed in the grid as a recording point within the grid according to the index number.
S307: and after the traversal operation of the point cloud is completed, the input point cloud is classified into a grid map in a corresponding grid to obtain the grid map.
Through the steps, the grid map of the surrounding environment of the vehicle can be obtained according to the data returned by the sensor, and preparation is made for subsequent steps.
S102: judging whether the height difference of the recording points in the grid to be processed is less than a first judgment threshold value THobj
It will be understood that the grid to be processed may be any grid in the grid map, and the height difference of the recorded points is the difference between the maximum and minimum z2-axis coordinates of the recorded points in the grid to be processed.
In the present embodiment, the first judgment threshold THobj is used to determine whether the recorded points in the grid to be processed represent an obstacle, that is, whether the vehicle cannot safely pass through the area represented by the grid; a height over which the vehicle can safely pass is regarded as ground. The relative height of the ground or an obstacle with respect to the vehicle is related to its relative position: the closer to the vehicle, the higher the relative height of the ground or obstacle, and conversely, the smaller. Therefore, to ensure that the ground around the vehicle can be accurately identified, the first judgment threshold THobj is proportional to the distance between the grid to be processed and the vehicle.
As an example, in actual operation the first judgment threshold THobj can be obtained from formula (2):

THobj = aobj + Dist × Δr × bobj    (2)

where Dist is the distance between the grid to be processed and the vehicle (i.e., the origin of the vehicle body coordinate system); Δr is the radial resolution; and aobj and bobj are parameters that depend on the noise behavior of the sensor and the chassis height over which the vehicle can travel, and can be determined empirically. For example, for the three-dimensional laser rangefinder of the example above (64 laser emitter/receiver groups, 10 Hz rotation frequency, 60 m effective detection radius, point cloud scale of about 130000 pts/frame), aobj = 0.15 and bobj = 0.06. The first judgment threshold THobj is in meters.
S103: when the height difference of the recording points in the grid to be processed is smaller than a first judgment threshold value THobjIf the distance between the grid to be processed and the vehicle is smaller than the preset distance threshold, continuously judging whether the lowest height of the recording points in the grid to be processed is smaller than a second judgment threshold THgnd
It is understood that, for grids to be processed that are farther from the vehicle, a height difference smaller than the first judgment threshold THobj can directly be taken to mean that the vehicle can safely pass, and the grid is a ground grid; that is, when the distance between the grid to be processed and the vehicle is greater than or equal to the preset distance threshold, the grid to be processed is marked as a ground grid (e.g., its attribute tag is set to the ground grid GT_GROUND). However, for a grid to be processed that is closer to the vehicle (distance from the vehicle smaller than the preset threshold), the distance effect means that the height difference of the recording points alone cannot accurately determine whether the vehicle can safely pass through the area represented by the grid, so the actual height of the recording points in the grid must also be examined.
In this embodiment, the lowest height of the recording points in the grid to be processed is compared with the second judgment threshold THgnd; when the lowest height is smaller than THgnd, the vehicle can safely pass through the grid.
Like the first judgment threshold THobj, the second judgment threshold THgnd is also proportional to the distance between the grid to be processed and the vehicle. As an example, the second judgment threshold THgnd can be obtained from formula (3):

THgnd = agnd + Dist × bgnd    (3)

where agnd and bgnd are parameters that depend on the noise behavior of the sensor and the chassis height over which the vehicle can travel, and can be determined empirically. For example, for the three-dimensional laser rangefinder of the example above (64 laser emitter/receiver groups, 10 Hz rotation frequency, 60 m effective detection radius, point cloud scale of about 130000 pts/frame), agnd = 0.15 and bgnd = 0.04. The second judgment threshold THgnd is in meters.
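Formulas (2) and (3) can be sketched directly. The parameter defaults below are the example values given in the text; note that the text leaves the exact unit of Dist implicit (formula (2) multiplies it by Δr while formula (3) does not), so it is passed through unchanged here, and the function names are illustrative.

```python
def th_obj(dist, delta_r=0.2, a_obj=0.15, b_obj=0.06):
    """Formula (2): TH_obj = a_obj + Dist * Δr * b_obj (meters)."""
    return a_obj + dist * delta_r * b_obj

def th_gnd(dist, a_gnd=0.15, b_gnd=0.04):
    """Formula (3): TH_gnd = a_gnd + Dist * b_gnd (meters)."""
    return a_gnd + dist * b_gnd
```

Both thresholds grow with the distance Dist, which matches the statement that they are proportional to the distance between the grid to be processed and the vehicle.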
S104: when the lowest height of the recording points in the grid to be processed is less than a second judgment threshold THgndAnd marking the grid to be processed as a ground grid.
It will be appreciated that if the lowest height of the recording points in the grid to be processed is greater than or equal to the second judgment threshold THgnd, the vehicle cannot safely pass, and the grid is an obstacle area; otherwise, the vehicle can safely pass through the grid, which represents a ground area and is determined to be a ground grid.
In this embodiment, first, a grid map of the vehicle surroundings with the ground projection point of the vehicle rear axle center as the origin is drawn according to the point cloud data returned by the sensor with the center of the sensor as the origin. Then, according to the height of the recorded point in each grid, and a first judgment threshold and a second judgment threshold which are in direct proportion to the distance between the grid and the vehicle, judging whether the point cloud data recorded in the grid is the ground around the vehicle, identifying a ground area around the vehicle which can pass safely, and providing a basis for decision judgment of the driving assistance function of the ADAS. According to the method, the height of the point cloud data is used as a basis, the distance between the point cloud data and the vehicle is integrated, the point cloud data representing the ground is accurately screened out by utilizing two judgment thresholds, and an accurate decision basis is provided for automatic driving control of the vehicle.
Referring to fig. 5, a schematic flow chart of another method for detecting the environment around the vehicle according to an embodiment of the invention is shown. The previous embodiment identifies the areas of the grid map through which the vehicle can safely pass; in this embodiment, the grid map is further examined to identify the obstacles in it, providing data support for subsequent algorithms (such as an obstacle identification algorithm).
In the present embodiment, after step S102, the following steps S501 to S503 are also performed.
S501: when the height difference of the recording points in the grid to be processed is greater than or equal to a first judgment threshold THobjThen, whether the lowest height of the recording points in the grid to be processed is smaller than a second judgment threshold TH is continuously judgedgnd
It can be understood that when the height difference of the recording points in the grid to be processed is greater than or equal to the first judgment threshold THobj, there is an area in the grid to be processed through which the vehicle cannot safely pass, i.e., the grid includes an obstacle. However, since the grids are divided in advance, a grid containing an obstacle may also contain an area through which the vehicle can safely pass (i.e., ground). Therefore, to improve accuracy, it is necessary to further judge whether the grid to be processed includes ground.
In this embodiment, the second judgment threshold THgnd may likewise be used for this judgment; for its specific description, refer to the above embodiment, which is not repeated here.
S502: when the lowest height of the recording points in the grid to be processed is greater than or equal to the second judgment threshold TH_gnd, the grid to be processed is marked as an obstacle grid.
It will be appreciated that if the lowest height of the recording points in the grid to be processed is greater than or equal to the second judgment threshold TH_gnd, the vehicle cannot safely pass through the grid; the area represented by the grid is an obstacle, and the grid is determined to be an obstacle grid (for example, the attribute tag of the grid to be processed is set to the obstacle grid GT_OBJECT).
S503: when the lowest height of the recording points in the grid to be processed is less than the second judgment threshold TH_gnd, the grid to be processed is marked as a ground obstacle grid.
It will be appreciated that if the lowest height of the recording points in the grid to be processed is less than the second judgment threshold TH_gnd, the grid contains an area through which the vehicle can safely pass; the area represented by the grid is a combination of obstacle and ground, and the grid is determined to be a ground obstacle grid (for example, the attribute tag of the grid to be processed is set to the ground obstacle grid GT_OBJECT_ON_GROUND).
In this embodiment, the attribute tag of each grid in the grid map — the ground grid GT_GROUND, the obstacle grid GT_OBJECT, or the ground obstacle grid GT_OBJECT_ON_GROUND — is determined, providing data support for subsequent algorithms.
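The two-threshold grid classification (the ground test together with steps S501-S503) can be sketched as follows. The proportionality coefficients k_obj and k_gnd and the far-distance cutoff are assumed values, since the patent only states that the thresholds are proportional to the grid's distance from the vehicle; the branch where a low-spread near grid sits above TH_gnd is not spelled out in this excerpt and is treated as an obstacle here.

```python
GT_GROUND = "GT_GROUND"
GT_OBJECT = "GT_OBJECT"
GT_OBJECT_ON_GROUND = "GT_OBJECT_ON_GROUND"

def classify_grid(heights, dist, k_obj=0.005, k_gnd=0.003, far_cutoff=40.0):
    """Classify one grid from its recorded point heights and its distance."""
    th_obj = k_obj * dist   # first judgment threshold, proportional to dist
    th_gnd = k_gnd * dist   # second judgment threshold, proportional to dist
    h_min, h_max = min(heights), max(heights)
    if h_max - h_min < th_obj:            # small height spread within the grid
        if dist >= far_cutoff:            # far grid: marked as ground directly
            return GT_GROUND
        # near grid: ground only if its lowest point is low enough
        return GT_GROUND if h_min < th_gnd else GT_OBJECT
    # large height spread: the grid contains an obstacle (S501-S503)
    return GT_OBJECT_ON_GROUND if h_min < th_gnd else GT_OBJECT
```

For example, a near grid whose points span only 2 cm and touch the ground is classified GT_GROUND, while a grid containing a tall object whose lowest return is still at ground level becomes GT_OBJECT_ON_GROUND.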
Referring to fig. 6, a schematic flow chart of a method for detecting the vehicle surroundings according to another embodiment of the present invention is shown. In this embodiment, the attributes of the recorded points in the grid map may be further identified, and it is judged whether the vehicle can safely pass a recorded point, so as to provide data support for subsequent processing (such as an obstacle identification algorithm).
In this embodiment, after the attributes of the grid to be processed are determined, the following steps S601-S605 are also included. It is understood that the step of identifying the attributes of the recording points in a grid may be performed immediately after the attribute of that grid is determined, or at a later stage (for example, after the attributes of all grids in the grid map have been identified); moreover, steps S601-S603 and steps S604-S605 may be performed in any order or in parallel, which is not limited in this embodiment of the present invention.
S601: the recorded points within the ground grid are marked as ground reflection points.
For example, the attribute tag of the recording point to be processed is set to the ground reflection point PT_GROUND.
S602: the recorded points within the obstacle grid are marked as obstacle reflection points.
For example, the attribute tag of the recording point to be processed is set to the obstacle reflection point PT_OBJECT.
S603: judging whether the difference between the height of the recording point to be processed in the grid of the ground obstacle and the lowest height of the recording point in the grid is less than a third judgment threshold THpoint_ob. If yes, go to step S604; if not, go to step S605.
S604: and marking the recording points to be processed as ground obstacle reflection points.
S605: and marking the recording points to be processed as barrier reflection points.
It is to be understood that the third judgment threshold TH_point_ob is used to distinguish, within a ground obstacle grid, the reflection points of the obstacle from the reflection points of the area where the ground and the obstacle join. If the difference between the height of the recording point to be processed (i.e., its z2-axis coordinate) and the lowest height of the recording points in the grid is less than the third judgment threshold TH_point_ob, the recording point belongs to the ground-obstacle junction area and is recorded as a ground obstacle reflection point (for example, the attribute tag of the recording point to be processed is set to the ground obstacle reflection point PT_OBJECT_ON_GROUND); otherwise, the recording point to be processed belongs to the obstacle and is recorded as an obstacle reflection point.
In this embodiment, attribute tags are set for the recording points in the grid map — the ground reflection point PT_GROUND, the ground obstacle reflection point PT_OBJECT_ON_GROUND, and the obstacle reflection point PT_OBJECT — providing data support for the subsequent obstacle identification algorithm.
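The point-labeling steps S601-S605 can be sketched as below. The numeric value of the third judgment threshold is an assumption for illustration; the patent gives no figure for it.

```python
PT_GROUND = "PT_GROUND"
PT_OBJECT = "PT_OBJECT"
PT_OBJECT_ON_GROUND = "PT_OBJECT_ON_GROUND"
TH_POINT_OB = 0.15  # assumed third judgment threshold (m)

def label_point(grid_tag: str, z: float, grid_min_z: float) -> str:
    """Assign a point attribute tag from its grid's tag and its height."""
    if grid_tag == "GT_GROUND":
        return PT_GROUND                  # S601: ground reflection point
    if grid_tag == "GT_OBJECT":
        return PT_OBJECT                  # S602: obstacle reflection point
    # Ground obstacle grid: compare the point's height above the grid minimum
    if z - grid_min_z < TH_POINT_OB:      # S603 -> S604
        return PT_OBJECT_ON_GROUND        # near the ground-obstacle junction
    return PT_OBJECT                      # S603 -> S605
```

Points low in a mixed grid are thus attributed to the ground-obstacle seam, while the higher returns in the same grid are labeled as pure obstacle points.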
In the embodiment of the present application, using the point cloud data returned by the vehicle-mounted three-dimensional laser range finder, the ground area, the obstacle area, and the mixed obstacle-ground area within a radius of 60 m around the vehicle can be calculated, and the input point cloud is labeled as ground point cloud, obstacle point cloud, or obstacle-ground transition-area point cloud. At an input scale of 130,000 pts/frame at 10 Hz, a single run of the algorithm takes less than 50 milliseconds on a 2.6 GHz quad-core industrial PC, meeting the real-time requirement of automatic driving. After processing by the method provided in this embodiment, each grid element in the grid map includes the following information:
firstly, the point cloud sequence whose spatial positions belong to the grid, and the attribute tag of each point in that sequence; the point attribute tags are of three types: the ground reflection point PT_GROUND, the obstacle reflection point PT_OBJECT, and the reflection point of the ground-obstacle junction area, i.e., the ground obstacle reflection point PT_OBJECT_ON_GROUND.
Secondly, the maximum and minimum height values of the point cloud within the grid.
Thirdly, the distance value Dist of the grid from the origin O of the vehicle body coordinate system.
Fourthly, the attribute tag of the grid, also of three types: the ground grid GT_GROUND, the obstacle grid GT_OBJECT, and the mixed grid of ground and obstacle, i.e., the ground obstacle grid GT_OBJECT_ON_GROUND.
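The four information items above can be gathered into a small container type. This is a minimal sketch; the field names are assumptions for illustration, not identifiers from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GridCell:
    # 1) points belonging to the grid and their PT_* attribute tags
    points: List[Tuple[float, float, float]] = field(default_factory=list)
    point_tags: List[str] = field(default_factory=list)
    # 2) maximum and minimum point heights within the grid
    z_max: float = float("-inf")
    z_min: float = float("inf")
    # 3) distance of the grid from the vehicle-frame origin O
    dist: float = 0.0
    # 4) GT_* attribute tag of the grid
    tag: str = ""

    def add(self, x: float, y: float, z: float) -> None:
        """Record a point and keep the height extrema up to date."""
        self.points.append((x, y, z))
        self.z_max = max(self.z_max, z)
        self.z_min = min(self.z_min, z)

cell = GridCell(dist=12.5)
cell.add(0.0, 0.0, 0.1)
cell.add(0.1, 0.0, 0.4)
```

Keeping the extrema incremental means the two threshold comparisons need no second pass over the points.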
Based on the method for detecting the vehicle surroundings provided by the above embodiments, an embodiment of the invention further provides a device for detecting the vehicle surroundings.
The device embodiments are as follows.
Referring to fig. 7, a schematic structural diagram of a device for detecting the vehicle surroundings according to an embodiment of the present invention is shown.
The device provided by the embodiment comprises: the map acquisition module 100, the first judgment module 201, the second judgment module 202 and the grid marking module 300.
And the map acquisition module 100 is configured to acquire a grid map of the surrounding environment of the vehicle according to the point cloud data returned by the sensor.
The first determining module 201 is configured to determine whether a height difference between recording points in a grid to be processed is smaller than a first determining threshold, where the grid map includes the grid to be processed.
The second determining module 202 is configured to, when the first determining module 201 determines that the height difference between the recording points in the grid to be processed is smaller than the first determining threshold, if the distance between the grid to be processed and the vehicle is smaller than the preset distance threshold, continue to determine whether the minimum height of the recording points in the grid to be processed is smaller than the second determining threshold.
The grid marking module 300 is configured to mark the grid to be processed as a ground grid when the second determining module 202 determines that the minimum height of the recording point in the grid to be processed is smaller than the second determining threshold.
Wherein the first judgment threshold and the second judgment threshold are in direct proportion to the distance between the grid to be processed and the vehicle.
In this embodiment, a grid map of the vehicle surroundings, with the ground projection of the vehicle's rear-axle center as the origin, is first drawn from the point cloud data returned by the sensor (whose own coordinate system has the sensor center as its origin). Then, based on the heights of the points recorded in each grid, together with a first judgment threshold and a second judgment threshold that are both proportional to the distance between the grid and the vehicle, it is judged whether the point cloud data recorded in the grid represents the ground around the vehicle; in this way the ground area around the vehicle through which it can safely pass is identified, providing a basis for decision-making by ADAS driving-assistance functions. The device takes the height of the point cloud data as its basis, incorporates the distance between the point cloud data and the vehicle, and uses the two judgment thresholds to accurately screen out the point cloud data representing the ground, providing an accurate decision basis for automatic driving control of the vehicle.
In a possible implementation manner of this embodiment, the grid map may be further detected to identify an obstacle therein, so as to provide data support for a subsequent algorithm (such as an obstacle identification algorithm).
Specifically, the second determining module 202 is further configured to, when the first determining module determines that the height difference of the recording points in the grid to be processed is greater than or equal to the first determining threshold, continue to determine whether the lowest height of the recording points in the grid to be processed is less than the second determining threshold.
The grid marking module 300 is further configured to mark the grid to be processed as a ground obstacle grid when the second determining module determines that the lowest height of the recording points in the grid to be processed is smaller than the second determining threshold, and to mark the grid to be processed as an obstacle grid when the second determining module determines that the lowest height of the recording points in the grid to be processed is greater than or equal to the second determining threshold.
In this embodiment, the attribute label of each grid in the grid map is determined, so that data support can be provided for a subsequent algorithm.
Referring to fig. 8, the schematic view is a schematic structural diagram of another vehicle surroundings detection apparatus according to an embodiment of the present invention. In this embodiment, the attribute of the recorded point in the grid map may be further identified, and it is determined whether the vehicle can safely pass through the recorded point, so as to provide data support for subsequent processing (such as an obstacle identification algorithm).
The device for detecting the vehicle surroundings according to the present embodiment further includes, on the basis of fig. 7, a third determining module 203 and a point marking module 400.
The third determining module 203 is configured to determine whether the difference between the height of a recording point to be processed in a ground obstacle grid and the lowest height of the recording points in that grid is smaller than a third determining threshold.
The point marking module 400 is configured to mark the recording point to be processed as a ground obstacle reflection point when the determination result of the third determining module 203 is yes, and to mark the recording point to be processed as an obstacle reflection point when the determination result of the third determining module 203 is no.
In a possible implementation manner of this embodiment, the grid marking module 300 is further configured to mark the grid to be processed as a ground grid if the distance between the grid to be processed and the vehicle is greater than or equal to a preset distance threshold.
The point marking module 400 is further configured to mark the recorded points within the ground grid as ground reflection points, and to mark the recorded points within the obstacle grid as obstacle reflection points.
In this embodiment, an attribute tag is set for a recording point in a grid map, and data support is provided for a subsequent obstacle identification algorithm.
It should be noted that, in the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant part can be referred to the method part for description.
It is further noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit the invention in any way. Although the present invention has been disclosed above with reference to preferred embodiments, these are not intended to limit it. Using the methods and technical content disclosed above, those skilled in the art can make numerous possible variations and modifications to the technical solution of the present invention, or rework it into equivalent embodiments of equivalent change, without departing from the scope of the technical solution of the present invention. Therefore, any simple modification, equivalent change, or refinement made to the above embodiments according to the technical essence of the present invention, without departing from the content of the technical solution of the present invention, still falls within the protection scope of the technical solution of the present invention.

Claims (6)

1. A method of detecting an environment around a vehicle, the method comprising:
obtaining a grid map of the surrounding environment of the vehicle according to the point cloud data returned by the sensor;
judging whether the height difference of recording points in a grid to be processed is smaller than a first judgment threshold value or not, wherein the grid map comprises the grid to be processed;
when the height difference of the recording points in the grid to be processed is greater than or equal to the first judgment threshold, continuously judging whether the lowest height of the recording points in the grid to be processed is less than a second judgment threshold;
when the lowest height of the recording points in the grid to be processed is smaller than the second judgment threshold, marking the grid to be processed as a ground obstacle grid;
when the lowest height of the recording points in the grid to be processed is greater than or equal to the second judgment threshold, marking the grid to be processed as an obstacle grid;
when the height difference of the recording points in the grid to be processed is smaller than the first judgment threshold, if the distance between the grid to be processed and the vehicle is smaller than a preset distance threshold, continuously judging whether the lowest height of the recording points in the grid to be processed is smaller than the second judgment threshold;
if the distance between the grid to be processed and the vehicle is larger than or equal to the preset distance threshold, marking the grid to be processed as a ground grid;
when the lowest height of the recording points in the grid to be processed is smaller than the second judgment threshold, marking the grid to be processed as a ground grid;
wherein the first and second determination thresholds are proportional to a distance between the grid to be processed and the vehicle.
2. The method of detecting the environment around the vehicle according to claim 1, characterized by further comprising:
judging whether the difference between the height of the recording point to be processed in the ground obstacle grid and the lowest height of the recording point in the grid is smaller than a third judgment threshold value or not;
if yes, marking the recording points to be processed as ground obstacle reflection points;
and if not, marking the recording points to be processed as obstacle reflection points.
3. The method of detecting the vehicle surroundings according to claim 1 or 2, characterized by further comprising:
marking the recorded points in the ground grid as ground reflection points;
marking the recorded points within the obstacle grid as obstacle reflection points.
4. An apparatus for detecting the environment around a vehicle, the apparatus comprising: the map acquisition module, the first judgment module, the second judgment module and the grid marking module;
the map acquisition module is used for acquiring a grid map of the surrounding environment of the vehicle according to the point cloud data returned by the sensor;
the first judging module is used for judging whether the height difference of the recording points in the grid to be processed is smaller than a first judging threshold value or not, and the grid map comprises the grid to be processed;
the second judging module is configured to, when the first judging module judges that the height difference between the recording points in the grid to be processed is smaller than the first judging threshold, if the distance between the grid to be processed and the vehicle is smaller than a preset distance threshold, continue to judge whether the lowest height of the recording points in the grid to be processed is smaller than a second judging threshold; when the first judging module judges that the height difference of the recording points in the grid to be processed is greater than or equal to the first judging threshold, continuously judging whether the lowest height of the recording points in the grid to be processed is smaller than the second judging threshold;
the grid marking module is configured to mark the grid to be processed as a ground grid when the second determination module determines that the minimum height of the recording point in the grid to be processed is smaller than the second determination threshold; when the second judging module judges that the lowest height of the recording points in the grid to be processed is smaller than the second judging threshold, marking the grid to be processed as a ground obstacle grid; when the second judging module judges that the lowest height of the recording points in the grid to be processed is greater than or equal to the second judging threshold, marking the grid to be processed as an obstacle grid; if the distance between the grid to be processed and the vehicle is larger than or equal to the preset distance threshold, marking the grid to be processed as a ground grid;
wherein the first and second determination thresholds are proportional to a distance between the grid to be processed and the vehicle.
5. The apparatus for detecting an environment around a vehicle according to claim 4, further comprising: a third judging module and a point marking module;
the third judging module is used for judging whether the difference between the height of the recording point to be processed in the ground obstacle grid and the lowest height of the recording point in the grid is smaller than a third judging threshold value;
the point marking module is used for marking the recording points to be processed as ground obstacle reflection points when the judgment result of the third judgment module is yes; and the recording point to be processed is marked as an obstacle reflection point when the judgment result of the third judgment module is negative.
6. The apparatus for detecting an environment around a vehicle according to claim 5, characterized by further comprising:
the point marking module is also used for marking the recording points in the ground grid as ground reflection points; and for marking the recorded points within the obstacle grid as obstacle reflection points.
CN201710556525.7A 2017-07-10 2017-07-10 Method and device for detecting surrounding environment of vehicle Active CN109238221B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710556525.7A CN109238221B (en) 2017-07-10 2017-07-10 Method and device for detecting surrounding environment of vehicle

Publications (2)

Publication Number Publication Date
CN109238221A CN109238221A (en) 2019-01-18
CN109238221B true CN109238221B (en) 2021-02-26

Family

ID=65083424

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710556525.7A Active CN109238221B (en) 2017-07-10 2017-07-10 Method and device for detecting surrounding environment of vehicle

Country Status (1)

Country Link
CN (1) CN109238221B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7147651B2 (en) * 2019-03-22 2022-10-05 トヨタ自動車株式会社 Object recognition device and vehicle control system
CN110286387B (en) * 2019-06-25 2021-09-24 深兰科技(上海)有限公司 Obstacle detection method and device applied to automatic driving system and storage medium
CN115151954A (en) * 2020-02-29 2022-10-04 华为技术有限公司 Method and device for detecting a drivable region
CN111652060B (en) * 2020-04-27 2024-04-19 宁波吉利汽车研究开发有限公司 Laser radar-based height limiting early warning method and device, electronic equipment and storage medium
CN114675290A (en) * 2022-02-25 2022-06-28 中国第一汽车股份有限公司 Ground data detection method, detection device and processor

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101975951A (en) * 2010-06-09 2011-02-16 北京理工大学 Field environment barrier detection method fusing distance and image information
CN104933708A (en) * 2015-06-07 2015-09-23 浙江大学 Barrier detection method in vegetation environment based on multispectral and 3D feature fusion
CN106199558A (en) * 2016-08-18 2016-12-07 宁波傲视智绘光电科技有限公司 Barrier method for quick
CN106291736A (en) * 2016-08-16 2017-01-04 张家港长安大学汽车工程研究院 Pilotless automobile track dynamic disorder object detecting method
US9555740B1 (en) * 2012-09-27 2017-01-31 Google Inc. Cross-validating sensors of an autonomous vehicle
CN106772434A (en) * 2016-11-18 2017-05-31 北京联合大学 A kind of unmanned vehicle obstacle detection method based on TegraX1 radar datas

Also Published As

Publication number Publication date
CN109238221A (en) 2019-01-18

Similar Documents

Publication Publication Date Title
CN109238221B (en) Method and device for detecting surrounding environment of vehicle
CN109215083B (en) Method and device for calibrating external parameters of vehicle-mounted sensor
US10705220B2 (en) System and method for ground and free-space detection
EP3361278B1 (en) Autonomous vehicle localization based on walsh kernel projection technique
KR102420568B1 (en) Method for determining a position of a vehicle and vehicle thereof
EP3620823B1 (en) Method and device for detecting precision of internal parameter of laser radar
US9298992B2 (en) Geographic feature-based localization with feature weighting
CN110794406B (en) Multi-source sensor data fusion system and method
CN110867132B (en) Environment sensing method, device, electronic equipment and computer readable storage medium
US20210004566A1 (en) Method and apparatus for 3d object bounding for 2d image data
CN113673282A (en) Target detection method and device
Cao et al. Obstacle detection for autonomous driving vehicles with multi-lidar sensor fusion
US11892300B2 (en) Method and system for determining a model of the environment of a vehicle
CN114442101A (en) Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
CN110927762A (en) Positioning correction method, device and system
CN112710339A (en) Method and apparatus for calibrating vehicle sensors
US8031908B2 (en) Object recognizing apparatus including profile shape determining section
JP6263453B2 (en) Momentum estimation device and program
CN113434788B (en) Picture construction method and device, electronic equipment and vehicle
US20230342434A1 (en) Method for Fusing Environment-Related Parameters
CN110023781A (en) Method and apparatus for determining the accurate location of vehicle according to the radar of vehicle-periphery signature
CN110095776B (en) Method for determining the presence and/or the characteristics of an object and surrounding identification device
RU2757037C1 (en) Method and system for identifying the presence of a track on the current terrain
Lee et al. Identifying Puddles based on Intensity Measurement using LiDAR
CN110341716B (en) Vehicle speed calculation method and device, automatic driving system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant