CN114660568B - Laser radar obstacle detection method and device - Google Patents

Laser radar obstacle detection method and device

Info

Publication number
CN114660568B
CN114660568B (application CN202210155073.2A)
Authority
CN
China
Prior art keywords
obstacle
coordinate system
laser radar
pixel
grid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210155073.2A
Other languages
Chinese (zh)
Other versions
CN114660568A (en)
Inventor
李庭潘
刘平
孙金泉
蔡登胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangxi Liugong Machinery Co Ltd
Original Assignee
Guangxi Liugong Machinery Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangxi Liugong Machinery Co Ltd
Priority to CN202210155073.2A
Publication of CN114660568A
Application granted
Publication of CN114660568B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a laser radar obstacle detection method and device, and relates to the field of intelligent crane control. The device solves the problem that, in the prior art, an excavator on uneven ground has a certain pitch angle, which makes the detected obstacle appear closer to the excavator than it actually is. The method comprises the following steps: clipping the laser point cloud data of the obstacle detected by the laser radar and projecting the clipped laser point cloud data onto a grid map; determining the corresponding position of each grid in the binarized image and the gray value at that position according to the relationship between the laser radar coordinate system and the binarized pixel coordinate system, to obtain a binarized image of the laser point cloud data; performing hole filling on the binarized image of the laser point cloud data to obtain the position coordinate information of the obstacle in the binarized pixel coordinate system, and correcting the position coordinate information of the obstacle in the laser radar coordinate system according to a second distance; and converting the corrected position coordinate information of the obstacle in the laser radar coordinate system into the map coordinate system to obtain the position and shape of the obstacle.

Description

Laser radar obstacle detection method and device
Technical Field
The invention relates to the field of intelligent control of cranes, in particular to a laser radar obstacle detection method and device.
Background
Laser-radar-based obstacle detection methods currently fall into two main types: detection methods based on a grid map, and detection methods based on clustering. Grid-map-based detection has the advantages of simplicity, low computation and high speed. Grid-map-based laser radar obstacle detection projects the 3D point cloud onto a plane, generally directly onto the XOY plane of the laser radar coordinate system, and converts it into a 2D image, so that image-based obstacle detection can be performed.
In the prior art, the 3D point cloud captured by the laser radar is directly projected onto the XOY plane of the laser radar coordinate system. Certain information is therefore lost, such as the height of the obstacle, and the obstacle appears closer to the laser radar than it actually is when the whole machine has a certain pitch angle.
Disclosure of Invention
The embodiment of the invention provides a laser radar obstacle detection method and device, which are used for solving the problem that, in the prior art, an excavator on uneven ground has a certain pitch angle, which makes the detected obstacle appear closer to the machine than it actually is.
The embodiment of the invention provides a laser radar obstacle detection method, which comprises the following steps:
Cutting laser point cloud data of the obstacles detected by the laser radar, projecting the cut laser point cloud data onto a grid map, and determining the height of the obstacles in each grid;
Determining the corresponding position of each grid in the binary image and the gray value of the position according to the relation between the laser radar coordinate system and the binary pixel coordinate system to obtain a binary image of laser point cloud data;
Hole filling is carried out on the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and the position coordinate information of the obstacle in the binary pixel coordinate system is converted into a laser radar coordinate system to obtain position coordinate information of the obstacle in the laser radar coordinate system;
Determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correcting position coordinate information of the obstacle under the laser radar coordinate system according to the second distance;
And converting the corrected position coordinate information of the obstacle in the laser radar coordinate system into the map coordinate system to obtain the position and the shape of the obstacle.
Preferably, the determining the second distance between the first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to the first distance between the obstacle and the coordinate origin of the laser radar coordinate system specifically includes:
determining a first distance between the obstacle and the coordinate origin of the laser radar coordinate system according to the pitch angle of the mobile carrier and a third distance between a second projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system;
And determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system and the pitch angle of the mobile carrier.
Preferably, the clipping the laser point cloud data of the obstacle detected by the laser radar, and then projecting the clipped laser point cloud data onto a grid map, and determining the height of the obstacle in each grid, which specifically includes:
Acquiring laser point cloud data of an object detected by a laser radar, and cutting the laser point cloud data according to an area set by a laser radar coordinate system;
Establishing a grid map, projecting each sampling point included in the cut laser point cloud data onto each grid in the grid map, and determining the height of an obstacle in each grid;
The height of the obstacle in each grid is:
height(i,j) = z1_max(i,j) - z1_min(i,j)
where height(i,j) represents the height of the obstacle in the (i,j)-th grid, z1_max(i,j) = max{P_z1 : P ∈ I_(i,j)} represents the maximum height value of the (i,j)-th grid, z1_min(i,j) = min{P_z1 : P ∈ I_(i,j)} represents the minimum height value of the (i,j)-th grid, I_(i,j) represents all three-dimensional points in the grid of row j and column i, P represents one three-dimensional point, and P_z1 represents the coordinate value of the point in the z1-axis direction; the z1 axis is located in the laser radar coordinate system, and the laser radar coordinate system comprises an x1 axis, a y1 axis, a z1 axis and an o1 point.
Preferably, the determining, according to the relationship between the lidar coordinate system and the binarized pixel coordinate system, the corresponding position of each grid in the binarized image and the gray value of the position of each grid specifically includes:
The laser radar coordinate system comprises an x1 axis, a y1 axis, a z1 axis and an o1 point, the binarization pixel coordinate system comprises an x2 axis, a y2 axis and an o2 point, the directions of the x1 axis and the x2 axis are consistent, and the directions of the y1 axis and the y2 axis are opposite;
The corresponding position of the (i,j)-th grid in the binarized image is row j, column i, and the gray value is determined by the following formula:
f(i,j) = 1 if height(i,j) > thresh, and f(i,j) = 0 otherwise
where f(i,j) represents the gray value of the (i,j)-th grid; when the obstacle height of the (i,j)-th grid is greater than the threshold thresh, the gray value of the (i,j)-th grid is 1; otherwise, the gray value of the (i,j)-th grid is 0.
Preferably, the hole filling is performed on the binary image to obtain position coordinate information of the obstacle in the binary image, which specifically includes:
Performing boundary expansion on the binarized image to obtain f_extern_1; taking the (0,0)-th pixel as a seed point and performing flood filling to obtain f_extern_2; removing the boundary pixels of f_extern_2 to obtain an image f_2 with the same size as the binarized image; the binarized image after hole filling is obtained by the following formula:
f_3 = f_1 | (~f_2)
The center coordinates, length and width of the obstacle in the binarized image are determined by the following formulas:
P_center = (1/N) Σ P_i over the obstacle pixel set S
pixel_width = max{P_x2 : P ∈ S} - min{P_x2 : P ∈ S}
pixel_length = max{P_y2 : P ∈ S} - min{P_y2 : P ∈ S}
where f_1 represents the binary image of the laser point cloud data, f_3 represents the binary image after hole filling, ~f_2 represents inverting the gray values of all pixels of f_2, P_center represents the center coordinate of the obstacle in the binarized image, P_i represents each of the N pixels in the obstacle pixel set, pixel_width represents the width of the obstacle in the binarized image, pixel_length represents the length of the obstacle in the binarized image, P is a pixel in the obstacle pixel set S, P_x2 represents the coordinate of the pixel on the x2 axis of the binarized pixel coordinate system, and P_y2 represents the coordinate of the pixel on the y2 axis of the binarized pixel coordinate system, wherein the binarized pixel coordinate system comprises the x2 axis, the y2 axis and the o2 point.
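A minimal sketch of the center, width and length computation over the obstacle pixel set S described above (the averaging form of P_center is an assumption consistent with the symbols in the text, and the function name is illustrative):

```python
def obstacle_stats(pixels):
    """pixels: obstacle pixel set S as (Px2, Py2) tuples in the binarized pixel coordinate system."""
    n = len(pixels)
    # P_center as the mean of all obstacle pixels (assumed interpretation)
    cx = sum(p[0] for p in pixels) / n
    cy = sum(p[1] for p in pixels) / n
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    pixel_width = max(xs) - min(xs)    # extent along the x2 axis
    pixel_length = max(ys) - min(ys)   # extent along the y2 axis
    return (cx, cy), pixel_width, pixel_length

# e.g. a 3x5-pixel rectangular obstacle footprint
center, w, l = obstacle_stats([(0, 0), (2, 0), (0, 4), (2, 4)])
```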
Preferably, the converting the position coordinate information of the obstacle in the binarized pixel coordinate system to the laser radar coordinate system to obtain the position coordinate information of the obstacle in the laser radar coordinate system specifically includes:
The position coordinate information of the obstacle under the laser radar coordinate system is obtained through the following formulas:
P_x1 = P_x2 · d + P_x1_min
P_y1 = P_y2 · d + P_y1_min
obs_width = pixel_width · d
obs_length = pixel_length · d
where P_x1, P_y1, P_z1 represent the coordinates of the obstacle in the laser radar coordinate system, P_x2, P_y2 represent the coordinates of the obstacle in the binarized pixel coordinate system, P_x1_min represents the minimum x1-axis coordinate of the laser point cloud data when clipping, P_y1_min represents the minimum y1-axis coordinate of the laser point cloud data when clipping, d represents the grid pitch set when the laser point cloud data is projected onto the grid map, obs_width represents the width of the obstacle in the laser radar coordinate system, obs_length represents the length of the obstacle in the laser radar coordinate system, pixel_width represents the width of the obstacle in the binarized pixel coordinate system, and pixel_length represents the length of the obstacle in the binarized pixel coordinate system.
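A sketch of this pixel-to-lidar conversion, assuming it inverts the grid-map rasterization by multiplying by the grid pitch d and adding back the clipping minima (an assumption consistent with the symbols above; function names are illustrative):

```python
def pixel_to_lidar(px2, py2, px1_min, py1_min, d):
    """Map binarized-pixel coordinates back to lidar x1/y1 coordinates."""
    px1 = px2 * d + px1_min
    py1 = py2 * d + py1_min
    return px1, py1

def size_to_lidar(pixel_width, pixel_length, d):
    """Obstacle size in the lidar frame scales by the grid pitch d."""
    return pixel_width * d, pixel_length * d
```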
The embodiment of the invention also provides a laser radar obstacle detection device, which comprises:
The determining unit is used for cutting laser point cloud data of the obstacles detected by the laser radar, projecting the cut laser point cloud data onto the grid map and determining the height of the obstacles in each grid;
The first obtaining unit is used for determining the corresponding position of each grid in the binary image and the gray value of the position of each grid according to the relation between the laser radar coordinate system and the binary pixel coordinate system to obtain a binary image of laser point cloud data;
The second obtaining unit is used for filling holes in the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and converting the position coordinate information of the obstacle in the binary pixel coordinate system into a laser radar coordinate system to obtain the position coordinate information of the obstacle in the laser radar coordinate system;
The correction unit is used for determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correcting position coordinate information of the obstacle under the laser radar coordinate system according to the second distance;
and a third obtaining unit for converting the corrected position coordinate information of the obstacle under the laser radar coordinate system to the map coordinate system to obtain the position and shape of the obstacle.
Preferably, the correction unit is specifically configured to:
determining a first distance between the obstacle and the coordinate origin of the laser radar coordinate system according to the pitch angle of the mobile carrier and a third distance between a second projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system;
And determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system and the pitch angle of the mobile carrier.
Preferably, the second obtaining unit is specifically configured to:
Performing boundary expansion on the binarized image to obtain f_extern_1; taking the (0,0)-th pixel as a seed point and performing flood filling to obtain f_extern_2; removing the boundary pixels of f_extern_2 to obtain an image f_2 with the same size as the binarized image; the binarized image after hole filling is obtained by the following formula:
f_3 = f_1 | (~f_2)
The center coordinates, length and width of the obstacle in the binarized image are determined by the following formulas:
P_center = (1/N) Σ P_i over the obstacle pixel set S
pixel_width = max{P_x2 : P ∈ S} - min{P_x2 : P ∈ S}
pixel_length = max{P_y2 : P ∈ S} - min{P_y2 : P ∈ S}
where f_1 represents the binary image of the laser point cloud data, f_3 represents the binary image after hole filling, ~f_2 represents inverting the gray values of all pixels of f_2, P_center represents the center coordinate of the obstacle in the binarized image, P_i represents each of the N pixels in the obstacle pixel set, pixel_width represents the width of the obstacle in the binarized image, pixel_length represents the length of the obstacle in the binarized image, P is a pixel in the obstacle pixel set S, P_x2 represents the coordinate of the pixel on the x2 axis of the binarized pixel coordinate system, and P_y2 represents the coordinate of the pixel on the y2 axis of the binarized pixel coordinate system, wherein the binarized pixel coordinate system comprises the x2 axis, the y2 axis and the o2 point.
Preferably, the second obtaining unit is specifically configured to:
The position coordinate information of the obstacle under the laser radar coordinate system is obtained through the following formulas:
P_x1 = P_x2 · d + P_x1_min
P_y1 = P_y2 · d + P_y1_min
obs_width = pixel_width · d
obs_length = pixel_length · d
where P_x1, P_y1, P_z1 represent the coordinates of the obstacle in the laser radar coordinate system, P_x2, P_y2 represent the coordinates of the obstacle in the binarized pixel coordinate system, P_x1_min represents the minimum x1-axis coordinate of the laser point cloud data when clipping, P_y1_min represents the minimum y1-axis coordinate of the laser point cloud data when clipping, d represents the grid pitch set when the laser point cloud data is projected onto the grid map, obs_width represents the width of the obstacle in the laser radar coordinate system, obs_length represents the length of the obstacle in the laser radar coordinate system, pixel_width represents the width of the obstacle in the binarized pixel coordinate system, and pixel_length represents the length of the obstacle in the binarized pixel coordinate system.
The embodiment of the invention provides a method and a device for detecting a laser radar obstacle, wherein the method comprises the following steps: cutting laser point cloud data of the obstacles detected by the laser radar, projecting the cut laser point cloud data onto a grid map, and determining the height of the obstacles in each grid; determining the corresponding position of each grid in the binary image and the gray value of the position according to the relation between the laser radar coordinate system and the binary pixel coordinate system to obtain a binary image of laser point cloud data; hole filling is carried out on the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and the position coordinate information of the obstacle in the binary pixel coordinate system is converted into a laser radar coordinate system to obtain position coordinate information of the obstacle in the laser radar coordinate system; determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correcting position coordinate information of the obstacle under the laser radar coordinate system according to the second distance; and converting the corrected position coordinate information of the obstacle under the laser radar coordinate system into the mobile carrier coordinate system to obtain the position and the shape of the obstacle. 
The method improves the grid-map-based laser radar obstacle detection method and corrects the error whereby the obstacle appears closer to the mobile carrier than it actually is because the grid map loses some information, so the accuracy of obstacle detection can be improved; it solves the problem that, in the prior art, an excavator on uneven ground has a certain pitch angle, which makes obstacles appear closer to the excavator than they actually are.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a method for detecting a lidar obstacle according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a relationship between a laser radar coordinate system and a mobile carrier coordinate system according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the relationship between a map coordinate system, a laser radar coordinate system and a mobile carrier coordinate system according to an embodiment of the present invention;
FIG. 4 is a top view of a laser point cloud data clipping range provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of a relationship between a laser radar coordinate system and a binarized pixel coordinate system according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a binarized image f 1 according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a binarized image f extern_1 according to an embodiment of the present invention;
FIG. 8 is a diagram of a binarized image f extern_2 according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a binarized image f 2 according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a binarized image f 3 according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of detecting a lidar obstacle based on a grid map according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a laser radar obstacle detection device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a schematic flow chart of a method for detecting a lidar obstacle according to an embodiment of the present invention, and the method for detecting a lidar obstacle is described in detail below according to the flowchart provided in fig. 1.
As shown in fig. 1, the method mainly comprises the following steps:
Step 101, cutting laser point cloud data of an obstacle detected by a laser radar, projecting the cut laser point cloud data onto a grid map, and determining the height of the obstacle in each grid;
step 102, determining the corresponding position of each grid in the binary image and the gray value of the position according to the relation between the laser radar coordinate system and the binary pixel coordinate system, and obtaining a binary image of laser point cloud data;
Step 103, hole filling is carried out on the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and the position coordinate information of the obstacle in the binary pixel coordinate system is converted into a laser radar coordinate system to obtain the position coordinate information of the obstacle in the laser radar coordinate system;
Step 104, determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correcting the position coordinate information of the obstacle under the laser radar coordinate system according to the second distance;
And 105, converting the corrected position coordinate information of the obstacle in the laser radar coordinate system into the map coordinate system to obtain the position and the shape of the obstacle.
Before describing the laser radar obstacle detection method provided by the embodiment of the invention, the positional relationship between the laser radar detector and the mobile carrier is described first. As shown in fig. 2, the laser radar coordinate system corresponding to the laser radar detector comprises an x1 axis, a y1 axis, a z1 axis and an origin o1; the mobile carrier coordinate system corresponding to the mobile carrier comprises an x3 axis, a y3 axis, a z3 axis and an origin o3. Specifically, the x1 axis of the laser radar coordinate system and the x3 axis of the mobile carrier coordinate system are both oriented forward, the y1 and y3 axes are oriented to the left, and the z1 and z3 axes are oriented upward. It should be noted that an RTK (Real-Time Kinematic differential positioning) receiver is further disposed on the mobile carrier, and the roll angle, pitch angle and heading angle of the mobile carrier can be obtained through the RTK. The angle of rotation about the x3 axis of the mobile carrier is the roll angle (roll), the angle of rotation about the y3 axis is the pitch angle (pitch), and the angle of rotation about the z3 axis is the heading angle (yaw). As shown in fig. 3, in the same system as the laser radar coordinate system and the mobile carrier coordinate system, a map coordinate system (ENU coordinate system) is further provided, which comprises an x4 axis, a y4 axis, a z4 axis and an origin o4. The x4o4y4 plane of the map coordinate system is parallel to the horizontal plane, the direction of the x4 axis is opposite to the directions of the y1 axis and the y3 axis, the direction of the y4 axis is consistent with the directions of the x1 axis and the x3 axis, and the z4 axis is oriented upward.
In practical application, the map coordinate system is an east-north-up coordinate system: the x4o4y4 plane is parallel to the ground, the x4 axis points east, the y4 axis points north, and the z4 axis points upward.
As shown in fig. 2, the coordinate system with origin o3 represents the mobile carrier coordinate system, whose origin is the center point of the mobile carrier and whose x3 direction coincides with the advancing direction of the mobile carrier. The coordinate system with origin o1 represents the laser radar coordinate system; its orientation is consistent with that of the mobile carrier coordinate system, with only a certain offset in the x1-axis direction, that is, the laser radar is arranged directly in front of the mobile carrier. θ is the pitch angle; in practical application, a certain pitch angle is formed, for example, when the two front wheels of the mobile carrier press down on crushed stones.
Before step 101, the laser radar detector scans the scene in a galvanometer mode to obtain point cloud data in the laser radar coordinate system within the field of view. For each scanning result, the laser radar transmits the point cloud data to the controller via TCP (Transmission Control Protocol), and the controller parses the network packet data to finally obtain a sequence of 3D coordinate points.
It should be noted that, in the method provided by the embodiment of the present invention, the execution body is a controller.
In step 101, after the controller acquires the laser point cloud data of the object detected by the laser radar, the laser point cloud data may be clipped. Specifically, considering the performance of the main controller and in order to save computing resources, only laser point cloud data within a certain range is considered. As shown in fig. 4, a cuboid region in the positive direction of the x1 axis of the laser radar is clipped, and the laser point cloud data in this region is retained, where the data in the region satisfies the condition shown in formula (1):
P_x1_min < P_x1 < P_x1_max
P_y1_min < P_y1 < P_y1_max    (1)
P_z1_min < P_z1 < P_z1_max
That is, for any point P, the coordinate value P_x1 in the x1-axis direction is greater than P_x1_min and smaller than P_x1_max; the coordinate value P_y1 in the y1-axis direction is greater than P_y1_min and smaller than P_y1_max; and the coordinate value P_z1 in the z1-axis direction is greater than P_z1_min and smaller than P_z1_max.
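A minimal sketch of this clipping step, keeping only points inside the cuboid of formula (1). The bound values in the defaults are illustrative assumptions, not taken from the patent:

```python
def clip_point_cloud(points, x_min=0.0, x_max=20.0,
                     y_min=-5.0, y_max=5.0,
                     z_min=-1.0, z_max=3.0):
    """points: list of (Px1, Py1, Pz1) tuples in the lidar coordinate system.
    Returns only the points strictly inside the clipping cuboid."""
    return [(x, y, z) for (x, y, z) in points
            if x_min < x < x_max and y_min < y < y_max and z_min < z < z_max]

# points outside the cuboid (too far in x1, too high in z1) are discarded
kept = clip_point_cloud([(5.0, 1.0, 0.5), (25.0, 0.0, 0.0), (3.0, -2.0, 4.0)])
```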
Further, the clipped laser point cloud data is rasterized in the x1-axis and y1-axis directions of the laser radar coordinate system at a pitch of d meters, and each sampling point included in the laser point cloud data is then projected onto a grid of the grid map. For any point P, the indices in the x1-axis and y1-axis directions are given by formula (2):
i = (P_x1 - P_x1_min) / d
j = (P_y1 - P_y1_min) / d    (2)
where i denotes the rasterization index of point P in the x1-axis direction, P_x1 denotes the coordinate of point P in the x1-axis direction, P_x1_min denotes the minimum value in the x1-axis direction set when performing point cloud clipping, and d is the rasterization pitch. Similarly, j denotes the rasterization index of point P in the y1-axis direction, P_y1 denotes the coordinate of point P in the y1-axis direction, and P_y1_min denotes the minimum value in the y1-axis direction set when performing point cloud clipping. In practical application, an index is an integer, so in the embodiment of the invention the obtained result is rounded; the grid in column i and row j is denoted by (i, j), and all three-dimensional points in that grid are denoted by I_(i,j).
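A sketch of the index computation in formula (2); rounding to the nearest integer follows the text above, and the function name is illustrative:

```python
def grid_index(px1, py1, px1_min, py1_min, d):
    """Map a point (Px1, Py1) to its grid indices (i, j) at grid pitch d."""
    i = round((px1 - px1_min) / d)  # rasterization index along the x1 axis
    j = round((py1 - py1_min) / d)  # rasterization index along the y1 axis
    return i, j
```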
Further, for each grid, the maximum value in the z1-axis direction, the minimum value in the z1-axis direction and the obstacle height need to be determined, as shown in the following formulas:
z1_max(i,j) = max{P_z1 : P ∈ I_(i,j)}    (3)
z1_min(i,j) = min{P_z1 : P ∈ I_(i,j)}    (4)
height(i,j) = z1_max(i,j) - z1_min(i,j)    (5)
where height(i,j) represents the height of the obstacle in the (i,j)-th grid, z1_max(i,j) represents the maximum height value of the (i,j)-th grid, z1_min(i,j) represents the minimum height value of the (i,j)-th grid, I_(i,j) represents all three-dimensional points in the grid of row j and column i, P represents one three-dimensional point, and P_z1 represents the coordinate value of the point in the z1-axis direction.
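The per-grid maximum, minimum and height of formulas (3) to (5) can be sketched as follows (pure Python; the grouping of points by grid index is an implementation assumption, and the function name is illustrative):

```python
from collections import defaultdict

def grid_heights(points, px1_min, py1_min, d):
    """points: (Px1, Py1, Pz1) tuples; returns {(i, j): obstacle height}."""
    cells = defaultdict(list)
    for (x, y, z) in points:
        # formula (2): grid indices, rounded to the nearest integer
        i = round((x - px1_min) / d)
        j = round((y - py1_min) / d)
        cells[(i, j)].append(z)
    # formulas (3)-(5): height is z1_max minus z1_min within each grid
    return {ij: max(zs) - min(zs) for ij, zs in cells.items()}
```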
In step 102, before determining the corresponding position and gray value of each grid in the binarized image, the relationship between the laser radar coordinate system and the binarized pixel coordinate system needs to be determined. As shown in fig. 5, the directions of the x1 axis and the x2 axis of the two coordinate systems are consistent, but the directions of the y1 axis and the y2 axis are opposite.
Specifically, the (i,j)-th grid corresponds to the pixel at row j, column i of the binarized image, whose gray value is given by formula (6):
f(i,j) = 1 if height(i,j) > thresh, and f(i,j) = 0 otherwise    (6)
That is, for the (i,j)-th grid, the gray value f(i,j) at row j, column i of the binarized image is determined by comparing the height value with a threshold: if the height of the (i,j)-th grid is greater than the threshold thresh, the gray value is 1; otherwise, it is 0.
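A sketch of the thresholding in formula (6), mapping grid (i, j) to row j, column i of the binary image (function name and data layout are illustrative):

```python
def binarize(heights, rows, cols, thresh):
    """heights: {(i, j): height}; returns a rows x cols binary image of 0/1 values."""
    img = [[0] * cols for _ in range(rows)]
    for (i, j), h in heights.items():
        if h > thresh:
            img[j][i] = 1   # grid (i, j) lands at row j, column i
    return img
```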
In practical applications, the dilation (expansion) operation is a morphological operation; the structuring element b has gray value 1, is symmetric, and has its origin at its center. When the origin of b is located at (x2, y2) of the binarized image, the dilation of the image f at (x2, y2) by the structuring element b is defined as the maximum of f over the region overlapped by b, i.e.

(f ⊕ b)(x2, y2) = max { f(x2 + s, y2 + t) | (s, t) ∈ b } (7)

wherein (f ⊕ b)(x2, y2) denotes the dilation result at position (x2, y2), and (s, t) denotes the coordinates within the structuring element b.
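A direct sketch of this maximum-over-neighborhood definition, assuming a flat square structuring element of radius 1 centred on the origin (the patent does not fix b's exact shape, so the square window is an assumption):

```python
import numpy as np

def dilate(f, b_radius=1):
    """Morphological dilation of a binary image f with a flat, symmetric
    (2r+1) x (2r+1) structuring element whose origin is at its center.
    out[y, x] is the maximum of f over the window covered by b, clipped
    at the image border."""
    rows, cols = f.shape
    out = np.zeros_like(f)
    for y in range(rows):
        for x in range(cols):
            y0, y1 = max(0, y - b_radius), min(rows, y + b_radius + 1)
            x0, x1 = max(0, x - b_radius), min(cols, x + b_radius + 1)
            out[y, x] = f[y0:y1, x0:x1].max()
    return out
```

Dilating a single foreground pixel grows it into a full 3x3 block, which is how nearby obstacle pixels are merged before hole filling. In production code a library routine such as OpenCV's `cv2.dilate` would replace this explicit loop.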
In step 103, hole filling is performed on the binary image to obtain position coordinate information of the obstacle in the binary pixel coordinate system, which specifically includes:
As shown in fig. 6, for the binarized image f 1 provided by the embodiment of the present invention, the peripheral boundary is extended: a row of pixels with gray value 0 is added before the first row and after the last row of f 1, and a column of pixels with gray value 0 is added to the left of the first column and to the right of the last column. The new binarized image obtained in this way is denoted by f extern_1, as shown in fig. 7.
Further, the (0, 0)-th pixel of the padded image f extern_1 is used as a seed point and flood filling is performed: every pixel connected to the seed with the same gray value is filled white (1). After this process, only the gray values inside the holes of the obstacles remain 0, while the gray values of all other background pixels are changed to 1. The resulting image is denoted by f extern_2, as shown in fig. 8.
Further, boundary pixels of the binarized image represented by f extern_2 are removed, resulting in an image f 2 having the same size as the binarized image shown in fig. 6, as shown in fig. 9.
Further, the binarized image f 3 after hole filling is obtained by the following formula, as shown in fig. 10:
f3=f1|(~f2) (8)
Wherein f 1 represents the binarized image of the laser point cloud data, f 3 represents the binarized image after hole filling, and ~f 2 represents f 2 with the gray values of all its pixels inverted.
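The pad, flood-fill, crop, and formula (8) steps can be combined into one sketch. The breadth-first flood fill over 4-connected zero pixels is an implementation choice (the patent only describes the seed and the outcome), and the function name is assumed:

```python
import numpy as np
from collections import deque

def fill_holes(f1):
    """Hole filling: pad the border of f1 with zeros (f_extern_1),
    flood-fill the background from pixel (0, 0) of the padded image
    (f_extern_2), strip the border back off to get f2, then apply
    formula (8): f3 = f1 | ~f2."""
    fe = np.pad(f1, 1, constant_values=0)          # f_extern_1
    rows, cols = fe.shape
    filled = fe.copy()
    filled[0, 0] = 1
    q = deque([(0, 0)])
    while q:                                       # flood fill the background
        y, x = q.popleft()
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < rows and 0 <= nx < cols and filled[ny, nx] == 0:
                filled[ny, nx] = 1
                q.append((ny, nx))
    f2 = filled[1:-1, 1:-1]                        # remove the added border
    return f1 | (1 - f2)                           # f3 = f1 | ~f2
```

A one-pixel ring with a hollow centre comes back fully solid: the flood fill reaches every background pixel except the enclosed hole, so ~f2 recovers exactly the hole, and the union with f1 fills it.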
Further, the pixels with gray value 1 in the hole-filled binarized image f 3 are classified; different connected regions correspond to different obstacles, so a pixel set is obtained for each obstacle. For each obstacle pixel set, the center coordinates, length and width of the obstacle in the binarized image can be determined by the following formulas:
Wherein P center represents the center coordinate of the obstacle in the binarized image, P i represents each pixel in the obstacle pixel set, pixel_width represents the width of the obstacle in the binarized image, pixel_length represents the length of the obstacle in the binarized image, P is a pixel in one obstacle pixel set S, P x2 represents the coordinate of the pixel on the x2 axis of the binarized pixel coordinate system, and P y2 represents the coordinate of the pixel on the y2 axis of the binarized pixel coordinate system, wherein the binarized pixel coordinate system comprises the x2 axis, the y2 axis and the o2 point.
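A sketch of the connected-component grouping and the per-obstacle centre/extent computation. 4-connectivity, the centre as the mean of pixel coordinates, and width/length as the x2/y2 coordinate extents are assumptions consistent with, but not fully specified by, the formulas above:

```python
import numpy as np
from collections import deque

def obstacle_boxes(f3):
    """Group foreground pixels of the hole-filled image into 4-connected
    components; for each obstacle pixel set S return the centre (mean of
    pixel coordinates) and the width/length as coordinate extents."""
    rows, cols = f3.shape
    seen = np.zeros_like(f3, dtype=bool)
    boxes = []
    for y in range(rows):
        for x in range(cols):
            if f3[y, x] == 1 and not seen[y, x]:
                comp, q = [], deque([(y, x)])
                seen[y, x] = True
                while q:                           # BFS over one component
                    cy, cx = q.popleft()
                    comp.append((cx, cy))          # stored as (x2, y2)
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and f3[ny, nx] == 1 and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
                xs = [p[0] for p in comp]
                ys = [p[1] for p in comp]
                center = (sum(xs) / len(comp), sum(ys) / len(comp))
                boxes.append((center, max(xs) - min(xs), max(ys) - min(ys)))
    return boxes
```

Each entry is ((P_center_x2, P_center_y2), pixel_width, pixel_length); a library labeller such as `scipy.ndimage.label` could replace the explicit BFS.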
In the embodiment of the invention, the directions of the x1 axis of the laser radar coordinate system and the x2 axis of the binarized pixel coordinate system are consistent, but the directions of the y1 axis and the y2 axis are opposite. Therefore, the y2 value of the obstacle's center coordinate in the binarized pixel coordinate system needs to be inverted before applying a scale factor and an offset to convert it into the laser radar coordinate system, whereas the x2 value is directly multiplied by the scale factor and added to the offset.
The position coordinate information of the obstacle under the laser radar coordinate system is obtained through the following formula:
Wherein, P x1, P y1, P z1 represent the coordinates of the obstacle in the laser radar coordinate system, P x2, P y2 represent the coordinates of the obstacle in the binarized pixel coordinate system, P x1_min represents the minimum x1 axis coordinate of the laser point cloud data when clipping, P y1_min represents the minimum y1 axis coordinate of the laser point cloud data when clipping, d represents the grid pitch set when the laser point cloud data is projected onto the grid map, obs_width represents the width of the obstacle in the laser radar coordinate system, obs_length represents the length of the obstacle in the laser radar coordinate system, pixel_width represents the width of the obstacle in the binarized pixel coordinate system, and pixel_length represents the length of the obstacle in the binarized pixel coordinate system.
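The pixel-to-lidar conversion can be sketched as below. The patent states only "invert y2, then scale and offset"; inverting the row index against the image height (n_rows) is one consistent choice and is an assumption, as are the function and parameter names:

```python
def pixel_to_lidar(p_x2, p_y2, pixel_w, pixel_l, p_x1_min, p_y1_min, d, n_rows):
    """Convert an obstacle's pixel-frame centre and extents to the lidar
    frame.  x2 and x1 point the same way, so x is a direct scale-and-offset;
    y2 runs opposite to y1, so the row index is inverted against the image
    height before scaling.  Extents scale by the grid pitch d."""
    p_x1 = p_x2 * d + p_x1_min
    p_y1 = (n_rows - 1 - p_y2) * d + p_y1_min     # invert, then scale + offset
    obs_width = pixel_w * d
    obs_length = pixel_l * d
    return p_x1, p_y1, obs_width, obs_length
```

With d = 0.5 m, a 4-pixel-wide obstacle maps to a 2 m wide obstacle in the lidar frame.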
In the above steps, road surface unevenness and pitch angle are not considered during obstacle detection, so the detected obstacle may appear closer to the vehicle than it actually is, which may cause unnecessary obstacle avoidance. In the embodiment of the present invention, the values determined above therefore need to be corrected for pitch angle on the basis of the original calculation result.
Specifically, as shown in fig. 11, the lidar is mounted on the moving carrier, and the directions of its coordinate axes coincide with those of the moving carrier's coordinate axes. The X4O4Y4 plane of the map coordinate system is parallel to the ground; at the moment considered, the moving carrier has a pitch angle θ (obtained from the RTK). P is the actual obstacle position; when detecting the obstacle using the grid map-based method, P is projected to the point P 1 on the X1O1Y1 plane of the lidar coordinate system, and the height information is lost. Because the X4O4Y4 plane remains horizontal with respect to the ground, when the detected obstacle center point P 1 is converted into the map coordinate system, the horizontal distance of the obstacle from the lidar appears as S1, whereas the actual horizontal distance is S2, owing to the pitch angle. To restore the horizontal distance of the obstacle from the lidar coordinate origin to S2, that is, to correct the point P 1 to the point P 2, the following procedure is adopted:
According to the pitch angle θ of the moving carrier and the third distance L between the second projection point P 1 of the obstacle projected onto the laser radar coordinate system and the coordinate origin o1 of the laser radar coordinate system, the first distance S2 between the obstacle P and the coordinate origin o1 of the laser radar coordinate system is determined by the following formula (14):
S2=L/cosθ (14)
According to the first distance S2 between the obstacle P and the coordinate origin o1 of the laser radar coordinate system and the pitch angle θ of the moving carrier, the second distance L' between the first projection point P 2 of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system is determined, as shown in formula (15):
L'=S2/cosθ=L/cos2θ (15)
Wherein S2 is a first distance between the obstacle P and the origin o1 of the lidar coordinate system, θ is a pitch angle of the moving carrier, L is a third distance between the second projection point P 1 of the obstacle projected onto the lidar coordinate system and the origin o1 of the lidar coordinate system, and L' is a second distance between the first projection point P 2 of the obstacle projected onto the lidar coordinate system and the origin of the lidar coordinate system.
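Formulas (14) and (15) reduce to two divisions by cos θ; a sketch (function name assumed):

```python
import math

def pitch_corrected_distance(L, theta):
    """Pitch correction: S2 = L / cos(theta) recovers the true horizontal
    distance of the obstacle, and L' = S2 / cos(theta) = L / cos(theta)**2
    is the corrected projection distance used to move P1 to P2."""
    s2 = L / math.cos(theta)
    l_prime = s2 / math.cos(theta)
    return s2, l_prime
```

At θ = 0 the correction is the identity (L' = L); at θ = 60° the corrected projection distance is four times the measured one, showing why ignoring pitch makes obstacles appear too close.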
Further, according to the second distance between the first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system, which is determined by the steps, the position coordinate information of the obstacle under the laser radar coordinate system is corrected.
Step 105: through the above steps, the center point, length and width of the obstacle in the laser radar coordinate system have been obtained. From calibration, the transformation matrix from the laser radar coordinate system to the moving carrier coordinate system can be determined as T cl. This is a 4x4 matrix comprising rotation and translation: the upper-left 3x3 block is the rotation matrix, the first three rows of the last column are the translation vector, the element in the fourth row and fourth column is 1, and the remaining elements of the fourth row are 0. Meanwhile, from the RTK positioning information, the pose T mc of the whole machine in the map coordinate system is known; this transformation matrix has the same structure as T cl, comprising rotation and translation. The following transformation converts the center point of the obstacle into the map coordinate system:
wherein the left-hand side represents the homogeneous coordinates of the obstacle center point in the map coordinate system, and the right-hand factor represents the homogeneous coordinates of the obstacle center point in the laser radar coordinate system.
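The chained transformation can be sketched as a homogeneous matrix product (function name assumed; composition order T_mc @ T_cl follows the lidar → carrier → map chain described above):

```python
import numpy as np

def obstacle_to_map(p_lidar, T_cl, T_mc):
    """Lift the obstacle centre to homogeneous coordinates and apply
    T_mc @ T_cl (lidar -> moving carrier -> map).  Both transforms are
    4x4 with a 3x3 rotation block, translation in the last column, and
    bottom row [0, 0, 0, 1]."""
    p_h = np.append(np.asarray(p_lidar, dtype=float), 1.0)  # homogeneous
    p_map = T_mc @ T_cl @ p_h
    return p_map[:3]
```

For example, with T_mc the identity and T_cl a pure 1 m translation along x, the lidar-frame origin maps to (1, 0, 0) in the map frame.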
In summary, an embodiment of the present invention provides a method and an apparatus for detecting a lidar obstacle, where the method includes: cutting laser point cloud data of the obstacles detected by the laser radar, projecting the cut laser point cloud data onto a grid map, and determining the height of the obstacles in each grid; determining the corresponding position of each grid in the binary image and the gray value of the position according to the relation between the laser radar coordinate system and the binary pixel coordinate system to obtain a binary image of laser point cloud data; hole filling is carried out on the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and the position coordinate information of the obstacle in the binary pixel coordinate system is converted into a laser radar coordinate system to obtain position coordinate information of the obstacle in the laser radar coordinate system; determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correcting position coordinate information of the obstacle under the laser radar coordinate system according to the second distance; and converting the corrected position coordinate information of the obstacle in the laser radar coordinate system into the map coordinate system to obtain the position and the shape of the obstacle.
The method improves the grid map-based laser radar obstacle detection method and corrects the problem that, owing to the loss of some information in the grid map, the detected obstacle appears closer to the moving carrier than it actually is, thereby improving the accuracy of obstacle detection. It also solves the prior-art problem that an excavator on uneven ground has a certain pitch angle, which makes obstacles appear relatively close to the excavator.
Based on the same inventive concept, an embodiment of the present invention further provides a laser radar obstacle detection device. Since the principle by which the device solves the technical problem is similar to that of the laser radar obstacle detection method, the implementation of the device may refer to the implementation of the method, and repeated description is omitted.
Fig. 12 is a schematic structural diagram of a lidar obstacle detection device according to an embodiment of the present invention, where, as shown in fig. 12, the device mainly includes: a determining unit 201, a first obtaining unit 202, a second obtaining unit 203, a correcting unit 204 and a third obtaining unit 205.
A determining unit 201, configured to clip laser point cloud data of an obstacle detected by the laser radar, then project the clipped data onto a grid map, and determine an obstacle height in each grid;
A first obtaining unit 202, configured to determine, according to a relationship between the laser radar coordinate system and the binarized pixel coordinate system, a position corresponding to each grid in the binarized image and a gray value of the position where each grid is located, so as to obtain a binarized image of laser point cloud data;
A second obtaining unit 203, configured to perform hole filling on the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and convert the position coordinate information of the obstacle in the binary pixel coordinate system to a laser radar coordinate system to obtain position coordinate information of the obstacle in the laser radar coordinate system;
A correction unit 204, configured to determine a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and a coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correct position coordinate information of the obstacle under the laser radar coordinate system according to the second distance;
and a third obtaining unit 205, configured to convert the corrected position coordinate information of the obstacle in the laser radar coordinate system into the map coordinate system, so as to obtain the position and shape of the obstacle.
Further, the correction unit 204 is specifically configured to:
Determining a first distance between the obstacle and the coordinate origin of the laser radar coordinate system according to the pitch angle of the moving carrier and a third distance between a second projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system (S2);
And determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system and the pitch angle of the mobile carrier.
Further, the second obtaining unit 203 is specifically configured to:
Performing boundary expansion on the binarized image to obtain f extern_1; taking the (0, 0) th pixel of the binarized image as a seed point, and filling in the diffuse water to obtain f extern_2; removing boundary pixels of the f extern_2 to obtain an image f 2 with the same size as the binarized image; the binarized image after hole filling is obtained by the following formula:
f3=f1|(~f2)
the center coordinates, length and width of the obstacle in the binarized image are determined by the following formula:
Wherein f 1 represents the binarized image of the laser point cloud data, f 3 represents the binarized image after hole filling, ~f 2 represents f 2 with the gray values of all its pixels inverted, P center represents the center coordinate of the obstacle in the binarized image, P i represents each pixel in the obstacle pixel set, pixel_width represents the width of the obstacle in the binarized image, pixel_length represents the length of the obstacle in the binarized image, P is a pixel in one obstacle pixel set S, P x2 represents the coordinate of the pixel on the x2 axis of the binarized pixel coordinate system, and P y2 represents the coordinate of the pixel on the y2 axis of the binarized pixel coordinate system, wherein the binarized pixel coordinate system comprises the x2 axis, the y2 axis and the o2 point.
Further, the second obtaining unit 203 is specifically configured to:
The position coordinate information of the obstacle under the laser radar coordinate system is obtained through the following formula:
/>
Wherein, P x1,Py1,Pz1 represents the coordinates of the obstacle in the laser radar coordinate system, P x2Py2 represents the coordinates of the obstacle in the binarized pixel coordinate system, P x1_min represents the minimum x1 axis coordinates of the laser point cloud data when clipping, P y1_min represents the minimum y1 axis coordinates of the laser point cloud data when clipping, d represents the grid pitch set when the laser point cloud data is projected onto the grid map, obs_width represents the width of the obstacle in the laser radar coordinate system, obs_length represents the length of the obstacle in the laser radar coordinate system, pixel_width represents the width of the obstacle in the binarized pixel coordinate system, and pixel_length represents the length of the obstacle in the binarized pixel coordinate system.
It should be understood that the units included in the above laser radar obstacle detection device are divided only logically according to the functions implemented by the device; in practical applications, the above units may be combined or further split. The functions achieved by the laser radar obstacle detection device provided in this embodiment correspond one-to-one to the laser radar obstacle detection method provided in the above embodiment; the detailed processing flow implemented by the device is described in detail in the method embodiment above and will not be repeated here.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. A method for detecting a lidar obstacle, comprising:
Cutting laser point cloud data of the obstacles detected by the laser radar, projecting the cut laser point cloud data onto a grid map, and determining the height of the obstacles in each grid;
Determining the corresponding position of each grid in the binary image and the gray value of the position according to the relation between the laser radar coordinate system and the binary pixel coordinate system to obtain a binary image of laser point cloud data;
Hole filling is carried out on the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and the position coordinate information of the obstacle in the binary pixel coordinate system is converted into a laser radar coordinate system to obtain position coordinate information of the obstacle in the laser radar coordinate system;
Determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correcting position coordinate information of the obstacle under the laser radar coordinate system according to the second distance;
And converting the corrected position coordinate information of the obstacle in the laser radar coordinate system into the map coordinate system to obtain the position and the shape of the obstacle.
2. The method for detecting a lidar obstacle according to claim 1, wherein the determining the second distance between the first projection point of the obstacle projected onto the lidar coordinate system and the origin of coordinates of the lidar coordinate system based on the first distance between the obstacle and the origin of coordinates of the lidar coordinate system comprises:
determining a first distance between the obstacle and the coordinate origin of the laser radar coordinate system according to the pitch angle of the mobile carrier and a third distance between a second projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system;
And determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system and the pitch angle of the mobile carrier.
3. The method for detecting a lidar obstacle according to claim 1, wherein the step of clipping laser point cloud data of the obstacle detected by the lidar and projecting the clipped data onto a grid map and determining the height of the obstacle in each grid comprises the steps of:
Acquiring laser point cloud data of an object detected by a laser radar, and cutting the laser point cloud data according to an area set by a laser radar coordinate system;
Establishing a grid map, projecting each sampling point included in the cut laser point cloud data onto each grid in the grid map, and determining the height of an obstacle in each grid;
The height of the obstacle in each grid is:
height(i,j)=z1max(i,j)-z1min(i,j)
wherein height (i, j) represents the height of the obstacle in the (i, j)-th grid, z1max (i,j) represents the maximum height value of the (i, j)-th grid, z1min (i,j) represents the minimum height value of the (i, j)-th grid, I i,j represents all three-dimensional points in the (i, j)-th grid, P represents one three-dimensional point, P z1 represents the coordinate value of the point in the z1 axis direction, the z1 axis is located in the laser radar coordinate system, and the laser radar coordinate system comprises an x1 axis, a y1 axis, a z1 axis and an o1 point.
4. The method for detecting a lidar obstacle according to claim 1, wherein the determining the corresponding position of each grid in the binary image and the gray value of the position according to the relationship between the lidar coordinate system and the binary pixel coordinate system specifically comprises:
The laser radar coordinate system comprises an x1 axis, a y1 axis, a z1 axis and an o1 point, the binarization pixel coordinate system comprises an x2 axis, a y2 axis and an o2 point, the directions of the x1 axis and the x2 axis are consistent, and the directions of the y1 axis and the y2 axis are opposite;
The corresponding position of the (i, j) th grid in the binarized image is the j th row and the i th column, and the gray value is determined by the following formula:
f (i, j) represents a gray value of the (i, j) th grid, and when the obstacle height of the (i, j) th grid is greater than the threshold value thresh, the gray value of the (i, j) th grid is 1; when the obstacle height of the (i, j) th grid is smaller than the threshold value thresh, the gray value of the (i, j) th grid is 0.
5. The method for detecting a lidar obstacle according to claim 1, wherein the hole filling is performed on the binarized image to obtain position coordinate information of the obstacle in the binarized image, specifically comprising:
Performing boundary expansion on the binarized image to obtain f extern_1; taking the (0, 0) th pixel of f extern_1 as a seed point, and filling with water to obtain f extern_2; removing boundary pixels of the f extern_2 to obtain an image f 2 with the same size as the binarized image; the binarized image after hole filling is obtained by the following formula:
f3=f1|(~f2)
the center coordinates, length and width of the obstacle in the binarized image are determined by the following formula:
Wherein f 1 represents a binarized image of laser point cloud data, f 3 represents a binarized image after hole filling, f 2 represents a gray value of all pixels to be inverted, P center represents a center coordinate of an obstacle in the binarized image, P i represents each pixel in an obstacle pixel set, pixel_width represents a width of the obstacle in the binarized image, pixel_length represents a length of the obstacle in the binarized image, P is a pixel in one obstacle pixel set S, P x2 represents a coordinate of the pixel on an x2 axis included in a binarized pixel coordinate system, and P y2 represents a coordinate of the pixel on a y2 axis included in the binarized pixel coordinate system, wherein the binarized pixel coordinate system includes the x2 axis, the y2 axis and the o2 point.
6. The method for detecting a lidar obstacle according to claim 1, wherein the converting the position coordinate information of the obstacle in the binarized pixel coordinate system into the lidar coordinate system to obtain the position coordinate information of the obstacle in the lidar coordinate system specifically comprises:
The position coordinate information of the obstacle under the laser radar coordinate system is obtained through the following formula:
Wherein, P x1,Py1,Pz1 represents the coordinates of the obstacle in the laser radar coordinate system, P x2Py2 represents the coordinates of the obstacle in the binarized pixel coordinate system, P x1_min represents the minimum x1 axis coordinates of the laser point cloud data when clipping, P y1_min represents the minimum y1 axis coordinates of the laser point cloud data when clipping, d represents the grid pitch set when the laser point cloud data is projected onto the grid map, obs_width represents the width of the obstacle in the laser radar coordinate system, obs_length represents the length of the obstacle in the laser radar coordinate system, pixel_width represents the width of the obstacle in the binarized pixel coordinate system, and pixel_length represents the length of the obstacle in the binarized pixel coordinate system.
7. A lidar obstacle detection device, comprising:
The determining unit is used for cutting laser point cloud data of the obstacles detected by the laser radar, projecting the cut laser point cloud data onto the grid map and determining the height of the obstacles in each grid;
The first obtaining unit is used for determining the corresponding position of each grid in the binary image and the gray value of the position of each grid according to the relation between the laser radar coordinate system and the binary pixel coordinate system to obtain a binary image of laser point cloud data;
The second obtaining unit is used for filling holes in the binary image of the laser point cloud data to obtain position coordinate information of the obstacle in a binary pixel coordinate system, and converting the position coordinate information of the obstacle in the binary pixel coordinate system into a laser radar coordinate system to obtain the position coordinate information of the obstacle in the laser radar coordinate system;
The correction unit is used for determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system, and correcting position coordinate information of the obstacle under the laser radar coordinate system according to the second distance;
and the third obtaining unit is used for converting the corrected position coordinate information of the obstacle in the laser radar coordinate system into the map coordinate system to obtain the position and the shape of the obstacle.
8. The lidar obstacle detection device according to claim 7, wherein the correction unit is specifically configured to:
determining a first distance between the obstacle and the coordinate origin of the laser radar coordinate system according to the pitch angle of the mobile carrier and a third distance between a second projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system;
And determining a second distance between a first projection point of the obstacle projected onto the laser radar coordinate system and the coordinate origin of the laser radar coordinate system according to a first distance between the obstacle and the coordinate origin of the laser radar coordinate system and the pitch angle of the mobile carrier.
9. The lidar obstacle detection device according to claim 7, wherein the second obtaining unit is specifically configured to:
Performing boundary expansion on the binarized image to obtain f extern_1; taking the (0, 0) th pixel of the binarized image as a seed point, and filling in the diffuse water to obtain f extern_2; removing boundary pixels of the f extern_2 to obtain an image f 2 with the same size as the binarized image; the binarized image after hole filling is obtained by the following formula:
f3=f1|(~f2)
the center coordinates, length and width of the obstacle in the binarized image are determined by the following formula:
Wherein f 1 represents the binarized image of the laser point cloud data, f 3 represents the binarized image after hole filling, ~f 2 represents f 2 with the gray values of all its pixels inverted, P center represents the center coordinate of the obstacle in the binarized image, P i represents each pixel in the obstacle pixel set, pixel_width represents the width of the obstacle in the binarized image, pixel_length represents the length of the obstacle in the binarized image, P is a pixel in one obstacle pixel set S, P x2 represents the coordinate of the pixel on the x2 axis of the binarized pixel coordinate system, and P y2 represents the coordinate of the pixel on the y2 axis of the binarized pixel coordinate system, wherein the binarized pixel coordinate system comprises the x2 axis, the y2 axis and the o2 point.
10. The lidar obstacle detection device according to claim 7, wherein the second obtaining unit is specifically configured to:
The position coordinate information of the obstacle under the laser radar coordinate system is obtained through the following formula:
Wherein, P x1,Py1,Pz1 represents the coordinates of the obstacle in the laser radar coordinate system, P x2Py2 represents the coordinates of the obstacle in the binarized pixel coordinate system, P x1_min represents the minimum x1 axis coordinates of the laser point cloud data when clipping, P y1_min represents the minimum y1 axis coordinates of the laser point cloud data when clipping, d represents the grid pitch set when the laser point cloud data is projected onto the grid map, obs_width represents the width of the obstacle in the laser radar coordinate system, obs_length represents the length of the obstacle in the laser radar coordinate system, pixel_width represents the width of the obstacle in the binarized pixel coordinate system, and pixel_length represents the length of the obstacle in the binarized pixel coordinate system.
CN202210155073.2A 2022-02-21 2022-02-21 Laser radar obstacle detection method and device Active CN114660568B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210155073.2A CN114660568B (en) 2022-02-21 2022-02-21 Laser radar obstacle detection method and device

Publications (2)

Publication Number Publication Date
CN114660568A CN114660568A (en) 2022-06-24
CN114660568B true CN114660568B (en) 2024-04-30

Family

ID=82026899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210155073.2A Active CN114660568B (en) 2022-02-21 2022-02-21 Laser radar obstacle detection method and device

Country Status (1)

Country Link
CN (1) CN114660568B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115147809B (en) * 2022-06-30 2023-09-22 阿波罗智能技术(北京)有限公司 Obstacle detection method, device, equipment and storage medium
CN115100630B (en) * 2022-07-04 2023-07-14 小米汽车科技有限公司 Obstacle detection method, obstacle detection device, vehicle, medium and chip
WO2024140348A1 (en) * 2022-12-30 2024-07-04 北京石头创新科技有限公司 Obstacle recognition method, apparatus, electronic device, and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN106052674A (en) * 2016-05-20 2016-10-26 青岛克路德机器人有限公司 Indoor robot SLAM method and system
CN106997049A (en) * 2017-03-14 2017-08-01 奇瑞汽车股份有限公司 Method and apparatus for detecting obstacles based on laser point cloud data
CN108983248A (en) * 2018-06-26 2018-12-11 长安大学 Networked vehicle localization method based on 3D laser radar and V2X
WO2022022694A1 (en) * 2020-07-31 2022-02-03 北京智行者科技有限公司 Method and system for sensing automated driving environment

Non-Patent Citations (1)

Title
Obstacle recognition using fusion of laser radar and camera information; Huang Xing; Ying Qunwei; Computer Measurement & Control; 2020-01-25 (01); full text *

Also Published As

Publication number Publication date
CN114660568A (en) 2022-06-24

Similar Documents

Publication Publication Date Title
CN114660568B (en) Laser radar obstacle detection method and device
CN109270534B (en) Intelligent vehicle laser sensor and camera online calibration method
CN111311689B (en) Method and system for calibrating relative external parameters of laser radar and camera
CN111583337A (en) Omnibearing obstacle detection method based on multi-sensor fusion
WO2018060313A1 (en) Methods and systems for generating and using localisation reference data
CN112017251A (en) Calibration method and device, road side equipment and computer readable storage medium
CN117441113A (en) Vehicle-road cooperation-oriented perception information fusion representation and target detection method
JP2018533721A (en) Method and system for generating and using localization reference data
CN113496491B (en) Road surface segmentation method and device based on multi-line laser radar
CN106774296A (en) A kind of disorder detection method based on laser radar and ccd video camera information fusion
CN112464812B (en) Vehicle-based concave obstacle detection method
CN111699410B (en) Processing method, equipment and computer readable storage medium of point cloud
CN112581612A (en) Vehicle-mounted grid map generation method and system based on fusion of laser radar and look-around camera
JP2014067406A (en) Method and apparatus for detecting continuous road partition
CN110631589B (en) Method for correcting positioning track in real time
CN112414403B (en) Robot positioning and attitude determining method, equipment and storage medium
CN109948413A (en) Method for detecting lane lines based on the fusion of high-precision map
EP4403879A1 (en) Vehicle, vehicle positioning method and apparatus, device, and computer-readable storage medium
CN113989766A (en) Road edge detection method and road edge detection equipment applied to vehicle
CN113985405A (en) Obstacle detection method and obstacle detection equipment applied to vehicle
CN110208802B (en) Obstacle detection method fusing multi-view fuzzy reasoning assignment
CN111401176B (en) Road edge detection method based on multi-line laser radar
CN111380529B (en) Mobile device positioning method, device and system and mobile device
CN116385994A (en) Three-dimensional road route extraction method and related equipment
WO2022194110A1 (en) External parameter calibration method and apparatus, device, server and vehicle-mounted computing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant