CN117671648A - Obstacle point detection method, obstacle point detection device and storage medium - Google Patents
- Publication number: CN117671648A (application CN202410150050.1A)
- Authority: CN (China)
- Prior art keywords: data, point cloud, point, coordinate system, obstacle
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
Abstract
The application relates to the technical field of data processing and discloses a method, a device, and a storage medium for detecting obstacle points. The method includes: acquiring data of a historical point cloud collected in a first historical period while the mobile device travels; performing plane fitting on the historical point cloud data to obtain a target fitting plane in a target coordinate system; acquiring data of the current point cloud collected by the mobile device at the current moment; if the current point cloud data is not expressed in the target coordinate system, converting it into the target coordinate system; calculating the first distance between each point in the current point cloud and the target fitting plane in the target coordinate system, obtaining a plurality of first distances; and determining the points whose first distances are greater than a first distance threshold as obstacle points at the current moment. With this method, obstacles in the travel path of a mobile device can be effectively identified.
Description
Technical Field
The embodiments of the present application relate to the technical field of data processing, and in particular to a method and a device for detecting obstacle points and a computer-readable storage medium.
Background
In industries such as robotics and autonomous driving, whether an obstacle exists in the travel path of a mobile device must be detected in real time. Once an obstacle is detected, the device must respond in time, for example by taking avoidance actions; if an obstacle cannot be detected promptly and accurately, an accident may result.
Some low obstacles are shorter than the mobile device itself, and such an obstacle may fall into the detection blind zone of the sensor used for obstacle detection (i.e., the main sensor) of the mobile device, in which case the obstacle cannot be detected in a timely and effective manner. An effective method for detecting low obstacles is therefore currently lacking.
Disclosure of Invention
In view of the above problems, embodiments of the present application provide a method, an apparatus, a device, and a storage medium for detecting obstacle points, which are intended to solve the prior-art problem that low obstacles in the travel path of a mobile device cannot be effectively detected.
According to an aspect of the embodiments of the present application, there is provided a method for detecting obstacle points in the travel path of a mobile device, the method including: acquiring data of a historical point cloud collected in a first historical period while the mobile device travels; performing plane fitting using the data of the historical point cloud to obtain a target fitting plane in a target coordinate system; acquiring data of the current point cloud collected by the mobile device at the current moment; if the data of the current point cloud is not data in the target coordinate system, converting it into data in the target coordinate system; calculating the first distance between each point in the current point cloud and the target fitting plane in the target coordinate system, obtaining a plurality of first distances, wherein each point corresponds to one first distance; and determining the points whose first distances are greater than a first distance threshold as obstacle points at the current moment.
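The final two claimed steps, computing point-to-plane distances and thresholding them, can be sketched as follows. This is a minimal illustration, not the patent's implementation; the 5 cm threshold is an assumed example within the 5-10 cm range given later in the description.

```python
import numpy as np

def detect_obstacle_points(current_cloud, plane, first_dist_thresh=0.05):
    """Flag points of the current cloud whose first distance to the target
    fitting plane exceeds the first distance threshold."""
    a, b, c, d = plane  # plane: a*x + b*y + c*z + d = 0 in the target frame
    # First distance of every current point to the target fitting plane.
    dists = np.abs(current_cloud @ np.array([a, b, c]) + d) / np.linalg.norm([a, b, c])
    return current_cloud[dists > first_dist_thresh]  # obstacle points at the current moment
```

A point lying 20 cm above a fitted ground plane would be returned as an obstacle point, while points within the threshold band around the plane are treated as ground.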
In an optional manner, performing plane fitting using the data of the historical point cloud to obtain a target fitting plane in a target coordinate system includes: performing plane fitting using data of a plurality of points in the historical point cloud to obtain a fitting plane in the target coordinate system; calculating the second distance between each of the plurality of points used to fit the plane and the fitting plane, obtaining a plurality of second distances, wherein each of the points corresponds to one second distance; determining the points whose second distances are smaller than a second distance threshold as interior points; calculating the ratio of the number of interior points to the number of points used for fitting, obtaining an interior point rate; if the interior point rate is smaller than an interior point rate threshold, either repeating the above steps until the interior point rate is greater than or equal to the threshold, wherein the sets of points used in successive plane fittings are not entirely identical, or repeating the above steps until the number of repetitions reaches a count threshold and determining the fitting plane with the largest interior point rate as the target fitting plane; and if the interior point rate is greater than or equal to the interior point rate threshold, determining the fitting plane as the target fitting plane.
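The iterative fitting scheme above can be sketched as follows. The SVD-based least-squares fit and all threshold values are illustrative assumptions, not the patent's exact choices; the loop either returns early once the interior point rate reaches the threshold, or returns the best plane seen within the count threshold.

```python
import numpy as np

def fit_plane_svd(points):
    # Least-squares plane through the points: the normal is the direction of
    # least variance, i.e. the last right singular vector of the centered data.
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal[0], normal[1], normal[2], -normal @ centroid

def fit_target_plane(points, second_dist_thresh=0.05, rate_thresh=0.85, count_thresh=20):
    """Refit until the interior point rate reaches the threshold, or keep the
    plane with the largest rate seen within count_thresh repetitions."""
    best_plane, best_rate = None, -1.0
    sample = points
    for _ in range(count_thresh):
        a, b, c, d = fit_plane_svd(sample)               # fitting plane (unit normal)
        second_dists = np.abs(sample @ np.array([a, b, c]) + d)
        inliers = sample[second_dists < second_dist_thresh]
        rate = len(inliers) / len(sample)                # interior point rate
        if rate > best_rate:
            best_plane, best_rate = (a, b, c, d), rate
        if rate >= rate_thresh:                          # good enough: stop early
            return (a, b, c, d)
        if len(inliers) < 3:                             # degenerate: keep best so far
            break
        sample = inliers                                 # refit on the interior points
    return best_plane
```

Refitting on the previous iteration's interior points matches the optional variant described next, in which the Nth fitting reuses the inliers of the (N−1)th.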
In an optional manner, when the step of performing plane fitting using data of a plurality of points in the historical point cloud to obtain a fitting plane in the target coordinate system is executed for the Nth time, the plurality of points used for the plane fitting are the interior points among the points used in the (N−1)th plane fitting, where N is a positive integer and N ≥ 2.
In an alternative manner, the data of the historical point cloud and the data of the current point cloud are data acquired by a depth sensor in the mobile device, and the target coordinate system is a coordinate system constructed with a target point in the mobile device as its origin, wherein the target point does not belong to the depth sensor.
In an optional manner, the data of the historical point cloud is {P_i} (i = 1, …, n) in the depth sensor coordinate system, and the pose of the mobile device corresponding to each point in the historical point cloud is T_i. Performing plane fitting using the data of the historical point cloud to obtain a target fitting plane in a target coordinate system includes: using the formula P_i' = T_c⁻¹ · T_i · T_ext · P_i to convert {P_i} into data of the historical point cloud in the target coordinate system, wherein P_i' is the data of the historical point cloud in the target coordinate system, T_c is the pose of the mobile device at the current moment, and T_ext is the external parameter for converting the depth sensor coordinate system into the target coordinate system; and performing plane fitting using the data {P_i'} of the historical point cloud in the target coordinate system to obtain the target fitting plane in the target coordinate system. If the data of the current point cloud is not data in the target coordinate system, converting the data of the current point cloud into data in the target coordinate system includes: if the data of the current point cloud is data in the depth sensor coordinate system, using the formula P_c' = T_ext · P_c to convert the data of the current point cloud in the depth sensor coordinate system into data in the target coordinate system, wherein P_c' is the data of the current point cloud in the target coordinate system and P_c is the data of the current point cloud in the depth sensor coordinate system obtained by the depth sensor.
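The chain of pose and extrinsic transforms implied above can be illustrated with homogeneous 4×4 matrices. The symbol names (T_i, T_ext, T_c) and the exact composition order are assumptions, since the original formulas are given only symbolically: a sensor-frame point is mapped into the body frame at collection time i via the extrinsic, into the world via the pose, and back into the body frame at the current moment via the inverse of the current pose.

```python
import numpy as np

def make_transform(R, t):
    # 4x4 homogeneous transform from a 3x3 rotation matrix and a translation.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def history_point_to_target(p_sensor, T_i, T_ext, T_cur):
    """Assumed chain: sensor frame -> body frame at collection time i (T_ext)
    -> world frame (T_i) -> body frame at the current moment (T_cur inverse)."""
    p = np.append(p_sensor, 1.0)  # homogeneous coordinates
    return (np.linalg.inv(T_cur) @ T_i @ T_ext @ p)[:3]
```

With pure translations, a point at the sensor origin collected when the body was at x = 1 m ends up 1 m behind a body that has since advanced to x = 2 m.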
In an alternative, the method further comprises: acquiring an obstacle point cloud of a second historical period while the mobile device travels, the obstacle point cloud being the set of obstacle points at each moment within the second historical period; and determining each obstacle point in the obstacle point cloud of the second historical period as an obstacle point at the current moment.
In an optional manner, the data of the obstacle point cloud in the second historical period is {O_j}, where {O_j} is data in the target coordinate system, and the pose of the mobile device corresponding to each point in the obstacle point cloud in the second historical period is T_j. Determining each obstacle point in the obstacle point cloud in the second historical period as an obstacle point at the current moment includes: using the formula O_j' = T_c⁻¹ · T_j · O_j to convert the data of the obstacle point cloud in the second historical period into point cloud data at the current moment, wherein O_j' is the data of the obstacle point cloud at the current moment in the target coordinate system and T_c is the pose of the mobile device at the current moment; and determining the points O_j' as obstacle points at the current moment.
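The carry-forward of previously detected obstacle points described above can be sketched as follows. Because the original formula survives only symbolically, the pose-composition form is an assumption: each stored obstacle point is re-expressed in the frame of the current moment so that blind-zone obstacles the sensor can no longer see stay accounted for.

```python
import numpy as np

def carry_forward_obstacles(obstacle_points, poses, T_cur):
    """Map obstacle points from the second historical period into the frame of
    the current moment: O_j' = inv(T_cur) @ T_j @ O_j (assumed composition)."""
    T_cur_inv = np.linalg.inv(T_cur)
    out = []
    for q, T_j in zip(obstacle_points, poses):
        qh = np.append(q, 1.0)  # homogeneous point
        out.append((T_cur_inv @ T_j @ qh)[:3])
    return np.array(out)
```

For a device that has moved 1 m forward since the obstacle was seen, a stored point 3 m ahead now appears 2 m ahead.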
In an optional manner, the mobile device includes a depth sensor, and the data of the historical point cloud and the data of the current point cloud are coordinate data, obtained through the depth sensor, of detection points located in the detection blind zone of the main sensor of the mobile device.
According to another aspect of the embodiments of the present application, there is provided a detection apparatus for detecting obstacle points in the travel path of a mobile device, the apparatus including: a first acquisition module configured to acquire data of the historical point cloud collected in a first historical period while the mobile device travels; a plane fitting module configured to perform plane fitting using the data of the historical point cloud to obtain a target fitting plane in a target coordinate system; a second acquisition module configured to acquire data of the current point cloud collected by the mobile device at the current moment; a conversion module configured to convert the data of the current point cloud into data in the target coordinate system if it is not already expressed in that coordinate system; a calculation module configured to calculate the first distance between each point in the current point cloud and the target fitting plane in the target coordinate system, obtaining a plurality of first distances, wherein each point corresponds to one first distance; and a determining module configured to determine the points whose first distances are greater than a first distance threshold as obstacle points at the current moment.
According to another aspect of the embodiments of the present application, there is provided a detection device for obstacle points, including a processor, a memory, a communication interface, and a communication bus, wherein the processor, the memory, and the communication interface communicate with one another through the communication bus; the memory is configured to store executable instructions that cause the processor to perform the operations of the obstacle point detection method described above.
According to still another aspect of the embodiments of the present application, there is provided a computer-readable storage medium having stored therein executable instructions that, when executed, perform operations of the obstacle point detection method as described above.
In the embodiments of the present application, the data of the historical point cloud and the data of the current point cloud are coordinate data, obtained through a depth sensor, of detection points in the detection blind zone of the main sensor. Because the target fitting plane is fitted from data of a plurality of points in the travel path of the mobile device collected during a historical period close to the current moment, the target fitting plane approximates the ground in the current travel path of the mobile device. Points that stand clear of this plane can therefore be accurately identified from the distance between the points collected at the current moment and the target fitting plane, and such points are obstacle points. For example, points of obstacles standing on the ground and points of ditches lying below the ground are both at a distance from the ground plane.
The foregoing description is merely an overview of the technical solutions of the embodiments of the present application. In order that the technical means of the embodiments may be understood more clearly and implemented according to the content of this specification, and to make the above and other objects, features, and advantages of the embodiments more comprehensible, the detailed description of the application is presented below.
Drawings
The drawings are only for the purpose of illustrating embodiments and are not to be construed as limiting the application. Like reference numerals designate like parts throughout the figures. In the drawings:
fig. 1 is a flow chart illustrating a method for detecting an obstacle point according to an embodiment of the present application;
fig. 2 is a schematic diagram of laser radar in a mobile device according to an embodiment of the present application emitting laser to the ground;
FIG. 3 shows a flow chart of substeps of step 120 of FIG. 1;
fig. 4 is a schematic structural diagram of a detection device for an obstacle point according to an embodiment of the present application;
fig. 5 shows a schematic structural diagram of an obstacle point detection device provided in an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the application may be embodied in various forms and should not be limited to the embodiments set forth herein.
In the embodiments of the present application, a mobile device refers to a device that autonomously plans a path and moves, such as a robot (e.g., a floor-sweeping robot) or an unmanned vehicle. Because the mobile device autonomously plans its moving path, it needs to detect in real time whether an obstacle exists on that path; if one exists, the path must be re-planned to avoid the obstacle and prevent accidents.
The inventors found in their research that, to detect a low obstacle located in the detection blind zone of the main sensor of a mobile device, a single-line lidar may be installed obliquely on the device (for example, at a mounting height of 20 cm above the ground and an angle of 15°-20°), and the low obstacle may then be detected with a distance-based method or a template-matching method. However, when the pitch angle of the mobile device's posture jitters, neither method can reliably distinguish an obstacle from the ground in the travel path, leading to missed and false detections of obstacles.
Therefore, in order to detect low obstacles effectively, the present application proposes a method for detecting obstacle points, an obstacle point being a point belonging to an obstacle. The method first acquires historical point cloud data collected over a past period while the mobile device travels, the historical point cloud data being coordinate data of a plurality of detection points in the travel path obtained through a depth sensor; it then performs plane fitting on the historical point cloud data to obtain a fitting plane; finally, points in the point cloud collected at the current moment that stand clear of the fitting plane are determined to be obstacle points. Because the mobile device mainly moves on the ground, the plane fitted from the historical point cloud data of the past period can represent the ground on which the device moves, so whether obstacle points exist in the point cloud collected at the current moment can be effectively detected against this fitting plane.
Fig. 1 shows a flowchart of a method for detecting obstacle points in the travel path of a mobile device. The method is performed by a terminal device, which may be the mobile device itself, a controller inside the mobile device, or another terminal device communicatively connected to the mobile device. The terminal device may include one or more processors, which may be central processing units (CPUs), application-specific integrated circuits (ASICs), or one or more integrated circuits configured to implement the embodiments of the present invention; this is not limited here. The processors included in the terminal device may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs together with one or more ASICs, without limitation. As shown in fig. 1, the method includes the following steps:
step 110: and acquiring data of a historical laser point cloud acquired in a first historical period in the travelling process of the mobile equipment.
It should be noted that, to obtain historical point cloud data of higher accuracy, a depth sensor may be mounted on the mobile device, and the coordinate data of detection points in the blind zone of the main sensor of the mobile device may be obtained through the depth sensor. The depth sensor may be a lidar or a depth camera; hereinafter, the depth sensor is described taking a lidar as an example. If a depth camera is used to obtain the coordinate data of detection points in the blind zone of the main sensor, the data collected in step 110 are camera point cloud data rather than laser point cloud data, the laser point clouds referred to below are camera point clouds, and the laser points are camera points.
In this step, the data of the historical laser point cloud collected in the first historical period refers to coordinate data of a plurality of laser points collected at a plurality of times over a past period, for example the coordinates of laser points collected during the 3 to 5 seconds preceding the current moment. Preferably, the first historical period is a historical period close to the current moment. The lidar emits laser light toward a detection point, and the coordinates of the detection point are determined from the signal reflected by the detection point and received by the lidar. Note that, in the embodiments of the present application, the data of the historical laser point cloud is preferably data acquired by a lidar in the mobile device.
To better explain how the data of the historical laser point cloud is acquired, fig. 2 shows a schematic diagram of the lidar in the mobile device emitting laser light toward the ground according to an embodiment of the present application, where (a) in fig. 2 is a side view and (b) is a top view. As shown in fig. 2 (a), the lidar 11 in the mobile device 10 emits laser light (shown by dotted lines) toward a plurality of detection points on the ground; after the laser light reaches a detection point, the coordinate data of that point can be determined from the signal reflected back, thereby obtaining its position information. If an obstacle 20 is present on the ground, laser light emitted toward the ground where the obstacle 20 stands reaches the obstacle 20 instead of the ground, and coordinate data of points on the surface of the obstacle 20 is obtained.
It can be understood that, since the lidar emits laser light toward many detection points on the ground within a certain range to obtain the historical laser point cloud, if a low obstacle exists on the ground, most of the laser points in the historical laser point cloud belong to the ground and only a relatively small proportion belong to the obstacle.
Step 120: Perform plane fitting using the data of the historical laser point cloud to obtain a target fitting plane in a target coordinate system.
The mobile device coordinate system is a coordinate system created with a point in the mobile device as its origin and rotating with the device, where the origin is a point that does not belong to the lidar. The lidar coordinate system is a coordinate system created with a point in the lidar as its origin, and is otherwise similar to the mobile device coordinate system. In the embodiments of the present application, the target coordinate system may be set as required and may be the world coordinate system or the mobile device coordinate system.
The data of the laser point cloud acquired by the lidar is generally data in the lidar coordinate system; after the data of the historical laser point cloud acquired in step 110 is converted into the mobile device coordinate system or the world coordinate system, plane fitting can be performed on the converted data to obtain the target fitting plane in the target coordinate system.
For example, if the target coordinate system is the world coordinate system and the data of the historical laser point cloud is data in the lidar coordinate system, the formula P_i^w = T_i · T_ext · P_i^l can be used to convert the data of the historical laser point cloud from the lidar coordinate system into the world coordinate system, after which plane fitting is performed using the world-frame data to obtain the target fitting plane in the world coordinate system. Here, P_i^w (i = 1, …, n) is the data of the historical laser point cloud in the world coordinate system, P_i^l is the data of each laser point of the historical laser point cloud in the lidar coordinate system, T_i is the pose of the mobile device corresponding to each laser point in the historical laser point cloud, and T_ext is the extrinsic parameter converting the lidar coordinate system into the mobile device coordinate system, comprising a rotation matrix and a translation vector.
The plane is expressed as Ax + By + Cz + D = 0, where A, B, C, and D are parameters to be determined and x, y, and z are variables. The target fitting plane can be obtained by substituting the coordinate data of at least 4 laser points of the historical laser point cloud acquired in step 110 into the above equation.
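One way to solve for A, B, C, D from four or more substituted points is to stack the equations [x y z 1] · [A B C D]ᵀ = 0 and take the null-space direction, i.e. the smallest right singular vector. This is a standard, assumed solution method; the patent does not specify the solver.

```python
import numpy as np

def plane_from_points(points):
    """Fit A*x + B*y + C*z + D = 0 to an (n, 3) array of points, n >= 4,
    via the smallest right singular vector of the homogeneous system."""
    M = np.hstack([points, np.ones((len(points), 1))])
    _, _, vt = np.linalg.svd(M)
    A, B, C, D = vt[-1]
    n = np.linalg.norm([A, B, C])  # scale so the normal (A, B, C) is unit length
    return A / n, B / n, C / n, D / n
```

Normalizing the normal to unit length makes the later point-to-plane distances a plain absolute value of the plane equation.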
It will be appreciated that if the historical laser point cloud contains obstacle points, the accuracy of the resulting target fitting plane decreases as the proportion of obstacle points among the points used for fitting grows. Since most laser points in the historical laser point cloud lie on the ground, within a certain range, the more laser points used for plane fitting, the smaller the proportion of obstacle points among them and the higher the accuracy of the resulting target fitting plane. To improve this accuracy, all the data of the historical laser point cloud in step 110 may be used for the plane fitting.
Step 130: Acquire data of the current laser point cloud collected by the mobile device at the current moment.
Unless otherwise specified, "the current moment" refers to the same moment throughout this document.
Step 140: Judge whether the data of the current laser point cloud is data in the target coordinate system. If yes, go to step 160; if not, go to step 150.
If the target fitting plane and the current laser point cloud are not expressed in the same coordinate system, the target fitting plane cannot be used to judge whether the current laser point cloud contains laser points standing clear of it.
Step 150: Convert the data of the current laser point cloud into data in the target coordinate system.
If the target coordinate system is the mobile device coordinate system and the current laser point cloud data is data in the lidar coordinate system, the current laser point cloud data is converted into data in the mobile device coordinate system. The cases in which the target coordinate system and/or the laser point cloud data are in other coordinate systems are similar and are not described again. For example, if the target coordinate system is the world coordinate system and the current laser point cloud data is data in the lidar coordinate system, the formula P_c^w = T_c · T_ext · P_c^l can be used to convert the data of the current laser point cloud from the lidar coordinate system into data in the world coordinate system, wherein P_c^w is the data of the current laser point cloud in the world coordinate system, T_c is the pose of the mobile device in the world coordinate system, comprising a rotation matrix and a translation vector, and P_c^l is the data of the current laser point cloud in the lidar coordinate system.
Step 160: Calculate the first distance between each laser point in the current laser point cloud and the target fitting plane in the target coordinate system, obtaining a plurality of first distances, wherein each laser point corresponds to one first distance.
If the current laser point cloud includes m laser points, the first distances between the m laser points and the fitting plane are calculated respectively, obtaining m first distances, wherein each of the m laser points corresponds to one of the m first distances, m is a positive integer, and m ≥ 2.
Step 170: Determine the laser points whose first distances are greater than the first distance threshold as obstacle points at the current moment.
The first distance threshold may be set as required; for example, it may range from 5 cm to 10 cm and may be set to 5 cm, 8 cm, 10 cm, and so on. Specifically, if the mobile device is a robot whose chassis height is 10 cm, then some obstacles are far shorter than the chassis and do not affect the normal movement of the robot, that is, the robot does not need to avoid them, so the first distance threshold may range from 5 cm to 8 cm. If the first distance corresponding to some of the m laser points is greater than the first distance threshold, a certain distance exists between the objects corresponding to those laser points and the ground; those objects stand clear of the ground, and the points are obstacle points. For example, points of obstacles standing on the ground and points of ditches lying below the ground are both at a distance from the ground plane.
In the embodiments of the present application, because the target fitting plane is fitted from the data of a plurality of laser points in the travel path of the mobile device collected during a historical period close to the current moment, the target fitting plane approximates the ground in the current travel path of the mobile device. Points standing clear of this plane can therefore be accurately identified from the distance between each laser point collected at the current moment and the target fitting plane; such points are obstacle points.
In some embodiments, to avoid data redundancy, which increases the amount of calculation and reduces the efficiency of obstacle point detection, it is preferable to collect only the laser point cloud ahead of the mobile device in its moving direction; that is, the lidar emits laser light toward the ground within a certain angular range in front of the device to obtain the historical and current laser point clouds in that range. For example, the lidar emits laser light forward over a maximum range of 1 to 3 times the width of the mobile device. Since the mobile device moves forward along its planned path, obstacle points behind it do not affect its forward movement, so the device does not need to detect obstacle points behind it for avoidance.
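The forward-region restriction above can be sketched as a simple crop in the body frame (x forward, y lateral). The width factor of 2 is an assumed value inside the stated 1x-3x device-width range, and the 3 m look-ahead is likewise an assumption; the patent restricts by angle and range rather than prescribing this exact rectangle.

```python
import numpy as np

def crop_forward_region(points, device_width, ahead=3.0, width_factor=2.0):
    """Keep only points inside an assumed rectangle ahead of the device,
    discarding points behind it or too far to the side."""
    half_w = width_factor * device_width / 2.0
    mask = (points[:, 0] > 0.0) & (points[:, 0] < ahead) & (np.abs(points[:, 1]) < half_w)
    return points[mask]
```

Applying this crop before plane fitting and distance thresholding keeps the per-frame workload proportional to the region the device can actually drive into.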
Fig. 3 shows a flow chart of substeps of step 120 of fig. 1. As shown in fig. 3, step 120 includes:
step 121: and performing plane fitting by utilizing the data of a plurality of laser points in the historical laser point cloud to obtain a fitting plane under the target coordinate system.
The principle and implementation of this step are similar to that of step 120, and reference is made to step 120, which is not repeated here.
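The text does not name a specific fitting algorithm for step 121; one standard choice is a total-least-squares fit via SVD, sketched below. The function name and the SVD approach are assumptions, not taken from the patent.

```python
import numpy as np

def fit_plane(points):
    """Fit a plane a*x + b*y + c*z + d = 0 with unit normal to a set of
    laser points: the plane passes through the centroid, and its normal
    is the direction of least variance (smallest singular value)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]           # right singular vector of the smallest value
    return (*normal, -normal @ centroid)

# Four coplanar points on z = 0 recover a vertical normal:
a, b, c, d = fit_plane([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]])
```

A total-least-squares fit minimizes perpendicular point-to-plane distances, which matches how the second distances are measured in the following steps.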
Step 122: and calculating a second distance between each of the plurality of laser points used for fitting the fitting plane and the fitting plane to obtain a plurality of second distances, wherein each of the plurality of laser points corresponds to one second distance.
If the historical laser point cloud includes M laser points and the data of m of those laser points were used for the plane fitting in step 121, then in this step the second distance between each of the m laser points and the fitting plane obtained in step 121 is calculated, yielding m second distances, where each of the m laser points corresponds to one second distance, m is a positive integer, and m ≤ M.
Step 123: and determining laser points corresponding to the second distances smaller than the second distance threshold value in the second distances as inner points.
The second distance threshold may be set as required, for example, the range of the second distance threshold may be 3-6 cm, and may be set to 3cm, 5cm, or 6 cm.
Step 124: and calculating the ratio of the number of the internal points to the number of the plurality of laser points used for fitting the fitting plane to obtain the internal point rate.
Wherein, if there are k interior points among the m laser points used for the plane fitting, the interior point rate is k/m, where k is a positive integer and k ≤ m.
Step 125: and judging whether the interior point rate is smaller than an interior point rate threshold value. If yes, go to step 126; if not, go to step 128.
The interior point rate threshold may be set as needed; for example, it may range from 80% to 90% and be set to 80%, 85%, 90%, or the like.
If the interior point rate is small, many of the m laser points used for the plane fitting stand free of the obtained fitting plane, i.e., those laser points may be obstacle points. The accuracy of the obtained fitting plane is therefore low, and it cannot accurately represent the ground in the travelling path of the mobile device.
Step 126: and judging whether the number of times of repeatedly executing the steps reaches a number threshold. If yes, go to step 127; if not, go to step 121.
If the interior point rate is smaller than the threshold, the plane fitting needs to be performed iteratively; that is, steps 121 to 125 are repeated so that the interior point rate of the obtained fitting plane gradually increases, until the interior point rate is greater than or equal to the interior point rate threshold or the number of repetitions of the plane fitting reaches the number-of-times threshold, thereby obtaining a fitting plane with higher accuracy. The number-of-times threshold may be set as desired, for example, to 3, 5, or 7.
It should be noted that, if the interior point rate is smaller than the interior point rate threshold and the number of repetitions of step 121 has not reached the number-of-times threshold, the set of laser points used for each repetition of the plane fitting is not identical to that used in any previous fitting. For example, if the historical laser point cloud includes 100 laser points with serial numbers 1-100, and the first execution of step 121 fits a plane using the laser points with serial numbers 1-80, then the second execution of step 121 does not use exactly the points 1-80; it may use, for example, the points with serial numbers 5-90 or 10-100. More generally, the laser points used in the j-th execution of step 121 are not exactly the same as those used in any of the previous j-1 executions, where j is a positive integer and j ≥ 2.
Step 127: and determining a fitting plane corresponding to the maximum interior point rate as a target fitting plane.
It will be appreciated that each time step 121 is performed, the interior point rate of the resulting fitting plane is calculated accordingly; that is, if step 121 is performed j times, j fitting planes and j interior point rates are obtained, where each fitting plane corresponds to one interior point rate.
If the number of repetitions of step 121 reaches the number-of-times threshold and the interior point rate corresponding to every fitting plane is smaller than the interior point rate threshold, the fitting plane corresponding to the largest interior point rate is determined as the target fitting plane, which improves the accuracy of the obtained fitting plane.
Step 128: the fitting plane is determined as the target fitting plane.
And if the internal point rate is greater than or equal to the internal point rate threshold value, determining a fitting plane corresponding to the internal point rate as a target fitting plane.
Since the historical laser point cloud may include obstacle points, if the data used for plane fitting contain many obstacle points, the accuracy of the obtained fitting plane is low; that is, the fitting plane cannot accurately represent the ground in the travelling path of the mobile device, and using it to determine obstacle points at the current moment would yield low accuracy. Therefore, in the embodiment of the application, after one plane fitting is performed with data of a plurality of laser points in the historical laser point cloud, the interior point rate of the resulting fitting plane is calculated to assess its accuracy; if the interior point rate does not meet the requirement, plane fitting is performed iteratively to gradually raise the interior point rate, thereby obtaining a target fitting plane of higher accuracy.
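Taken together, steps 121-128 resemble a RANSAC-style loop. The sketch below is a hypothetical illustration: the subset-selection strategy, thresholds, and names are assumptions; the text only requires that successive subsets of laser points differ.

```python
import numpy as np

def fit_ground_plane(points, inlier_dist=0.05, inlier_rate_threshold=0.85,
                     max_repeats=5, seed=0):
    """Steps 121-128: refit on varying subsets until the inlier rate meets
    the threshold or the repeat budget is spent; otherwise keep the plane
    with the highest inlier rate (step 127)."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best_plane, best_rate = None, -1.0
    for _ in range(max_repeats):
        # Step 121: fit on a subset that differs between repetitions.
        idx = rng.choice(len(pts), size=max(3, int(0.8 * len(pts))),
                         replace=False)
        sub = pts[idx]
        centroid = sub.mean(axis=0)
        normal = np.linalg.svd(sub - centroid)[2][-1]   # unit normal
        # Steps 122-124: second distances and the interior point rate.
        rate = np.mean(np.abs((sub - centroid) @ normal) < inlier_dist)
        if rate > best_rate:
            best_plane, best_rate = (*normal, -normal @ centroid), rate
        if rate >= inlier_rate_threshold:   # step 128: accept this plane
            break
    return best_plane, best_rate            # step 127: best plane seen

# A flat 3x4 grid of ground points is accepted on the first pass:
grid = [[x, y, 0.0] for x in range(3) for y in range(4)]
plane, rate = fit_ground_plane(grid)
```

The early exit corresponds to step 128, and the fallback return corresponds to step 127 when the repeat budget is exhausted.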
In order to improve accuracy of the obtained fitting plane, the embodiment of the application provides a way to determine the target fitting plane. Based on the embodiment provided in fig. 3, in the embodiment of the present application, when step 121 is executed for the nth time, the data of the plurality of laser points for plane fitting is the data of a plurality of inner points in the plurality of laser points for N-1 th time plane fitting, where N is a positive integer, and N is greater than or equal to 2.
In this embodiment of the present application, if the internal point rate of the fitting plane obtained by executing step 121 for the first time is smaller than the internal point rate threshold and the number of times threshold is greater than 1, then when executing step 121 for the second time, the data of the plurality of laser points for plane fitting is the data of all internal points in the plurality of laser points for plane fitting for the first time, and so on, until the obtained internal point rate of the fitting plane is greater than or equal to the internal point rate threshold or the number of times of repeatedly executing step 121 reaches the number of times threshold, thereby obtaining the target fitting plane with higher accuracy.
For example, if the data used for the first plane fitting include data of 100 laser points and 80 of those 100 laser points are interior points, the interior point rate is 80%; then, when step 121 is performed for the second time, the data of those 80 laser points are used for plane fitting, and if 75 of the 80 laser points are interior points, the interior point rate is 93.75%; and so on, so that the interior point rate of the obtained fitting plane gradually rises and the accuracy of the fitting plane improves accordingly.
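This inlier-refinement variant can be sketched as follows. Names and thresholds are hypothetical, and the 14-point example mirrors the shrinking-cloud idea rather than the exact 100-point figures above.

```python
import numpy as np

def refine_with_inliers(points, inlier_dist=0.05, max_repeats=5):
    """Step 121 for N >= 2 reuses only the inliers of fit N-1, so the
    interior point rate tends to rise, as in the 100 -> 80 -> 75/80
    example in the text."""
    pts = np.asarray(points, dtype=float)
    rates = []
    for _ in range(max_repeats):
        centroid = pts.mean(axis=0)
        normal = np.linalg.svd(pts - centroid)[2][-1]   # unit normal
        inliers = np.abs((pts - centroid) @ normal) < inlier_dist
        rates.append(float(inliers.mean()))
        if inliers.all():                 # nothing left to discard
            break
        pts = pts[inliers]                # next fit uses inliers only
    return normal, rates

# 12 ground points plus 2 obstacle points at 30 cm:
cloud = [[x, y, 0.0] for x in range(3) for y in range(4)]
cloud += [[1.0, 1.0, 0.3], [1.0, 2.0, 0.3]]
normal, rates = refine_with_inliers(cloud)
# rates climb toward 1.0 as the obstacle points are discarded
```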
After detecting the obstacle point, in order to make the mobile device make response actions such as avoiding in time, in the embodiment provided in fig. 1, the target coordinate system is the coordinate system of the mobile device.
After the mobile device recognizes that an obstacle point exists in the travelling path, it must further determine the distance and direction of the obstacle relative to itself in order to replan its path and avoid the obstacle. For example, suppose the laser radar is mounted at the leftmost side of the mobile device and the device is currently turning and moving to the left. If the coordinate data of the obstacle point are expressed in the laser radar coordinate system, an obstacle point directly in front of the laser radar is actually to the left of the mobile device, and the device would collide with it while turning left along the originally planned path, so it needs to replan its path. However, if the mobile device plans its path directly with the obstacle point's coordinates in the laser radar coordinate system, it may wrongly treat the obstacle point as being directly in front of the device itself, lowering the accuracy of the planned path and failing to avoid the obstacle.
Therefore, in the embodiment of the present application, by setting the target coordinate system as the mobile device coordinate system, the determined coordinate data of the obstacle point is also data in the mobile device coordinate system, so that the mobile device performs path planning based on the determined coordinate data of the obstacle point, thereby improving the accuracy of the obtained path, and effectively realizing obstacle avoidance.
Further, if the determined coordinate data of an obstacle point are not in the mobile device coordinate system, the mobile device must first convert them into that system before planning a path, which lowers efficiency; for a small obstacle, this delay can easily cause the device to fail to avoid it in time, resulting in an accident. Therefore, in the embodiment of the application, the target coordinate system is set to the mobile device coordinate system, so that the mobile device can avoid obstacles promptly and effectively once an obstacle point is detected.
In order to obtain, with higher accuracy, the target fitting plane in the target coordinate system and the data of the current laser point cloud in the target coordinate system, in the embodiment of the present application, based on the foregoing embodiment, the data of the historical laser point cloud is {P_1, P_2, …, P_M}, where each P_i is data in the laser radar coordinate system, and the pose of the mobile device corresponding to each laser point in the historical laser point cloud is {T_1, T_2, …, T_M}. Step 120 comprises:
step a1: using the formula P'_i = T_c⁻¹ · T_i · T_e · P_i, converting {P_1, …, P_M} into data of the historical laser point cloud in the target coordinate system, where P'_i is the data of the historical laser point cloud in the target coordinate system, T_c is the pose of the mobile device at the current moment, and T_e is the external parameter for converting the laser radar coordinate system into the target coordinate system.
Here P_1, P_2, …, P_M respectively represent the coordinate data of each laser point in the historical laser point cloud, and T_1, T_2, …, T_M respectively represent the pose of the mobile device corresponding to each laser point. Specifically, the pose of the mobile device corresponding to laser point P_i is T_i, meaning that when the laser radar detected laser point P_i, the pose of the mobile device was T_i; the other laser points are similar and are not described in detail here. It can be understood that if the laser radar is a single-line laser radar, the coordinate data of a laser point are two-dimensional; if the laser radar is a multi-line laser radar, the coordinate data are three-dimensional. T_c⁻¹ denotes the inverse of the matrix T_c.
step a2: performing plane fitting using the data {P'_1, …, P'_M} of the historical laser point cloud in the target coordinate system to obtain the target fitting plane under the target coordinate system.
The step is similar to step 120, and will not be described here again.
Step 150 includes: if the data of the current laser point cloud are data in the laser radar coordinate system, converting them into data in the target coordinate system using the formula P'_c = T_e · P_c, where P'_c is the data of the current laser point cloud in the target coordinate system and P_c is the data of the current laser point cloud in the laser radar coordinate system obtained by the laser radar.
In the embodiment of the application, since the data of the laser point cloud obtained by the laser radar is the data under the laser radar coordinate system, the data of the laser point cloud under the laser radar coordinate system can be accurately converted into the data of the laser point cloud under the mobile device coordinate system by utilizing the external parameters converted to the mobile device coordinate system by the laser radar coordinate system and the gesture of the mobile device, so that the accuracy of the obtained target fitting plane under the mobile device coordinate system is improved.
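With 4×4 homogeneous transforms, the two conversions above (historical points via the per-point pose, the current scan via the extrinsic alone) can be sketched as follows. The function names and the world-from-device pose convention are assumptions, not from the patent.

```python
import numpy as np

def history_to_target(points_lidar, poses, pose_cur, T_ext):
    """Map historical lidar points into the device frame at the current
    moment: inv(pose_cur) @ pose_i @ T_ext takes a point from the lidar
    frame to the device frame at capture time, then to the world frame,
    then back into the current device frame.

    points_lidar : (N, 3) points in the lidar coordinate system
    poses        : N 4x4 device poses, one per point (world <- device)
    pose_cur     : 4x4 device pose at the current moment
    T_ext        : 4x4 extrinsic (device <- lidar)
    """
    inv_cur = np.linalg.inv(pose_cur)
    out = []
    for p, pose_i in zip(points_lidar, poses):
        ph = np.append(p, 1.0)                      # homogeneous point
        out.append((inv_cur @ pose_i @ T_ext @ ph)[:3])
    return np.array(out)

def current_to_target(points_lidar, T_ext):
    """The current scan only needs the extrinsic: the target frame is the
    device frame at the current moment."""
    ph = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    return (ph @ T_ext.T)[:, :3]

# Device has advanced 1 m in x since the historical point was captured:
pose_i, pose_cur = np.eye(4), np.eye(4)
pose_cur[0, 3] = 1.0
hist = history_to_target(np.array([[2.0, 0.0, 0.0]]),
                         [pose_i], pose_cur, np.eye(4))
```

With an identity extrinsic, a point seen 2 m ahead while the device was at the origin lands 1 m ahead in the current device frame.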
In order to avoid misjudging the obstacle point, in the embodiment of the present application, the method for detecting the obstacle point further includes the following steps based on the embodiment provided in fig. 1:
Step b1: and acquiring an obstacle point cloud in a second historical period in the travelling process of the mobile equipment, wherein the obstacle point cloud is a set of obstacle points at each moment in the second historical period.
The second history period refers to a past period of time that is closer to the current time, for example, 3 to 5 seconds in the past. The obstacle point in the second history period may be detected by using the obstacle point detection method provided in fig. 1, and an obstacle point cloud may be obtained. The second history period may be the same, different, or partially the same as the first history period.
Step b2: each obstacle point in the obstacle point cloud in the second history period is determined as an obstacle point at the current time.
After detecting an obstacle point, the robot needs a certain reaction time to perform an obstacle avoidance action. If only the obstacle points in the laser point cloud collected at the current moment were treated as the obstacle points of the current moment, then for a small obstacle the mobile device, which keeps moving, might no longer detect at the next moment an obstacle point it detected at the current moment; the obstacle would be mistakenly cleared, no avoidance action would be performed, and an accident could result.
For example, for an autonomously moving vehicle, if a vehicle detects an obstacle at the current moment, a parking measure is immediately taken to avoid the obstacle, but the vehicle still slides a certain distance forward after taking the parking measure due to inertia action of the vehicle. If the obstacle is small, after the vehicle slides forwards for a certain distance, the obstacle is not in the detection range of the vehicle, the vehicle cannot detect the obstacle at this time, if the obstacle is mistakenly cleared, the vehicle continues to run forwards, and accordingly accidents occur.
Therefore, in the embodiment of the application, in addition to the obstacle points detected at the current moment, the obstacle points detected in the past period are also treated as obstacle points of the current moment, so that the mobile device takes obstacle avoidance action and replans its moving route, thereby avoiding the obstacle and preventing such accidents.
Based on the foregoing embodiment, in an embodiment of the present application, the data of the obstacle point cloud in the second history period is {Q_1, Q_2, …, Q_K} (comprising K obstacle points), where each Q_i is data in the target coordinate system, and the pose of the mobile device corresponding to each obstacle point in the obstacle point cloud in the second history period is {T_1, T_2, …, T_K}. Step b2 comprises:
step b21: using the formula Q'_i = T_c⁻¹ · T_i · Q_i, converting the data of the obstacle point cloud in the second history period into point cloud data of the current moment, where Q'_i is the data of the obstacle point cloud at the current moment in the target coordinate system and T_c is the pose of the mobile device at the current moment.
Since the mobile device is constantly moving, and the data of obstacle points detected in a past period were determined based on past poses of the mobile device, the data of the obstacle points in the second history period must be converted into point cloud data of the current moment. Here Q_1, Q_2, …, Q_K respectively represent the coordinate data of each obstacle point in the obstacle point cloud in the second history period, and T_1, T_2, …, T_K respectively represent the pose of the mobile device corresponding to each obstacle point. Specifically, the pose of the mobile device corresponding to obstacle point Q_i is T_i, meaning that when the laser radar detected obstacle point Q_i, the pose of the mobile device was T_i; the other obstacle points are similar and are not described in detail here.
Step b22: determining the points in {Q'_1, …, Q'_K} as obstacle points at the current moment.
In this step, the K obstacle points comprised in {Q'_1, …, Q'_K} are determined as obstacle points at the current moment and are added to the set of obstacle points determined from the laser point cloud data at the current moment.
In the embodiment of the application, the coordinate data of each obstacle point in the second history period, the corresponding gesture of the mobile device and the gesture of the mobile device at the current moment are utilized to convert the data of the obstacle point in the second history period into the coordinate data of the obstacle point at the current moment, so that the accuracy of the obtained data of the obstacle point is improved.
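Under the same homogeneous-transform convention, carrying past obstacle points forward into the current device frame can be sketched as follows (names and the world-from-device pose convention are assumptions; the obstacle points are stored in the device frame of their detection time):

```python
import numpy as np

def carry_forward_obstacles(obstacle_pts, poses, pose_cur):
    """Re-express obstacle points detected over the second history period
    in the device frame at the current moment via inv(pose_cur) @ pose_i.

    obstacle_pts : (K, 3) points, each in the device frame at detection time
    poses        : K 4x4 device poses at the respective detection times
    pose_cur     : 4x4 device pose at the current moment
    """
    inv_cur = np.linalg.inv(pose_cur)
    out = []
    for q, pose_i in zip(obstacle_pts, poses):
        qh = np.append(np.asarray(q, dtype=float), 1.0)  # homogeneous point
        out.append((inv_cur @ pose_i @ qh)[:3])
    return np.array(out)

# An obstacle seen 1 m ahead while the device was at the origin appears
# 0.4 m ahead after the device has advanced 0.6 m:
pose_then, pose_now = np.eye(4), np.eye(4)
pose_now[0, 3] = 0.6
moved = carry_forward_obstacles([[1.0, 0.0, 0.0]], [pose_then], pose_now)
```

The re-expressed points can then be appended to the obstacle set built from the current scan, as described above.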
Fig. 4 shows a schematic structural diagram of an obstacle point detection device according to an embodiment of the present application. As shown in fig. 4, the apparatus 200 includes: a first acquisition module 201, a plane fitting module 202, a second acquisition module 203, a conversion module 204, a calculation module 205, and a determination module 206.
The first acquiring module 201 is configured to acquire data of a historical laser point cloud acquired in a first historical period during a traveling process of the mobile device. The plane fitting module 202 is configured to perform plane fitting by using data of the historical laser point cloud, and obtain a target fitting plane in a target coordinate system. The second obtaining module 203 is configured to obtain data of a current laser point cloud collected at a current time of the mobile device. The conversion module 204 is configured to convert the data of the current laser point cloud into data in the target coordinate system if the data of the current laser point cloud is not the data in the target coordinate system. The calculation module 205 is configured to calculate a first distance between each laser point in the current laser point cloud and the target fitting plane in the target coordinate system, so as to obtain a plurality of first distances, where each laser point corresponds to one first distance. The determining module 206 is configured to determine, as an obstacle point at the current moment, a laser point corresponding to a first distance greater than a first distance threshold value from among the plurality of first distances.
The detection device for the obstacle point provided in this embodiment is configured to execute the technical scheme of the detection method for the obstacle point in the foregoing method embodiment, and its implementation principle and technical effect are similar, and are not described herein again.
It should be noted that, the detection device for the obstacle point provided in this embodiment further includes other modules for executing the steps of the foregoing method embodiment for detecting the obstacle point, which are not described herein in detail.
Fig. 5 shows a schematic structural diagram of an obstacle point detection device provided in an embodiment of the present application, and the specific embodiment of the present application does not limit a specific implementation of the obstacle point detection device.
As shown in fig. 5, the obstacle point detection apparatus may include: a processor (processor) 302, a communication interface (Communications Interface) 304, a memory (memory) 306, and a communication bus 308.
Wherein: processor 302, communication interface 304, and memory 306 perform communication with each other via communication bus 308. A communication interface 304 for communicating with network elements of other devices, such as clients or other servers. The processor 302 is configured to execute the program 310, and may specifically perform the relevant steps in the above-described embodiment of the method for detecting an obstacle point.
In particular, program 310 may include program code comprising computer-executable instructions.
The processor 302 may be a central processing unit CPU, or a specific integrated circuit ASIC (Application Specific Integrated Circuit), or one or more integrated circuits configured to implement embodiments of the present application. The one or more processors included in the obstacle point detection device may be the same type of processor, such as one or more CPUs; but may also be different types of processors such as one or more CPUs and one or more ASICs.
Memory 306 for storing program 310. Memory 306 may comprise high-speed RAM memory or may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions that, when executed on a detection apparatus for an obstacle point, cause the detection apparatus for an obstacle point to execute the detection method for an obstacle point in any of the method embodiments described above.
The present embodiment provides a computer program that can be invoked by a processor to cause a detection device for an obstacle point to execute the method for detecting an obstacle point in any of the method embodiments described above.
The present application provides a computer program product comprising a computer program stored on a computer readable storage medium, the computer program comprising program instructions which, when run on a computer, cause the computer to perform the method for detecting an obstacle point in any of the method embodiments described above.
Claims (10)
1. A method of detecting an obstacle point, the method for detecting an obstacle point in a path traveled by a mobile device, the method comprising:
acquiring data of a history point cloud acquired in a first history period in the travelling process of the mobile equipment;
performing plane fitting by utilizing the data of the history point cloud to obtain a target fitting plane under a target coordinate system;
acquiring data of a current point cloud acquired at the current moment of the mobile equipment;
if the data of the current point cloud is not the data in the target coordinate system, converting the data of the current point cloud into the data in the target coordinate system;
calculating first distances between each point in the current point cloud and the target fitting plane under the target coordinate system to obtain a plurality of first distances, wherein each point corresponds to one first distance;
And determining a point corresponding to a first distance larger than a first distance threshold value in the first distances as an obstacle point at the current moment.
2. The method according to claim 1, wherein performing a plane fit using the data of the history point cloud to obtain a target fit plane in a target coordinate system comprises:
performing plane fitting by utilizing data of a plurality of points in the history point cloud to obtain a fitting plane under the target coordinate system;
calculating second distances between each of a plurality of points used for fitting the fitting plane and the fitting plane to obtain a plurality of second distances, wherein each of the plurality of points corresponds to one second distance;
determining points corresponding to second distances smaller than a second distance threshold value in the plurality of second distances as inner points;
calculating the ratio of the number of the internal points to the number of a plurality of points for fitting the fitting plane to obtain an internal point rate;
if the interior point rate is less than the interior point rate threshold, then: repeating the above steps until the interior point rate is greater than or equal to the interior point rate threshold, wherein the data of the plurality of points used in each repeated plane fitting are not completely the same; or, when the number of repetitions reaches a number-of-times threshold, determining the fitting plane corresponding to the maximum interior point rate as the target fitting plane;
And if the interior point rate is greater than or equal to the interior point rate threshold, determining the fitting plane as the target fitting plane.
3. The method according to claim 2, wherein, when the step of performing plane fitting by using data of a plurality of points in the history point cloud to obtain a fitting plane under the target coordinate system is performed for the N-th time, the data of the plurality of points used for the plane fitting are the data of the plurality of inner points determined in the N−1-th execution of the step, wherein N is a positive integer and N ≥ 2.
4. The method of claim 1, wherein the data of the historical point cloud and the data of the current point cloud are data acquired by a depth sensor in the mobile device, and the target coordinate system is a coordinate system constructed with a target point in the mobile device as an origin, wherein the target point is a point not belonging to the depth sensor.
5. The method of claim 4, wherein the data of the historical point cloud is {P_1, P_2, …, P_M}, wherein each P_i is data under the depth sensor coordinate system, and the pose of the mobile device corresponding to each point in the history point cloud is {T_1, T_2, …, T_M}, wherein,
performing plane fitting by using the data of the history point cloud to obtain a target fitting plane under a target coordinate system comprises:
using the formula P'_i = T_c⁻¹ · T_i · T_e · P_i, converting {P_1, …, P_M} into data of the history point cloud in the target coordinate system, wherein P'_i is the data of the history point cloud in the target coordinate system, T_c is the pose of the mobile device at the current moment, and T_e is an external parameter for converting the depth sensor coordinate system to the target coordinate system;
performing plane fitting by using the data {P'_1, …, P'_M} of the history point cloud in the target coordinate system to obtain the target fitting plane under the target coordinate system;
and wherein, if the data of the current point cloud is not the data in the target coordinate system, converting the data of the current point cloud into the data in the target coordinate system comprises:
if the data of the current point cloud is data under the depth sensor coordinate system, converting the data of the current point cloud in the depth sensor coordinate system into the data in the target coordinate system using the formula P'_c = T_e · P_c, wherein P'_c is the data of the current point cloud in the target coordinate system, and P_c is the data of the current point cloud in the depth sensor coordinate system obtained by the depth sensor.
6. The method according to claim 1, wherein the method further comprises:
acquiring an obstacle point cloud in a second history period in the travelling process of the mobile equipment, wherein the obstacle point cloud is a set of obstacle points at each moment in the second history period;
and determining each obstacle point in the obstacle point cloud in the second history period as the obstacle point of the current moment.
7. The method of claim 6, wherein the data of the obstacle point cloud over the second historical period is {Q_1, Q_2, …, Q_K}, wherein each Q_i is data in the target coordinate system, and the pose of the mobile device corresponding to each point in the obstacle point cloud in the second history period is {T_1, T_2, …, T_K}, wherein,
the determining each obstacle point in the obstacle point cloud in the second history period as the obstacle point at the current time comprises:
using the formula Q'_i = T_c⁻¹ · T_i · Q_i, converting the data of the obstacle point cloud in the second history period into the point cloud data of the current moment, wherein Q'_i is the data of the obstacle point cloud at the current moment in the target coordinate system, and T_c is the pose of the mobile device at the current moment;
and determining the obstacle points in {Q'_1, …, Q'_K} as the obstacle points at the current moment.
8. The method of claim 1, wherein the mobile device comprises a depth sensor, and wherein the data of the historical point cloud and the data of the current point cloud are coordinate data of detection points in a detection blind area of a main sensor of the mobile device obtained through the depth sensor.
9. A detection apparatus for an obstacle point, characterized by comprising: the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
the memory is configured to store executable instructions that cause the processor to perform the operations of the obstacle point detection method according to any one of claims 1 to 7.
10. A computer-readable storage medium, wherein executable instructions are stored in the storage medium, which executable instructions, when executed, perform the method of detecting an obstacle point according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410150050.1A CN117671648B (en) | 2024-02-02 | 2024-02-02 | Obstacle point detection method, obstacle point detection device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117671648A | 2024-03-08 |
CN117671648B | 2024-04-26 |
Family
ID=90069918
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410150050.1A Active CN117671648B (en) | 2024-02-02 | 2024-02-02 | Obstacle point detection method, obstacle point detection device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117671648B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110893617A (en) * | 2018-09-13 | 2020-03-20 | 深圳市优必选科技有限公司 | Obstacle detection method and device and storage device |
CN113468941A (en) * | 2021-03-11 | 2021-10-01 | 长沙智能驾驶研究院有限公司 | Obstacle detection method, device, equipment and computer storage medium |
CN114994635A (en) * | 2022-05-18 | 2022-09-02 | 上海涵润汽车电子有限公司 | Intelligent driving travelable area detection method and device |
WO2022226831A1 (en) * | 2021-04-28 | 2022-11-03 | 深圳元戎启行科技有限公司 | Method and apparatus for detecting category-undefined obstacle, and computer device |
CN117408935A (en) * | 2022-07-07 | 2024-01-16 | 合肥智行者科技有限公司 | Obstacle detection method, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN117671648B (en) | 2024-04-26 |
Similar Documents
Publication | Title |
---|---|
US11155258B2 | System and method for radar cross traffic tracking and maneuver risk estimation |
CN109521756B | Obstacle motion information generation method and apparatus for unmanned vehicle |
US20200049511A1 | Sensor fusion |
JP5206752B2 | Driving environment recognition device |
CN107632308B | Method for detecting contour of obstacle in front of vehicle based on recursive superposition algorithm |
Weon et al. | Object Recognition based interpolation with 3d lidar and vision for autonomous driving of an intelligent vehicle |
US10131446B1 | Addressing multiple time around (MTA) ambiguities, particularly for lidar systems, and particularly for autonomous aircraft |
CN113853533A | Yaw rate from radar data |
JP6959056B2 | Mobile robot control device and control method |
Kim et al. | Probabilistic threat assessment with environment description and rule-based multi-traffic prediction for integrated risk management system |
GB2560618A | Object tracking by unsupervised learning |
CN112171675B | Obstacle avoidance method and device for mobile robot, robot and storage medium |
JP2019152575A | Object tracking device, object tracking method, and computer program for object tracking |
US20220245835A1 | Geo-motion and appearance aware data association |
TWI680898B | Light reaching detection device and method for close obstacles |
JP2018206038A | Point group data processing device, mobile robot, mobile robot system, and point group data processing method |
CN114030483A | Vehicle control method, device, electronic apparatus, and medium |
CN117671648B | Obstacle point detection method, obstacle point detection device and storage medium |
CN117912295A | Vehicle data processing method and device, electronic equipment and storage medium |
Hoang et al. | Proposal of algorithms for navigation and obstacles avoidance of autonomous mobile robot |
CN116215518A | Unmanned mining card front road collision risk prediction and quantification method |
CN116224232A | Radar-based object height estimation |
CN114815809A | Obstacle avoidance method and system for mobile robot, terminal device and storage medium |
CN114839628A | Object detection with multiple distances and resolutions |
CN114730495A | Method for operating an environment detection device with grid-based evaluation and with fusion, and environment detection device |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |