Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
Taking a mowing robot as an example of the self-moving device, when the mowing robot moves on a lawn that includes a grass slope to perform a mowing operation, it may fail to accurately identify a low stone on the lawn during obstacle identification. The robot then collides with the stone, cannot continue the mowing operation, and its working efficiency is reduced. Similarly, taking an unmanned vehicle as an example of the self-moving device, when the vehicle travels on a road, it may fail to accurately identify a short obstacle on the road, collide with the obstacle, and be unable to travel normally.
Therefore, a method for identifying obstacles is needed, so that the self-moving device can accurately identify obstacles with a lower height, thereby improving the working efficiency of the self-moving device.
The method provided by the embodiments of the present application may be executed by a first device, or by a chip in the first device. The first device may be a non-self-moving device, such as a server or an electronic device (e.g., a mobile phone), or a self-moving device; for example, the self-moving device may be a robot (e.g., a mowing robot, a sweeping robot, a mine clearing robot, a cruising robot, etc.) or a smart car. When the first device is a device other than the self-moving device, the first device may communicate with the self-moving device; for example, an application (APP) corresponding to the self-moving device may be installed on the mobile phone, and the user may operate the APP to trigger the mobile phone and the self-moving device to establish a communication connection. When the first device is a server, the user may, in the APP of the mobile phone communicating with the self-moving device, trigger the self-moving device to report the depth image and/or the RGB image of the target area to the server, so that the server executes the method for identifying an obstacle provided in the embodiments of the present application. Alternatively, the self-moving device is provided with a first control, and the user triggers the first control to trigger the self-moving device to report the depth image and/or the RGB image of the target area acquired by the self-moving device to the server, so that the server executes the method for identifying an obstacle provided by the embodiments of the present application.
The method for identifying an obstacle according to an embodiment of the present application is described in detail below with reference to fig. 1.
S101, the first device obtains target point cloud data in a target area, wherein the target point cloud data does not include ground point cloud data.
In some embodiments, the first device may obtain a depth image of the current location of the self-moving device through a depth camera mounted on the self-moving device, and may determine an area other than the ground area in the depth image as the target area.
Optionally, the first device may also obtain a corresponding depth image through a depth camera, a laser scanner, or other instruments such as a laser radar, which are mounted on the self-moving device, where the depth image includes point cloud data.
S102, the first device performs fitting clustering processing on the target point cloud data to obtain obstacle point cloud data to be confirmed.
The obstacle point cloud data to be confirmed does not include point cloud data which can be directly determined as an obstacle in the target point cloud data and point cloud data which can be directly determined as a non-obstacle in the target point cloud data.
Specifically, the fitting clustering process includes a fitting process and a clustering process. The fitting process is a plane-fitting process used to determine whether the target region includes a slope. When a slope exists, the point cloud data corresponding to the slope is removed from the target point cloud data, and the remaining point cloud data is clustered to obtain one or more pieces of obstacle point cloud data to be confirmed; when no slope exists, the target point cloud data is clustered to obtain a plurality of pieces of obstacle point cloud data to be confirmed.
S103, the first device obtains first color information of a plane where the obstacle point cloud data to be confirmed is located.
In some embodiments, the first color information is the color information with the largest area ratio among the one or more pieces of color information of the plane where the obstacle point cloud data to be confirmed is located.
For example, as shown in fig. 2, when the plane on which the obstacle point cloud data 201 to be confirmed is located is a plane 202, the first color information is color information in which an area ratio of colors is largest among one or more kinds of color information of the plane 202.
S104, the first device obtains second color information of an area where the obstacle point cloud data to be confirmed is located.
In some embodiments, the second color information is color information with the largest area ratio among one or more color information of the area where the obstacle point cloud data to be confirmed is located.
As shown in fig. 3, when the region where the obstacle point cloud data to be confirmed is located is a region 301, the second color information is color information in which the area ratio of colors is the largest among the one or more color information of the region 301.
S105, when the second color information does not match the first color information, the first device determines that the obstacle point cloud data to be confirmed is obstacle point cloud data.
In some embodiments, when the color difference value between the first color information and the second color information is greater than a preset color difference threshold, the second color information may be considered as not matching the first color information.
It should be understood that, in the target area, the target point cloud data excluding the ground point cloud data is obtained, fitting clustering processing is performed on the target point cloud data to obtain the obstacle point cloud data to be confirmed, the first color information of the plane where the obstacle point cloud data to be confirmed is located is compared with the second color information of the area where the obstacle point cloud data to be confirmed is located, and the obstacle point cloud data to be confirmed is determined to be obstacle point cloud data when the second color information does not match the first color information. It can be seen that, when a short obstacle exists in the target area, the color with the largest area ratio in the area where the obstacle point cloud data to be confirmed is located is mainly the color of the obstacle, that is, the second color information, and the color with the largest area ratio in the plane where the obstacle point cloud data to be confirmed is located is mainly the color of the plane on which the obstacle stands, that is, the first color information. Since there is a certain difference between the color of the obstacle and the color of the plane on which the obstacle is located, the second color information does not match the first color information. The self-moving device can therefore accurately identify the obstacle point cloud data according to the first color information and the second color information, so that short obstacles are accurately identified and avoided, and the working efficiency of the self-moving device is effectively improved.
In one possible implementation manner of the present application, the above S101 may be implemented by S1 to S4 in fig. 4:
S1, the first device acquires original point cloud data, where the original point cloud data is point cloud data based on the coordinate system of the self-moving device.
In some embodiments, the original point cloud data is the point cloud data, in the coordinate system of the self-moving device, of all objects in the depth image acquired by the self-moving device.
For example, if the target area includes a lawn, a grass slope, a stone, and a tree, the original point cloud data includes ground point cloud data, point cloud data corresponding to the grass slope, point cloud data corresponding to the stone, and point cloud data corresponding to the tree.
In some embodiments, the self-moving device may also acquire the original point cloud data through a depth camera, a laser scanner, a lidar, or other instruments capable of acquiring point cloud data.
Illustratively, when a depth camera is used to obtain the point cloud data, the point cloud data of each image pixel point of the depth image in the camera coordinate system is calculated from the coordinate of the image pixel point in the depth image, the depth value of the image pixel point, and the internal parameters of the camera, specifically according to formulas (1) to (4):

x_c = (u - l_x) × d / f_x    formula (1)

y_c = (v - l_y) × d / f_y    formula (2)

z_c = d    formula (3)

p_c = (x_c, y_c, z_c)    formula (4)

wherein (u, v) represents the pixel coordinate of an image pixel point in the depth image under the image coordinate system; d is the depth value of the image pixel point; l_x, l_y, f_x and f_y are all internal parameters corresponding to the camera; and p_c represents the point cloud data coordinate of the image pixel point in the depth image under the camera coordinate system.
Further, each point cloud data in the camera coordinate system is converted according to formula (5) to obtain the original point cloud data in the coordinate system of the self-moving device:

p_r = R_rc × p_c + T_rc    formula (5)

wherein p_r represents the coordinate of the original point cloud data in the coordinate system of the self-moving device; R_rc represents the rotation parameter from the camera coordinate system to the coordinate system of the self-moving device; T_rc represents the translation parameter from the camera coordinate system to the coordinate system of the self-moving device; and the rotation parameter and the translation parameter may be obtained by actual measurement.

The coordinate of each image pixel point in the depth image under the coordinate system of the self-moving device can thus be obtained through formulas (1) to (5).
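For illustration, the following Python sketch shows one possible implementation of formulas (1) to (5), converting a depth image into original point cloud data in the coordinate system of the self-moving device. It assumes that l_x and l_y are the principal-point offsets and f_x and f_y the focal lengths of the camera, and that R_rc and T_rc are the measured rotation and translation parameters; the function name and the use of NumPy are illustrative, not part of the embodiments themselves.

```python
import numpy as np

def depth_to_robot_frame(depth, fx, fy, lx, ly, R_rc, T_rc):
    """Back-project a depth image into the self-moving device (robot) frame.

    A minimal sketch of formulas (1)-(5); fx, fy, lx, ly are the camera
    intrinsics, R_rc/T_rc the measured extrinsics from the camera frame to
    the robot frame.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))           # pixel coordinates (u, v)
    d = depth.astype(np.float64)                              # depth value per pixel

    x_c = (u - lx) * d / fx                                   # formula (1)
    y_c = (v - ly) * d / fy                                   # formula (2)
    z_c = d                                                   # formula (3)
    p_c = np.stack([x_c, y_c, z_c], axis=-1).reshape(-1, 3)   # formula (4)

    # formula (5): p_r = R_rc x p_c + T_rc
    p_r = p_c @ np.asarray(R_rc).T + np.asarray(T_rc)
    return p_r                                                # original point cloud, robot frame
```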
It should be understood that the coordinate system of the self-moving device takes the forward direction of the self-moving device as the positive direction of the X axis, the direction rotated 90 degrees counterclockwise from the forward direction as the positive direction of the Y axis, and the direction perpendicular to the plane of the self-moving device and pointing upward as the positive direction of the Z axis.
S2, the first device obtains a normal vector corresponding to each point in the original point cloud data and a first included angle between the normal vector and a preset coordinate axis of the coordinate system of the self-moving device.
It should be understood that the normal vector of a point is the normal vector of the plane fitted to the current point and at least two surrounding points. The preset coordinate axis may be the horizontal axis (X axis), the longitudinal axis (Y axis), or the vertical axis (Z axis) of the coordinate system of the self-moving device.
It should be understood that, in this embodiment, the first included angle between the normal vector and the preset coordinate axis of the coordinate system of the self-moving device is the angle between the normal vector and the positive direction of the Z axis of that coordinate system.
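As a sketch of S2, the normal vector of each point and its first included angle with the Z axis can be estimated, for example, by fitting a local plane to the nearest neighbours of each point; the PCA-based routine below is one possible choice under that assumption, not the specific algorithm of the embodiments.

```python
import numpy as np

def normals_and_z_angles(points, k=10):
    """For each point, fit a local plane to its k nearest neighbours and
    return the plane normal and its first included angle with the Z axis
    of the self-moving device coordinate system (in degrees).
    """
    normals = np.empty_like(points, dtype=float)
    for i, p in enumerate(points):
        # k nearest neighbours by brute force (sufficient for a sketch)
        d2 = np.sum((points - p) ** 2, axis=1)
        nbrs = points[np.argsort(d2)[:k]]
        # the normal is the eigenvector of the neighbourhood covariance
        # with the smallest eigenvalue (direction of least spread)
        cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
        eigvals, eigvecs = np.linalg.eigh(cov)
        normals[i] = eigvecs[:, 0]
    z = np.array([0.0, 0.0, 1.0])
    # sign of the normal is irrelevant, so take the absolute cosine
    cos = np.abs(normals @ z) / np.linalg.norm(normals, axis=1)
    angles = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))   # first included angle
    return normals, angles
```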
S3, when the first included angle of a point in the original point cloud data is smaller than a first included angle threshold and the height of the point is smaller than a preset height threshold, the first device takes the point cloud data to which the point belongs as ground point cloud data.
In some embodiments, the first angle threshold may be determined according to a user setting.
It should be understood that, since in practical cases, the angles between the normal vectors of all the point cloud data corresponding to the ground and the Z-axis of the coordinate system of the mobile device are not all equal to zero, the first angle threshold may be set to a smaller value.
In some embodiments, the preset height threshold may be determined based on the size of the moving mechanism of the self-moving device.
For example, taking a wheel as an example of the moving mechanism of the self-moving device, if the diameter of the wheel is 5 cm, the preset height threshold may be 5 cm.
In a possible implementation manner, the preset height threshold may be set to 5 cm and the preset first included angle threshold to 5 degrees; then, in the original point cloud data, a point whose Z-axis component in the coordinate system of the self-moving device is smaller than 5 cm and whose normal vector forms an angle of no more than 5 degrees with the Z axis is determined to belong to the ground point cloud data.
It should be understood that if the target area includes an obstacle parallel to the ground, for example a table, the table would be determined as ground point cloud data if only the first included angle of the points were checked; and if the target area includes an obstacle with a low height, for example a nail, the nail would be determined as ground point cloud data if only the height of the points were checked. Therefore, only point cloud data whose points have both a first included angle smaller than the first included angle threshold and a height smaller than the preset height threshold is determined as ground point cloud data, which improves the accuracy of the identification.
S4, the first device removes the ground point cloud data from the original point cloud data to obtain the target point cloud data.
It should be understood that, because the ground is not an obstacle, the ground point cloud data may interfere with the subsequent fitting clustering process. Therefore, in the process of identifying an obstacle, the ground point cloud data does not need to undergo the fitting clustering processing, and the first device can remove it from the original point cloud data to obtain the target point cloud data.
When the fitting processing is performed, the ground point cloud data would interfere with the plane-fitting process. Removing the ground point cloud data from the original point cloud data therefore effectively reduces the interference of the ground point cloud data with obstacle identification and improves the accuracy of the obtained obstacle point cloud data to be confirmed; at the same time, it reduces the amount of calculation spent on irrelevant ground point cloud data, improving the efficiency of the fitting processing and of identifying the obstacle point cloud data to be confirmed.
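The following sketch illustrates S3 and S4 together under the example thresholds above (first included angle threshold of 5 degrees, preset height threshold of 5 cm). The point cloud is assumed to be an N×3 NumPy array in the coordinate system of the self-moving device, with the angles computed as in the S2 sketch; the metre-based default is an illustrative assumption.

```python
import numpy as np

def remove_ground(points, angles, angle_thresh_deg=5.0, height_thresh=0.05):
    """Split the original point cloud into ground and target points (S3/S4).

    A point is treated as ground only if its normal is within
    `angle_thresh_deg` of the Z axis AND its height (Z component in the
    robot frame) is below `height_thresh` (0.05 m = 5 cm, assuming the
    coordinates are expressed in metres).
    """
    is_ground = (angles < angle_thresh_deg) & (points[:, 2] < height_thresh)
    ground_points = points[is_ground]        # ground point cloud data
    target_points = points[~is_ground]       # target point cloud data
    return ground_points, target_points
```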
In a possible implementation manner of the present application, the above S102 may include specific implementations of S5 to S8 in fig. 5:
and S5, carrying out plane fitting on the target point cloud data by the first equipment.
For example, the plane fitting may be performed on the target point cloud data by a random sample consensus (RANSAC) algorithm.
It should be understood that, in this embodiment, the RANSAC algorithm is taken as an example for fitting the target point cloud data; in actual use, the plane-fitting algorithm is not limited. For example, a least squares method may be used for the plane fitting, or a gray value interpolation algorithm may be used.
S7, when a fitting plane is obtained by fitting the target point cloud data, the first device determines, according to the fitting plane, first point cloud data located on the fitting plane and a second included angle of the fitting plane relative to a preset coordinate axis of the coordinate system of the self-moving device.
For example, point cloud data whose shortest distance to the fitting plane is smaller than an interior-point distance threshold during the fitting process may be referred to as interior-point data. When the target point cloud data is fitted, if the number of interior points is greater than a preset interior-point threshold, the target point cloud data may be considered to have been fitted to obtain a fitting plane, that is, the number of interior points of the fitting plane is greater than the preset interior-point threshold.
It should be understood that if the target area includes a slope, the target point cloud data includes the other point cloud data of the target area, excluding the ground point cloud data: for example, the point cloud data of the slope, the point cloud data of obstacles on the ground, and/or the point cloud data of obstacles on the slope.
Because a slope occupies a large volume in the target area, the number of point cloud data of the slope is large, and most of the slope point cloud data lies on the same plane. Since the fitting principle is that the fitting plane should contain as many interior points as possible, after the target point cloud data is plane-fitted, the obtained fitting plane can be regarded as the plane of the slope, and the first point cloud data on the fitting plane can be regarded as the point cloud data of the slope.
For example, when the interior-point threshold is set to eighty percent of the number of target point cloud data, a fitting plane is determined as a slope in the target area when its number of interior points is greater than eighty percent of the number of target point cloud data. If the target area is as shown in fig. 6, where the target area includes a slope 601, a tree on the ground, that is, an obstacle 602, and a stone on the slope, that is, an obstacle 603, then the target point cloud data includes the point cloud data corresponding to the slope 601, the obstacle 602, and the obstacle 603. Suppose the number of target point cloud data is 31000, the number of point cloud data corresponding to the slope 601 is 25990, and the number of point cloud data corresponding to the obstacles 602 and 603 is 4010. Performing plane fitting on the target point cloud data by the RANSAC algorithm yields a fitting plane 604 whose point cloud data is the point cloud data corresponding to the slope 601, and the number of interior points of the fitting plane is about 83.8% of the target point cloud data. It can therefore be seen that if the number of interior points in the fitting process is greater than or equal to the interior-point threshold, a fitting plane is obtained, that is, the target region includes a slope, and the fitting plane may be considered the plane of the slope.
Optionally, the interior-point threshold may be determined from the number of target point cloud data; for example, the interior-point threshold may be eighty percent of the number of target point cloud data.
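A minimal RANSAC sketch of S5 and S7 is shown below; the 3 cm interior-point distance threshold, the 200 iterations, and the eighty-percent interior-point ratio are illustrative assumptions consistent with the examples in the text, and any RANSAC implementation could be substituted.

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.03, iters=200, inlier_ratio=0.8, rng=None):
    """Fit a plane to the target point cloud by RANSAC (S5/S7 sketch).

    Returns (plane, inlier_mask) if the best plane has more interior points
    than `inlier_ratio` of the target points, otherwise (None, None),
    meaning no fitting plane (no slope) was obtained.
    """
    rng = rng or np.random.default_rng(0)
    best_mask, best_plane = None, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-9:           # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        d = -n @ sample[0]
        dist = np.abs(points @ n + d)           # point-to-plane distance
        mask = dist < dist_thresh               # interior points of this candidate
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_plane = mask, (n, d)
    if best_mask is not None and best_mask.sum() > inlier_ratio * len(points):
        return best_plane, best_mask            # fitting plane obtained (S7)
    return None, None                           # no plane fitted (S6)
```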
After the fitting plane is obtained, the first point cloud data on the fitting plane and a second included angle of the fitting plane relative to a preset coordinate axis on a coordinate system of the mobile device can be determined according to the fitting plane.
It should be understood that the second included angle is an included angle between the fitting plane and a Z axis in the coordinate system of the self-moving device, and the second included angle may be used to describe an included angle between the fitting plane and a plane where the self-moving device is located, and further, the second included angle may also be used to describe a slope of a slope in the target area.
After the second angle is obtained, the operation of S8 or S9 may be performed according to a relationship between the second angle and a second angle threshold.
S8, when the second included angle is greater than or equal to a second included angle threshold, the first device determines the first point cloud data as obstacle point cloud data.
In some embodiments, when the second included angle is greater than or equal to the second included angle threshold, that is, when the slope of the slope is greater than or equal to the second included angle threshold, the first device determines the point cloud data of the slope as the point cloud data of the obstacle, thereby determining the slope as the obstacle, and may perform obstacle avoidance processing on the obstacle.
For example, if the second included angle threshold is set to 45 degrees, then when the gradient of the slope is greater than or equal to 45 degrees, that is, when the included angle between the fitting plane corresponding to the slope and the Z axis of the coordinate system of the self-moving device reaches 45 degrees, the self-moving device cannot pass the slope, so the slope is regarded as an obstacle and an obstacle avoidance measure needs to be taken.
S9, when the second included angle is smaller than the second included angle threshold, the first device removes the first point cloud data from the target point cloud data to obtain second point cloud data.
In some embodiments, when the second included angle is smaller than the second included angle threshold, that is, when the gradient of the slope is smaller than the second included angle threshold, the slope is determined to be a passable slope; in other words, the point cloud data of the fitting plane is not obstacle point cloud data. In this case, the first point cloud data on the fitting plane is removed from the target point cloud data to obtain the second point cloud data.
For example, if the second included angle threshold is set to 45 degrees, then when the gradient of the slope is smaller than 45 degrees, the included angle between the fitting plane corresponding to the slope and the Z axis of the coordinate system of the self-moving device is smaller than 45 degrees; a slope with a gradient smaller than 45 degrees can be passed smoothly by the self-moving device, so the slope is not an obstacle. In this case, the first point cloud data on the fitting plane is removed from the target point cloud data to obtain the second point cloud data.
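The decision between S8 and S9 can be sketched as follows, using the plane and interior-point mask from the previous sketch. The text describes the second included angle both relative to the Z axis and as the gradient of the slope; this sketch assumes the gradient interpretation (the angle between the plane normal and the Z axis), which matches the 45-degree passability example, so a horizontal fitting plane yields 0 degrees and a vertical one 90 degrees.

```python
import numpy as np

def classify_fitted_plane(points, plane, inlier_mask, second_angle_thresh=45.0):
    """Decide whether the fitted plane (slope) is an obstacle (S8) or a
    passable slope whose points are removed (S9).
    """
    n, _d = plane
    z = np.array([0.0, 0.0, 1.0])
    # gradient of the fitted plane = angle between its normal and the Z axis
    cos_nz = abs(float(n @ z)) / np.linalg.norm(n)
    second_angle = np.degrees(np.arccos(np.clip(cos_nz, 0.0, 1.0)))
    if second_angle >= second_angle_thresh:
        # S8: the slope is too steep to pass; its points are obstacle points
        return points[inlier_mask], None
    # S9: passable slope, remove its points to obtain the second point cloud data
    return None, points[~inlier_mask]
```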
S10, the second point cloud data is clustered to obtain obstacle point cloud data to be confirmed.
After the second point cloud data is obtained, clustering processing is performed on the second point cloud data to obtain obstacle point cloud data to be confirmed, and the specific clustering processing process is realized by S11 to S14 in fig. 8.
Based on the above technical solution, because the robot is limited by a maximum slope angle when climbing, it usually performs obstacle avoidance processing when the object in front exceeds the maximum slope angle. Therefore, when plane fitting determines that the target area contains a slope and the second included angle of the slope is greater than or equal to the second included angle threshold, the slope angle exceeds the maximum slope angle, the robot cannot pass the slope, and the slope is identified as an obstacle. Conversely, when the second included angle of the slope is smaller than the second included angle threshold, the robot can pass the slope, and the point cloud data of the fitting plane is regarded as non-obstacle point cloud data. Removing the point cloud data of the fitting plane from the target point cloud data yields more accurate obstacle point cloud data to be confirmed. This solves the technical problem that the self-moving device may misidentify a passable slope as an obstacle, causing misjudgment and low working efficiency; it achieves the technical effect that only a slope the robot cannot pass is identified as an obstacle, and improves the working efficiency of the self-moving device.
In one possible embodiment of the present application, after S5, the method provided in the embodiments of the present application further includes S6 in fig. 5: when no plane is obtained by fitting, the first device takes the target point cloud data as the second point cloud data, and performs clustering processing on the second point cloud data to obtain obstacle point cloud data to be confirmed.
It should be understood that, in the fitting process, a plane is obtained only when the number of interior points is greater than the preset interior-point threshold. If the target area does not include a slope, the target point cloud data is the point cloud data of one or more obstacles in the target area other than the ground. Because the heights of these obstacles generally differ and their distribution follows no fixed rule, the target point cloud data is relatively dispersed in the coordinate system of the self-moving device; when plane fitting is performed on it, the number of interior points of any candidate plane cannot reach the preset interior-point threshold, so no fitting plane is obtained.
For example, the interior-point threshold is set to eighty percent of the number of target point cloud data, that is, when the number of interior points in the fitting process is smaller than eighty percent of the number of target point cloud data, no plane is obtained by fitting, and the target area does not include a slope. If the target area is as shown in fig. 7, the target area does not include a slope; it includes the ground 701 and five obstacles, namely the obstacle 702, the obstacle 703, the obstacle 704, the obstacle 705, and the obstacle 706. The number of point cloud data corresponding to the obstacle 702 is 1030, to the obstacle 703 is 890, to the obstacle 704 is 2305, to the obstacle 705 is 760, and to the obstacle 706 is 861, so the number of target point cloud data is 5846. When the RANSAC algorithm is used to perform plane fitting on the target point cloud data, the maximum number of interior points obtained is about 30.2% of the target point cloud data, so no plane can be fitted. It can therefore be seen that if the number of interior points in the plane-fitting process is smaller than the interior-point threshold, no plane can be fitted, and it can be determined that the target region does not include a slope.
Based on this technical solution, when no plane is obtained by fitting, the target area contains no slope, and therefore there is no slope point cloud data to interfere with the clustering processing.
In a possible embodiment of the present application, in S10, the clustering process is performed on the second point cloud data to obtain obstacle point cloud data to be confirmed, which may be specifically implemented by S11 to S14 in fig. 8:
and S11, clustering the second point cloud data by the first equipment to obtain different cluster groups.
In some embodiments, after the second point cloud data is clustered, one or more different cluster groups are obtained, where the cluster groups are multiple point cloud data obtained by clustering according to preset obstacle categories, and the number of points of the point cloud data included in different cluster groups is different.
Specifically, when the second point cloud data is clustered, the clustering algorithm is not limited. For example, the second point cloud data may be clustered using a k-means clustering algorithm, a Gaussian mixture model algorithm, an expectation-maximization algorithm, or the like.
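As one possible instance of S11, the sketch below clusters the second point cloud data with k-means, one of the algorithms named above; the number of clusters and the use of scikit-learn are illustrative assumptions, and the cluster count could instead be derived from the preset obstacle categories.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_second_points(second_points, n_clusters=5):
    """Cluster the second point cloud data into candidate groups (S11 sketch)."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(second_points)
    # one list of points per cluster group
    groups = [second_points[labels == c] for c in range(n_clusters)]
    return groups
```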
S12, the first device determines suspicious obstacle cluster groups from the cluster groups according to the number of points of the point cloud data corresponding to each cluster group.
It should be understood that if an obstacle exists in the target area, the number of points of the point cloud data in its cluster group exceeds a certain threshold; therefore, a cluster group whose number of points exceeds the threshold is determined as a suspicious obstacle cluster group.
After the suspicious obstacle cluster group is determined, the height of the cluster group above its nearest plane is determined, and S13 or S14 is performed accordingly:
and S13, when the height of the plane closest to the suspicious obstacle cluster exceeds a third height threshold, the first equipment marks the point cloud data corresponding to the suspicious obstacle cluster as the point cloud data of the obstacle.
Optionally, when the plane on which the suspicious obstacle cluster group is located is the ground, the height of the cluster group above its nearest plane is the distance between the points of the point cloud data in the cluster group and the ground; when the plane on which the suspicious obstacle cluster group is located is the slope, that height is the distance between the points of the point cloud data in the cluster group and the slope.
For example, if the third height threshold is set to 20 cm, then when the height of the points in the suspicious obstacle cluster group above the nearest plane exceeds 20 cm, the point cloud data corresponding to the cluster group is marked as obstacle point cloud data.
Optionally, when the height of the suspicious obstacle cluster group above its nearest plane is determined and the target region includes a slope, the determination may be performed by mode 1:
mode 1 may be: when the target area comprises a slope, respectively calculating the distance between the suspicious obstacle cluster group and the ground and the distance between the suspicious obstacle cluster group and a slope plane, and determining the smaller distance value as the height of the plane with the nearest distance between the suspicious obstacle cluster group and the slope plane.
For example, if the distance between the suspicious obstacle and the ground is 3 meters, and the distance between the suspicious obstacle cluster and the plane of the slope is 50 centimeters, the height of the suspicious obstacle cluster to the nearest plane is 50 centimeters.
Optionally, when the height of the suspicious obstacle cluster group above its nearest plane is determined and the target region does not include a slope, the determination may be performed by mode 2:
the mode 2 may be: and when the target area does not comprise the slope, determining the distance between the suspicious obstacle cluster group and the ground as the height of the plane with the nearest distance to the suspicious obstacle cluster group.
Specifically, when the self-moving device is located on the ground, the distance from the suspicious obstacle cluster group to the ground may be determined in any one of modes 3 to 7:
mode 3 may be: determining the coordinate of a center point of a suspicious obstacle cluster group according to the coordinates of a plurality of point cloud data in the suspicious obstacle cluster group under the coordinate system of the self-moving equipment, determining the distance between the suspicious obstacle cluster group and the ground according to the coordinate of the center point, and determining the shortest distance between the center point and the ground as the distance between the suspicious obstacle cluster group and the ground.
It should be understood that, when the coordinate of the center point of the suspicious obstacle cluster group is determined, the coordinates of all the point cloud data in the cluster group may be averaged to obtain the coordinate of the center point.
For example, the suspicious obstacle cluster group includes 10 point cloud data, whose coordinates in the coordinate system of the self-moving device are (13, 54, 23), (15, 56, 44), (13, 51, 37), (16, 53, 22), (14, 54, 34), (16, 52, 71), (17, 53, 41), (19, 55, 35), (12, 49, 29), and (18, 53, 49); the coordinate of the center point is (15.3, 53, 38.5). Where the length corresponding to one unit in the coordinate system of the self-moving device is 1 cm, the distance from the suspicious obstacle cluster group to the ground is 38.5 cm.
Mode 4 may be: according to the coordinates of the point cloud data in the suspicious obstacle cluster group in the coordinate system of the self-moving device, the point with the highest height in the cluster group is determined, and the shortest distance from that point to the ground is determined as the distance from the cluster group to the ground.
For example, the suspicious obstacle cluster group includes 10 point cloud data, whose coordinates in the coordinate system of the self-moving device are (13, 54, 23), (15, 56, 44), (13, 51, 37), (16, 53, 22), (14, 54, 34), (16, 52, 71), (17, 53, 41), (19, 55, 35), (12, 49, 29), and (18, 53, 49); the coordinate of the point with the highest height is (16, 52, 71). Where the length corresponding to one unit in the coordinate system of the self-moving device is 1 cm, the distance from the suspicious obstacle cluster group to the ground is 71 cm.
Mode 5 may be: the average value of the height components of the plurality of point cloud data in the suspicious obstacle cluster group in the coordinate system of the self-moving device is calculated and determined as the distance from the cluster group to the ground.
For example, the suspicious obstacle cluster group includes 10 point cloud data whose heights in the coordinate system of the self-moving device, that is, whose Z-axis components, are 23, 44, 37, 22, 34, 71, 41, 35, 29, and 49; the average value of these components is 38.5. Where the length corresponding to one unit in the coordinate system of the self-moving device is 1 cm, the distance from the suspicious obstacle cluster group to the ground is 38.5 cm.
Mode 6 may be: the maximum value of the height components of the plurality of point cloud data in the suspicious obstacle cluster group in the coordinate system of the self-moving device is determined as the distance between the cluster group and the ground.
For example, the suspicious obstacle cluster group includes 10 point cloud data whose heights in the coordinate system of the self-moving device, that is, whose Z-axis components, are 23, 44, 37, 22, 34, 71, 41, 35, 29, and 49; the maximum value of these components is 71. Where the length corresponding to one unit in the coordinate system of the self-moving device is 1 cm, the distance from the suspicious obstacle cluster group to the ground is 71 cm.
Mode 7 may be: the median of the height components of the point cloud data in the suspicious obstacle cluster group in the coordinate system of the self-moving device, that is, of their Z-axis components, is determined as the distance between the cluster group and the ground.
For example, the suspicious obstacle cluster group includes 10 point cloud data whose Z-axis components in the coordinate system of the self-moving device are 23, 44, 37, 22, 34, 71, 41, 35, 29, and 49; the median is 36. Where the length corresponding to one unit in the coordinate system of the self-moving device is 1 cm, the distance from the suspicious obstacle cluster group to the ground is 36 cm.
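The height of a suspicious obstacle cluster group above its nearest plane can be sketched as below, combining modes 1 to 7. The ground distance is read from the Z components in the coordinate system of the self-moving device, which assumes the device stands on the ground; the choice of the mean as the default statistic is illustrative.

```python
import numpy as np

def cluster_height_to_nearest_plane(cluster_points, slope_plane=None):
    """Height of a suspicious obstacle cluster group above its nearest plane.

    Ground distance comes from the Z components in the robot frame; when a
    slope plane (n, d) was fitted, the smaller of the ground and slope
    distances is used (mode 1), otherwise the ground distance alone (mode 2).
    """
    z = cluster_points[:, 2]
    ground_options = {
        "centre_point":  float(cluster_points.mean(axis=0)[2]),  # mode 3: Z of the centre point
        "highest_point": float(z.max()),                          # mode 4: Z of the highest point
        "mean_height":   float(z.mean()),                         # mode 5: mean of the Z components
        "max_height":    float(z.max()),                          # mode 6: maximum Z component
        "median_height": float(np.median(z)),                     # mode 7: median Z component
    }
    ground_dist = ground_options["mean_height"]                   # pick one statistic
    if slope_plane is None:
        return ground_dist                                         # mode 2
    n, d = slope_plane
    slope_dist = float(np.abs(cluster_points @ n + d).mean())      # distance to the slope plane
    return min(ground_dist, slope_dist)                            # mode 1: the smaller value
```

Note that in this sketch modes 3 and 5 coincide (the Z component of the centre point equals the mean of the Z components), as do modes 4 and 6; they are listed separately only to mirror the modes above.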
S14, when the height of the suspicious obstacle cluster group above its nearest plane is smaller than the third height threshold, the first device determines the point cloud data corresponding to the suspicious obstacle cluster group as obstacle point cloud data to be confirmed.
Based on the above technical solution, after the suspicious obstacle cluster groups are determined, the point cloud data corresponding to a cluster group whose height above its nearest plane is greater than or equal to the third height threshold is directly determined as obstacle point cloud data, and only the point cloud data corresponding to a cluster group whose height is smaller than the third height threshold is determined as obstacle point cloud data to be confirmed and further analyzed in combination with color information. This reduces the amount of calculation required to identify the obstacle point cloud data, and directly treating suspicious obstacles whose height exceeds the third height threshold as obstacles improves the accuracy of obstacle identification.
It should be understood that the height of the suspicious obstacle cluster group above its nearest plane may be determined with reference to the above modes; for brevity, the details are not described here again.
In a possible embodiment of the present application, S103 may be specifically implemented by:
optionally, a camera is mounted on the mobile device and used for acquiring image information of a plane where the area corresponding to the obstacle point cloud data to be confirmed is located, and image information obtained by shooting with the camera mounted on the mobile device is sent to the first device, so that the first device acquires the image information of the plane where the area corresponding to the obstacle point cloud data to be confirmed is located.
For example, after acquiring the image information, the first device may perform color extraction on the image information to obtain a plurality of pieces of color information, and screen out the dominant color information from the plurality of pieces of color information as the first color information.
Alternatively, the image information may be an RGB image.
For example, when color extraction is performed on the image information to obtain a plurality of pieces of color information and the dominant color information is screened out as the first color information, the first color information may be determined as follows:
for example, a k-means clustering algorithm is used to extract a first color from image information of a target region, an extraction termination condition and a maximum iteration number are set before extraction, and a cluster center selection mode is set.
For example, the termination condition may be that the number of points of the point cloud data reassigned to different clusters is less than 3 at the next clustering, or that the cluster center of less than 2 cluster groups is changed at the next clustering.
It should be understood that determining the first color information by k-means clustering is only taken as an example here; in practical applications, the algorithm used to extract colors from the image is not limited and may be any color-extraction algorithm, for example, a deep learning algorithm or a median segmentation method.
Optionally, the first color information may be the color number of a color included in the plane or the change rule of the colors.
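As one possible sketch of the dominant-color extraction described above, the pixels of the image region can be clustered in RGB space with k-means and the centre of the largest cluster taken as the first color information; the number of color clusters and the use of scikit-learn are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def dominant_color(rgb_region, n_colors=3):
    """Extract the color with the largest area ratio in an image region (S103 sketch)."""
    pixels = rgb_region.reshape(-1, 3).astype(np.float64)
    km = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels)
    counts = np.bincount(km.labels_, minlength=n_colors)
    # the cluster containing the most pixels gives the dominant color
    return km.cluster_centers_[np.argmax(counts)]
```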
Based on this technical solution, the image information of the plane where the area corresponding to the obstacle point cloud data to be confirmed is located may contain multiple colors, that is, interfering color information may exist. Screening out the dominant color information from the multiple pieces of color information as the first color information effectively reduces the interference and improves the accuracy of obstacle identification.
In a possible embodiment of the present application, S104 may be specifically implemented by the following steps:
after the obstacle to be confirmed is obtained, according to the point cloud data in the cluster group corresponding to the obstacle to be confirmed, an area corresponding to the cluster group corresponding to the obstacle to be confirmed is found in the corresponding RGB image, and the color information of the area is determined to be second color information.
Alternatively, the second color information may be a color number of a color included in the region or a change rule of the color.
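One way to locate the image area corresponding to a cluster group, assuming the RGB image is registered with the depth image so that the intrinsics of formulas (1) and (2) apply, is to project the cluster points back to pixel coordinates; the sketch below returns a boolean mask of that area, whose dominant color then gives the second color information. The function name and the registration assumption are illustrative, not part of the embodiments.

```python
import numpy as np

def cluster_region_mask(cluster_points_cam, image_shape, fx, fy, lx, ly):
    """Mask of the RGB-image area corresponding to a cluster group (S104 sketch).

    The cluster points are assumed to be expressed in the camera frame.
    """
    h, w = image_shape[:2]
    x, y, zc = cluster_points_cam.T
    valid = zc > 0                                              # points in front of the camera
    u = np.round(x[valid] * fx / zc[valid] + lx).astype(int)    # inverse of formula (1)
    v = np.round(y[valid] * fy / zc[valid] + ly).astype(int)    # inverse of formula (2)
    mask = np.zeros((h, w), dtype=bool)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    mask[v[inside], u[inside]] = True
    return mask
```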
In a possible embodiment of the present application, S105 may specifically be implemented by the following steps:
and calculating a color difference value of the second color information and the first color information, and when the color difference value is greater than a preset color difference threshold value, using the obstacle point cloud data to be confirmed as obstacle point cloud data.
For example, when the color difference value between the second color information and the first color information is calculated, the image may be converted into the HSV color space or the LAB color space, and the color difference value is then calculated in that space.
It should be understood that the first color information represents the color information of the plane where the obstacle point cloud data to be confirmed is located, and the second color information represents the color information of the area where the obstacle point cloud data to be confirmed is located. Since there is a certain color difference between the color of an obstacle and the color of the plane on which the obstacle is located, the obstacle point cloud data to be confirmed can be taken as obstacle point cloud data when the calculated color difference value between the second color information and the first color information is greater than the preset color difference threshold.
For example, when the self-moving device mows a lawn and a gray stone lies on the lawn, the point cloud data corresponding to the gray stone is obtained as the obstacle point cloud data to be confirmed; the plane where the stone's point cloud data is located is the lawn, and the area where it is located is the stone itself. In this case, the first color information is green and the second color information is gray. Since the color difference between gray and green exceeds the preset color difference threshold, the point cloud data of the gray stone is taken as obstacle point cloud data, so that the obstacle is accurately identified.
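A sketch of the color comparison in S105 is shown below; it converts both colors to the LAB color space with scikit-image and uses the CIE76 color difference, the threshold of 20 being an illustrative assumption for the preset color difference threshold.

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_cie76

def colors_mismatch(first_rgb, second_rgb, delta_thresh=20.0):
    """Return True when the second color does not match the first color (S105 sketch).

    RGB values are assumed to be in the 0-255 range.
    """
    lab1 = rgb2lab(np.asarray(first_rgb, dtype=float).reshape(1, 1, 3) / 255.0)
    lab2 = rgb2lab(np.asarray(second_rgb, dtype=float).reshape(1, 1, 3) / 255.0)
    delta = float(deltaE_cie76(lab1, lab2)[0, 0])      # CIE76 color difference value
    return delta > delta_thresh                        # True -> obstacle point cloud data
```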
By calculating the color difference value between the second color information and the first color information and taking the obstacle point cloud data to be confirmed as obstacle point cloud data when the color difference value is greater than the preset color difference threshold, short obstacles can be accurately identified; in addition, when the color of a passable slope is the same as the color of the ground, the slope is prevented from being identified as an obstacle, so the accuracy of obstacle identification is effectively improved.
In some embodiments, the first color information may also be the color change rule of the plane where the obstacle point cloud data to be confirmed is located, and the second color information may be the color change rule of the area where the point cloud data to be confirmed is located. In this case, if the similarity between the two color change rules is higher than a certain threshold, the color difference between the second color information and the first color information is considered lower than the corresponding threshold, and the obstacle to be confirmed may be considered not to be an obstacle.
In some embodiments of the application, after the suspicious obstacle cluster groups are determined, the obstacle point cloud data to be confirmed may first be determined according to the height of each cluster group above its nearest plane, and then the obstacle point cloud data to be confirmed whose second color information does not match the first color information may be confirmed as obstacle point cloud data according to the first color information of the plane where the point cloud data to be confirmed is located and the second color information of the area where it is located.
In other embodiments of the application, after the suspicious obstacle cluster groups are determined, a cluster group whose plane color information does not match its area color information may first be determined as obstacle point cloud data to be confirmed according to the color information of the plane where the cluster group is located and the color information of the area where it is located, and then the obstacle point cloud data to be confirmed whose height above its nearest plane is greater than the preset third height threshold may be determined as obstacle point cloud data.
Fig. 9 is a schematic block diagram of an obstacle identification apparatus 900 according to an embodiment of the present application, and includes an obtaining unit 901, a processing unit 902, and a confirming unit 903.
The acquiring unit 901 is configured to acquire target point cloud data in a target area, where the target point cloud data does not include ground point cloud data.
The processing unit 902 is configured to perform fitting clustering processing on the target point cloud data to obtain obstacle point cloud data to be confirmed.
The obtaining unit 901 is further configured to obtain first color information of a plane where the obstacle point cloud data to be confirmed is located.
The obtaining unit 901 is further configured to obtain second color information of an area where the to-be-confirmed obstacle point cloud data is located.
A confirming unit 903, configured to confirm that the obstacle point cloud data to be confirmed is obstacle point cloud data when the second color information does not match the first color information.
Optionally, the obtaining unit 901 is further configured to obtain original point cloud data, where the original point cloud data is point cloud data based on the coordinate system of the self-moving device.
Optionally, the obtaining unit 901 is further configured to obtain a normal vector corresponding to each point in the original point cloud data, and a first included angle between the normal vector and a preset coordinate axis on the coordinate system.
Optionally, the confirming unit 903 is further configured to use the point cloud data to which the corresponding point belongs as the ground point cloud data when the first included angle of the point in the original point cloud data is smaller than the first included angle threshold and the height of the point is smaller than the preset height threshold.
Optionally, the processing unit 902 is further configured to remove the ground point cloud data from the original point cloud data to obtain target point cloud data.
Optionally, the processing unit 902 is further configured to, when the target point cloud data is fitted to obtain a fitting plane, determine, according to the fitting plane, first point cloud data located on the fitting plane and a second included angle between the fitting plane and a preset coordinate axis on the coordinate system of the mobile device.
Optionally, the confirming unit 903 is further configured to determine the first point cloud data as obstacle point cloud data when the second included angle is greater than or equal to a second included angle threshold.
Optionally, the processing unit 902 is further configured to remove the first point cloud data from the target point cloud data when the second included angle is smaller than a second included angle threshold, so as to obtain second point cloud data.
Optionally, the processing unit 902 is further configured to perform clustering on the second point cloud data to obtain obstacle point cloud data to be confirmed.

Optionally, the confirming unit 903 is further configured to use the target point cloud data as the second point cloud data when a plane is not obtained through fitting.

Optionally, the confirming unit 903 is further configured to perform clustering on the second point cloud data to obtain different cluster groups, where the cluster groups are multiple point cloud data obtained by clustering according to preset obstacle categories.

Optionally, the confirming unit 903 is further configured to determine a suspicious obstacle cluster group from the cluster groups according to the number of points of the point cloud data corresponding to each cluster group.

Optionally, the confirming unit 903 is further configured to mark the point cloud data corresponding to the suspicious obstacle cluster group as obstacle point cloud data when the height of the suspicious obstacle cluster group above its nearest plane exceeds a third height threshold.
Optionally, the confirming unit 903 is further configured to determine the point cloud data corresponding to the suspicious obstacle cluster group as the obstacle point cloud data to be confirmed when the height of the plane closest to the suspicious obstacle cluster group is smaller than the third height threshold.
Optionally, the apparatus further includes a calculating unit, configured to calculate a color difference value between the second color information and the first color information.
Optionally, the confirming unit 903 is further configured to use the obstacle point cloud data to be confirmed as the obstacle point cloud data when the color difference value is greater than a preset color difference threshold.
Optionally, the obtaining unit 901 is further configured to obtain image information of a plane where the area corresponding to the obstacle point cloud data to be confirmed is located.
Optionally, the processing unit 902 is further configured to perform color extraction on the image information to obtain a plurality of color information.
Optionally, the confirming unit 903 is further configured to screen out main color information from the plurality of pieces of color information as the first color information.
Optionally, the obtaining unit 901 is further configured to obtain laser radar point cloud data acquired by the self-moving device.
Optionally, the processing unit 902 is further configured to convert the laser radar point cloud data into the coordinate system of the self-moving device to obtain the original point cloud data.
It should be understood that the apparatus 900 of the embodiments of the present application may be implemented by an application-specific integrated circuit (ASIC), or a Programmable Logic Device (PLD), which may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof. The method for identifying an obstacle shown in fig. 1 may also be implemented by software, and when the method for identifying an obstacle shown in fig. 1 is implemented by software, the apparatus 900 and its various modules may also be software modules.
Fig. 10 is a schematic structural diagram of an apparatus according to an embodiment of the present application. As shown in fig. 10, the device 1000 includes a processor 1001, a memory 1002, a communication interface 1003 and a bus 1004. The processor 1001, the memory 1002, and the communication interface 1003 communicate with each other via the bus 1004, and may also communicate with each other by other means such as wireless transmission. The memory 1002 is used for storing instructions and the processor 1001 is used for executing the instructions stored by the memory 1002. The memory 1002 stores the program code 10021, and the processor 1001 may call the program code 10021 stored in the memory 1002 to execute the method of identifying an obstacle shown in fig. 1.
It should be understood that, in the embodiments of the present application, the processor 1001 may be a CPU, or may be another general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor or any conventional processor.
The memory 1002 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1001. The memory 1002 may also include a non-volatile random access memory. The memory 1002 may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
The bus 1004 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. However, for clarity of illustration, the various buses are labeled in fig. 10 as the bus 1004.
It should be understood that the device 1000 according to the embodiment of the present application may correspond to the apparatus 900 in the embodiment of the present application, and may correspond to the first device in the method shown in fig. 1. When the device 1000 corresponds to the first device in the method shown in fig. 1, the above and other operations and/or functions of each module in the device 1000 are respectively for implementing the operation steps of the method executed by the first device in fig. 1, which are not described herein again for brevity.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the foregoing method embodiments.
The embodiments of the present application further provide a computer program product which, when run on a self-moving device, enables the self-moving device to implement the steps in the above-mentioned method embodiments.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.