WO2023193567A1 - Robot movement control method and apparatus, storage medium and electronic apparatus - Google Patents

Robot movement control method and apparatus, storage medium and electronic apparatus

Info

Publication number
WO2023193567A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
robot
cloud data
point cloud
gap
Prior art date
Application number
PCT/CN2023/080705
Other languages
English (en)
French (fr)
Inventor
张陆涵
曹蒙
崔凌
Original Assignee
追觅创新科技(苏州)有限公司
Priority date
Filing date
Publication date
Application filed by 追觅创新科技(苏州)有限公司
Publication of WO2023193567A1 publication Critical patent/WO2023193567A1/zh

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the present disclosure relates to the field of robots, and specifically to a movement control method and device for a robot, a storage medium and an electronic device.
  • the robot movement control method in the related art suffers from time-consuming movement control because full-map information is used to search for gaps.
  • the purpose of this disclosure is to provide a robot movement control method and device, a storage medium and an electronic device, so as to at least solve the problem of time-consuming movement control caused by using full-map information to search for gaps in the robot movement control method in the related art.
  • a method for controlling movement of a robot is provided, including: obtaining point cloud data of the spatial environment where the target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points; performing a clustering operation on the target point cloud data to obtain a set of clusters, wherein each cluster in the set of clusters contains at least one of the multiple target points; selecting a target gap from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and controlling the target robot to move in the direction of the target gap.
  • obtaining the point cloud data of the spatial environment where the target robot is located to obtain the target point cloud data includes: in the process of controlling the target robot to rotate, obtaining the point cloud data of the spatial environment where the target robot is located through a laser sensor on the target robot, to obtain the target point cloud data.
  • performing a clustering operation on the target point cloud data to obtain a set of clusters includes: determining the distance between each target point of the multiple target points and each reference ray in a set of reference rays, wherein each reference ray corresponds to a reference angle; determining, according to the distance between each target point and each reference ray, the target points matched by each reference ray; and determining the target points matched by each reference ray as one cluster respectively, to obtain the set of clusters.
  • determining the distance between each target point of the multiple target points and each reference ray in the set of reference rays includes: projecting each target point onto each reference ray respectively, to obtain the distance between each target point and each reference ray.
  • performing a clustering operation on the target point cloud data to obtain a set of clusters includes: downsampling the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each target point of the multiple target points; and performing a clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point, to obtain the set of clusters.
  • performing a clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point to obtain the set of clusters includes: determining the grid in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a set of target grids, wherein each target grid in the set of target grids contains a two-dimensional point corresponding to at least one of the multiple target points; and performing a clustering operation on the multiple target points according to the two-dimensional points contained in each target grid and the neighborhood information of each target grid, to obtain the set of clusters.
  • selecting a target gap from the gaps between adjacent clusters in the set of clusters includes: determining the distance between each pair of adjacent clusters as the size of the gap between those adjacent clusters; and selecting, from the gaps between adjacent clusters, the gap with the largest size to obtain the target gap.
  • a movement control device for a robot is provided, including: an acquisition unit, configured to acquire point cloud data of the spatial environment where the target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points; a clustering unit, configured to perform a clustering operation on the target point cloud data to obtain a set of clusters, wherein each cluster in the set of clusters contains at least one of the multiple target points; a selection unit, configured to select a target gap from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and a control unit, configured to control the target robot to move in the direction of the target gap.
  • the acquisition unit includes: an acquisition module, configured to acquire the spatial environment where the target robot is located through a laser sensor on the target robot during the process of controlling the rotation of the target robot. point cloud data to obtain the target point cloud data.
  • the clustering unit includes: a first determination module, configured to determine the distance between each target point of the multiple target points and each reference ray in a set of reference rays, wherein each reference ray corresponds to a reference angle; a second determination module, configured to determine, according to the distance between each target point and each reference ray, the target points matched by each reference ray; and a third determination module, configured to determine the target points matched by each reference ray as one cluster respectively, to obtain the set of clusters.
  • the first determination module includes: a projection submodule, configured to project each target point onto each reference ray respectively, to obtain the distance between each target point and each reference ray.
  • the clustering unit includes: a downsampling module, configured to downsample the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each target point of the multiple target points; and a clustering module, configured to perform a clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point, to obtain the set of clusters.
  • the clustering module includes: a determination submodule, configured to determine the grid in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a set of target grids, wherein each target grid in the set of target grids contains a two-dimensional point corresponding to at least one of the multiple target points; and an execution submodule, configured to perform a clustering operation on the multiple target points according to the two-dimensional points contained in each target grid and the neighborhood information of each target grid, to obtain the set of clusters.
  • the selection unit includes: a fourth determination module, configured to determine the distance between each pair of adjacent clusters as the size of the gap between those adjacent clusters; and a selection module, configured to select, from the gaps between adjacent clusters, the gap with the largest size to obtain the target gap.
  • a computer-readable storage medium is provided, in which a computer program is stored, wherein the computer program is configured to execute the above robot movement control method when run.
  • an electronic device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the above robot movement control method through the computer program.
  • the point cloud data of the space environment where the robot is located is clustered, and the gaps that allow the robot to pass are selected from the gaps between adjacent clusters.
  • the point cloud data of the spatial environment is obtained to obtain the target point cloud data, in which the target point cloud data contains multiple target points; a clustering operation is performed on the target point cloud data to obtain a set of clusters, in which each cluster in the set contains at least one of the multiple target points; the target gap is selected from the gaps between adjacent clusters in the set of clusters, where the target gap is a gap that allows the target robot to pass; and the target robot is controlled to move in the direction of the target gap.
  • since the point cloud data of the spatial environment is clustered and the gap that allows the robot to pass is selected from the gaps between adjacent clusters, instead of searching for gaps with full-map information, the search range of the robot can be reduced, achieving the technical effect of reducing the time consumed by movement control and improving the efficiency of space exploration, thereby solving the problem of time-consuming movement control caused by using full-map information to search for gaps in robot movement control methods in the related art.
  • Figure 1 is a schematic diagram of the hardware environment of an optional robot movement control method according to an embodiment of the present disclosure
  • Figure 2 is a schematic flowchart of an optional robot movement control method according to an embodiment of the present disclosure
  • Figure 3 is a schematic flowchart of another optional robot movement control method according to an embodiment of the present disclosure.
  • Figure 4 is a structural block diagram of an optional robot movement control device according to an embodiment of the present disclosure.
  • FIG. 5 is a structural block diagram of an optional electronic device according to an embodiment of the present disclosure.
  • a movement control method of a robot is provided.
  • the above-mentioned robot movement control method can be applied to the hardware environment composed of the robot 102 and the server 104 as shown in FIG. 1 .
  • the robot 102 can be connected to a server 104 (for example, an Internet of Things platform or a cloud server) through a network to control the robot 102 .
  • the above-mentioned network may include but is not limited to at least one of the following: wired network, wireless network.
  • the above-mentioned wired network may include but is not limited to at least one of the following: wide area network, metropolitan area network, and local area network.
  • the above-mentioned wireless network may include at least one of the following: WIFI (Wireless Fidelity), Bluetooth, and infrared.
  • the robot 102 may include but is not limited to: cleaning robots, such as sweeping robots, floor washing robots, automatic mop washing robots, self-cleaning robots, etc.
  • the server 104 may be a server of an Internet of Things platform.
  • the movement control method of the robot in the embodiment of the present disclosure can be executed by the robot 102 or the server 104 individually, or executed jointly by the robot 102 and the server 104.
  • the robot 102 may also perform the movement control method of the robot according to the embodiment of the present disclosure by a client installed thereon.
  • Figure 2 is a schematic flowchart of an optional robot movement control method according to an embodiment of the present disclosure. As shown in Figure 2, the process of the method can include the following steps:
  • Step S202 Obtain point cloud data of the spatial environment where the target robot is located to obtain target point cloud data, where the target point cloud data contains multiple target points.
  • the movement control method of the robot in this embodiment can be applied to a scenario in which a brief exploration of an unknown environment is realized through movement control of the robot.
  • the above-mentioned robot can be a cleaning robot (that is, a robot with a cleaning function, which can be a floor sweeping robot or a floor washing robot), and its corresponding unknown environment can be an area to be cleaned; it can be a flying robot, whose corresponding unknown environment can be an area to be detected; or it can be another type of robot, which is not limited here.
  • the above-mentioned brief exploration can be a mapping operation performed by the robot on the spatial environment where it is located, so as to avoid obstacles and the like.
  • the robot is equipped with a laser radar (lidar), which can be a radar system that detects characteristics such as the position and speed of a target by emitting laser beams. Its working principle is: emit a detection signal (laser beam) toward the target and receive the reflected signal (target echo) reflected back from the target; compare the received reflected signal with the transmitted signal, and after appropriate processing, relevant information about the target can be obtained, such as target distance, orientation, height, speed, attitude and even shape parameters, so as to detect, track and identify the target.
  • Lidar can include a laser transmitter, an optical receiver, a turntable and an information processing system. The laser transmitter can convert electrical pulses into light pulses and send them out.
  • the optical receiver can convert the light pulses reflected back from the target into electrical pulses.
  • by processing the restored electrical pulses, the point cloud data of the target can be obtained, and the obtained point cloud data can be sent to a display for display (it can be displayed in the form of a point cloud).
  • the target robot can emit a laser beam to the space environment where it is located.
  • the optical receiver on the target robot can restore the light pulses reflected back from objects into electrical pulses.
  • by parsing the acquired electrical pulses, the point cloud data of the spatial environment where the robot is located, i.e., the target point cloud data, is obtained.
  • the above target point cloud data may contain multiple target points.
  • Step S204 Perform a clustering operation on the target point cloud data to obtain a set of clusters, where each cluster in the set of clusters contains at least one target point among the plurality of target points.
  • a clustering operation can be performed on the above target point cloud data to obtain a set of clusters, where each cluster in the set of clusters can contain at least one target point among multiple target points.
  • the obtained set of clusters may include at least two clusters, and each cluster may correspond to at least one obstacle in the spatial environment where the target robot is located.
  • the clustering operation on the target point cloud data can be: clustering the multiple target points based on the distances between different target points in the target point cloud data.
  • one or more clustering methods can be used, which can include but are not limited to: clustering methods that specify the number of clusters (for example, K-means clustering) and clustering methods that do not specify the number of clusters (for example, hierarchical clustering).
  • K-means clustering can select the number of clusters for the scattered point set of the target point cloud data, randomly initialize the center points, and iteratively reduce the distance between points within a cluster while increasing the distance between points of different clusters, thus forming a set of clusters.
  • hierarchical clustering can first treat each target point as a separate cluster; if the target point cloud data contains X target points, X clusters are obtained. Then, in each iteration, the two clusters that meet the merging condition are merged into one according to the distance between them. When the iteration termination condition is met, the resulting clusters form the set of clusters.
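  • as a concrete illustration of the K-means variant mentioned above, the following is a minimal sketch in Python/NumPy; the number of clusters, the 2-D point array and the iteration count are illustrative assumptions and not values taken from the disclosure:

```python
import numpy as np

def kmeans_clusters(points, k, iterations=50, seed=0):
    """Minimal K-means over 2-D laser points: pick k, initialize centers
    randomly, then alternate assignment and center update (sketch only)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iterations):
        # distance of every point to every center, shape (n_points, k)
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of the points currently assigned to it
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return [points[labels == j] for j in range(k)]
```

  • a call such as kmeans_clusters(points, k=4) would return four groups of points that can stand in for the clusters discussed below.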
  • the above clustering operation can also be implemented based on a grid.
  • the two-dimensional space can first be divided into grid cells, the target point cloud data can then be mapped into the grid, the point-set density in each grid cell can be calculated, the grid cells can be classified according to a preset threshold, and clusters can be formed with adjacent groups of grid cells.
  • Step S206 Select a target gap from the gaps between each adjacent cluster in a set of clusters, where the target gap is a gap that allows the target robot to pass.
  • the gap here may correspond to the distance between obstacles, and it may be the opening direction of an open space in the spatial environment, for example, the spacing between two obstacles, the open door of a room, and the like.
  • the gap that allows the robot to pass can be selected from the gaps between each adjacent cluster in a set of clusters to obtain the target gap.
  • the selected target gap is the gap that the target robot wants to pass.
  • the above target gap may be the largest gap among all the gaps that the target robot can pass, it may be any gap among all the gaps that the target robot can pass, or it may be any gap among all the gaps that the target robot has not yet passed through. This is not limited in this embodiment.
  • the robot can be located in room No. 1. After acquiring the point cloud data of the surrounding environment through its own lidar, it can cluster the acquired point cloud data to obtain a set of clusters, and select the largest gap among the gaps between any two adjacent clusters in the set. The selected gap can be the open door connecting room No. 1 and room No. 2.
  • Step S208 Control the target robot to move in the direction of the target gap.
  • after the target gap is determined, the target robot may move in the direction of the target gap. Further, after the target robot moves to the target gap, the point cloud data of the spatial environment where the target robot is currently located can be reacquired, a clustering operation can be performed on the reacquired point cloud data, and a gap can be reselected to control the movement of the target robot, thereby realizing a step-by-step exploration of the global space.
  • the robot determines that the direction of the target gap is the open door between room No. 1 and room No. 2. After that, the robot can calculate a movement trajectory to the target gap and move in the direction of the target gap according to the calculated trajectory. After the robot moves to the door, it can determine whether to start exploring room No. 2 according to actual needs; if so, it can obtain the point cloud data of room No. 2 through the lidar and re-execute the gap selection operation, so as to explore room No. 2.
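  • the rescan-and-move behaviour described in the two paragraphs above can be summarized as a short loop; scan, cluster, pick_gap and move_towards are hypothetical helpers standing in for steps S202-S208, so only the control flow is meant to be illustrative:

```python
def explore(robot, max_rounds=10):
    """Iterative exploration: rescan and reselect a gap after each move
    (scan, cluster, pick_gap and move_towards are hypothetical helpers)."""
    for _ in range(max_rounds):
        points = scan(robot)            # step S202: acquire the local point cloud
        clusters = cluster(points)      # step S204: cluster the point cloud
        gap = pick_gap(clusters)        # step S206: widest gap the robot can pass
        if gap is None:                 # nothing passable left: stop exploring
            break
        move_towards(robot, gap)        # step S208: move towards the selected gap
```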
  • through the above steps S202 to S208, the point cloud data of the spatial environment where the target robot is located is obtained to obtain the target point cloud data, wherein the target point cloud data contains multiple target points; a clustering operation is performed on the target point cloud data to obtain a set of clusters, wherein each cluster in the set contains at least one of the multiple target points; a target gap is selected from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and the target robot is controlled to move in the direction of the target gap. This solves the time-consuming movement control problem caused by using full-map information to search for gaps in robot movement control methods in the related art, reduces the time consumed by movement control, and improves the efficiency of space exploration.
  • the point cloud data of the spatial environment where the target robot is located is obtained, and the target point cloud data is obtained, including:
  • S11: in the process of controlling the target robot to rotate, the point cloud data of the spatial environment where the target robot is located is obtained through the laser sensor on the target robot, to obtain the target point cloud data.
  • the point cloud data of the spatial environment where the target robot is located can be obtained through a laser sensor installed on the target robot.
  • the laser sensor can be a laser radar.
  • the method of obtaining point cloud data can be: the target robot obtains the point cloud data of the spatial environment where it is located in situ.
  • the moving speed of the target robot can be zero, or it can move at a lower speed.
  • the above-mentioned movement can be forward and backward or left and right, or it can be rotation in place.
  • the protection device for the laser sensor on the target robot can be a pillar or a transparent baffle.
  • when the laser sensor is protected by a pillar, the target robot needs to be controlled to rotate.
  • when the laser sensor is protected by a baffle, all the point cloud data of the spatial environment where the target robot is located can be obtained without controlling the target robot to rotate.
  • for the scenario in which the laser sensor is protected by a pillar, in the process of controlling the target robot to rotate (for example, rotating in place for at least one full turn, i.e., at least 360 degrees), the point cloud data of the spatial environment can be obtained through the laser sensor on the target robot in a manner similar to the preceding embodiment, so as to obtain the target point cloud data displayed by the laser sensor on the display.
  • through this embodiment, in the process of rotating the robot, the point cloud data of the spatial environment where the robot is located is obtained through the laser sensor, which can ensure the completeness of the acquired point cloud data and thereby improve the accuracy of exploring the spatial environment where the robot is located.
  • a clustering operation is performed on the target point cloud data to obtain a set of clusters, including:
  • a set of reference rays at preset angles can be defined in advance, and different reference rays can correspond to different angles.
  • a reference ray can be a ray in a reference coordinate system that takes a preset point on the target robot as the coordinate origin and a preset direction as the coordinate axis direction.
  • the distance between each target point of the multiple target points and each reference ray in the set of reference rays can be determined first, where each reference ray corresponds to a reference angle and the set of reference rays corresponds to all preset angles in the reference coordinate system; then, according to the distance between each target point and each reference ray, the reference ray matching each target point is determined (the reference ray matching a target point can be the reference ray closest to that target point), whereby the target points matching each reference ray are determined, and the target points matching each reference ray are determined as one cluster, thereby obtaining the set of clusters.
  • for example, there can be 10 reference rays in total in the reference coordinate system, and each ray corresponds to an angle.
  • the point cloud data obtained by the robot's laser sensor is mapped onto the ray of each angle, and the points corresponding to the ray of each angle are determined, thereby determining multiple clusters.
  • if a ray has no matching points, this type of ray can be removed and not used in this clustering.
  • determining a distance between each of the plurality of target points and each of a set of reference rays includes:
  • when determining the distance between each target point and each reference ray, each target point can be projected onto each reference ray respectively, and the distance between each target point and the corresponding projection point (that is, the length of the line connecting the two) can be determined as the distance between that target point and that ray.
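  • a minimal sketch of this ray-based grouping follows, assuming the robot sits at the origin of the reference coordinate system, that the reference angles are evenly spaced, and that the number of rays is an illustrative choice:

```python
import numpy as np

def ray_clusters(points, num_rays=72):
    """Group 2-D laser points by their nearest reference ray.

    Each reference ray starts at the origin (the robot) with a preset angle.
    A point is projected onto every ray; the perpendicular distance to the
    projection decides which ray the point matches, and all points matched
    by the same ray form one cluster (rays with no matches are dropped)."""
    angles = np.linspace(0.0, 2 * np.pi, num_rays, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # unit ray directions
    # projection length of every point on every ray, shape (n_points, num_rays)
    proj = points @ dirs.T
    # distance from each point to its projection point on each ray
    foot = proj[:, :, None] * dirs[None, :, :]
    dist = np.linalg.norm(points[:, None, :] - foot, axis=2)
    dist[proj < 0] = np.inf            # ignore rays pointing away from the point
    nearest = dist.argmin(axis=1)
    return [points[nearest == r] for r in range(num_rays) if np.any(nearest == r)]
```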
  • a clustering operation is performed on the target point cloud data to obtain a set of clusters, including:
  • S41: Downsample the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each target point of the multiple target points;
  • S42: Perform a clustering operation on the multiple target points based on the two-dimensional points corresponding to each target point to obtain a set of clusters.
  • in general, a laser point cloud map stores the original point cloud scanned by the laser sensor from the environmental space. Its advantage is that the information is kept complete; its disadvantage is that it requires a large amount of computation and cannot be used directly for navigation or obstacle avoidance.
  • the core idea of laser point cloud rasterization is to process the area scanned by the lidar with a grid and downsample the point cloud data into a two-dimensional grid map, so that each two-dimensional grid cell represents a small area of space and contains a part of the point cloud.
  • point cloud rasterization is divided into two-dimensional rasterization and three-dimensional rasterization; two-dimensional rasterization is in fact a projection of the three-dimensional point cloud.
  • the point cloud of the environment where the robot is located can be downsampled into a two-dimensional grid map, and clustering is performed based on the information of the two-dimensional grid.
  • for the target point cloud data, the target point cloud data can be downsampled onto a two-dimensional grid map; for example, each target point of the multiple target points can be downsampled into a certain grid cell of the two-dimensional grid map, thereby obtaining the two-dimensional point corresponding to each target point, which can reduce the number of point cloud points that need to be processed.
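  • a minimal sketch of this downsampling step, assuming the input is an array of points with at least x/y coordinates and that the grid cell size is a free parameter:

```python
import numpy as np

def downsample_to_grid(points, cell_size=0.05):
    """Downsample a point cloud onto a 2-D grid map.

    Only the x/y coordinates are kept (3-D points are projected to the plane),
    every point is mapped to the index of the grid cell it falls in, and one
    representative 2-D point (the cell center) is kept per occupied cell."""
    xy = np.asarray(points, dtype=float)[:, :2]
    cell_idx = np.floor(xy / cell_size).astype(int)   # grid cell of each point
    occupied = np.unique(cell_idx, axis=0)            # one entry per occupied cell
    centers = (occupied + 0.5) * cell_size            # 2-D point representing each cell
    return cell_idx, centers
```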
  • after the two-dimensional point corresponding to each target point is obtained, a clustering operation can be performed on the two-dimensional points corresponding to the target points, to obtain a clustering result of the two-dimensional points corresponding to the multiple target points.
  • the clustering result can be a set of reference clusters, each reference cluster contains the two-dimensional points corresponding to at least some of the multiple target points, and the target points corresponding to the two-dimensional points contained in each reference cluster are determined as one cluster, thereby obtaining the above set of clusters.
  • through this embodiment, downsampling the point cloud data onto a two-dimensional grid map can reduce the amount of data to be processed by the clustering operation, thereby improving the efficiency of clustering the point cloud data and, in turn, the efficiency of the subsequent movement control of the robot.
  • a clustering operation is performed on multiple target points according to the two-dimensional points corresponding to each target point to obtain a set of clusters, including:
  • S51: Determine the grid in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a set of target grids, where each target grid in the set of target grids contains a two-dimensional point corresponding to at least one of the multiple target points;
  • S52 Perform a clustering operation on multiple target points based on the two-dimensional points contained in each target grid and the neighborhood information of each target grid to obtain a set of clusters.
  • the two-dimensional grid map can be divided into multiple grid cells, and each grid cell can have adjacent grid cells. Each of the multiple grid cells may contain the two-dimensional points corresponding to some target points, or may not contain the two-dimensional point corresponding to any target point.
  • a clustering operation can be performed on the multiple target points based on the two-dimensional points contained in each target grid and the neighborhood information of each target grid, to obtain the set of clusters.
  • the above neighborhood information can be: after the grid cell with the highest local density is determined from the two-dimensional points contained in each target grid, the number of two-dimensional points contained in its neighboring grid cells within a given neighborhood radius and their distances to that grid cell.
  • as an example, a neighborhood grid clustering algorithm can be used to cluster the multiple target points.
  • the original data (i.e., the target point cloud data) can first be mapped into a grid subspace (i.e., the two-dimensional grid map) to obtain a set of target grids; then, taking the target grid with the maximum local density as the starting point, the target grids within its neighborhood are searched and marked with a given neighborhood radius, and, based on the newly added target grids, the search continues to expand outwards for possible target grids until no new target grid is added, thereby determining the target grids belonging to the same cluster and, in turn, the target points belonging to the same cluster; the remaining target grid with the maximum local density is then selected in turn and the above process is repeated, until the set of clusters is finally determined.
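  • a minimal sketch of this neighborhood-grid expansion, assuming the per-point grid indices produced by the downsampling step and an integer neighborhood radius measured in cells:

```python
from collections import deque

def neighborhood_grid_clusters(cell_idx, radius=1):
    """Cluster grid cells (and the points they hold) by neighborhood expansion.

    cell_idx: (n, 2) integer grid coordinates, one row per original point.
    Starting from the unvisited cell with the highest point count, cells within
    `radius` of an already accepted cell join the same cluster until no new
    cell is added; then the densest remaining cell starts the next cluster."""
    counts, members = {}, {}
    for i, key in enumerate(map(tuple, cell_idx)):
        counts[key] = counts.get(key, 0) + 1
        members.setdefault(key, []).append(i)
    unvisited = set(counts)
    clusters = []
    while unvisited:
        seed = max(unvisited, key=counts.get)        # densest remaining cell
        queue, cluster_cells = deque([seed]), [seed]
        unvisited.discard(seed)
        while queue:
            cx, cy = queue.popleft()
            for dx in range(-radius, radius + 1):
                for dy in range(-radius, radius + 1):
                    nb = (cx + dx, cy + dy)
                    if nb in unvisited:
                        unvisited.discard(nb)
                        cluster_cells.append(nb)
                        queue.append(nb)
        clusters.append([i for c in cluster_cells for i in members[c]])
    return clusters   # each cluster is a list of indices into the original points
```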
  • through this embodiment, by downsampling the point cloud into grid cells and clustering through grid neighborhood information, the efficiency of point cloud clustering can be improved, thereby improving the efficiency of the subsequent movement control of the robot.
  • a target gap is selected from the gaps between adjacent clusters in the set of clusters, including:
  • S61: Determine the distance between each pair of adjacent clusters as the size of the gap between those adjacent clusters;
  • S62: Select the gap with the largest size from the gaps between adjacent clusters to obtain the target gap.
  • in this embodiment, after the set of clusters is obtained, the distance between each pair of adjacent clusters (i.e., the inter-class distance) can be determined as the size of the gap between those adjacent clusters. After the size of the gap between each pair of adjacent clusters is determined, the largest gap (i.e., the gap between the adjacent clusters with the largest inter-class distance) can be selected from the gaps between adjacent clusters to obtain the target gap.
  • the Euclidean distance can be used to calculate the inter-class distance between adjacent clusters; here, the Euclidean distance can be the distance between the last point of one cluster and the first point of the other cluster of the adjacent pair (it can be calculated based on angle information).
  • cluster 1 and cluster 2 are two adjacent clusters.
  • the distance between the last point of cluster 1 and the first point of cluster 2 can be regarded as the distance between cluster 1 and cluster 2.
  • if there are several clusters, the distances between adjacent clusters are calculated in turn to obtain a set of inter-class distances (i.e., gap sizes); by sorting the set of inter-class distances, the maximum inter-class distance, i.e., the gap with the largest size, can be determined, and the target gap is thus obtained.
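  • a minimal sketch of this gap selection, assuming the robot is at the origin and each cluster is an array of 2-D points; the "first" and "last" points of a cluster are taken in angular order, as described above:

```python
import numpy as np

def largest_gap(clusters):
    """Pick the widest gap between angularly adjacent clusters.

    Each cluster is an (m, 2) array of points with the robot at the origin.
    Points inside each cluster, and the clusters themselves, are ordered by
    angle; the gap between two adjacent clusters is the Euclidean distance
    between the last point of one and the first point of the next, and the
    largest such gap is returned as a pair of boundary points (or None)."""
    if len(clusters) < 2:
        return None
    ordered = []
    for c in clusters:
        ang = np.arctan2(c[:, 1], c[:, 0])
        ordered.append(c[np.argsort(ang)])
    ordered.sort(key=lambda c: np.arctan2(c[0, 1], c[0, 0]))
    best, best_size = None, -1.0
    for i in range(len(ordered)):
        a = ordered[i][-1]                       # last point of this cluster
        b = ordered[(i + 1) % len(ordered)][0]   # first point of the next (wraps around)
        size = float(np.linalg.norm(a - b))
        if size > best_size:
            best, best_size = (a, b), size
    return best
```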
  • the size of the gap between adjacent clusters is determined based on the distance between adjacent clusters, and the gap with the largest size is selected as the gap to be passed, which can improve the feasibility of robot movement control.
  • the robot is an LDS robot (Laser Direct Structuring, laser direct structuring technology).
  • This optional example provides a solution for a robot to quickly and briefly explore an indoor environment through local point clouds.
  • the process of the robot's movement control method in this optional example may include the following steps:
  • Step S302 Obtain the surrounding environment point cloud.
  • the LDS robot obtains the environmental point cloud (i.e., point cloud data) of the surrounding environment through lidar in situ (LDS does not need to rotate if there are no pillars, but needs to rotate if there are pillars).
  • Step S304 Cluster the point cloud.
  • the LDS robot can cluster the obtained environment point cloud.
  • the clustering method used can be to project the environment point cloud onto rays at each angle and use the angle information to assist clustering, or to first downsample the environment point cloud into grid cells and then perform clustering based on grid neighborhood information.
  • Step S306 Calculate the inter-class distance between adjacent clusters after clustering.
  • the LDS robot can calculate the inter-class distance between adjacent clusters after clustering, and identify the gap through the inter-class distance, that is, the gap between two adjacent clusters with the largest inter-class distance is determined as the optimal gap (i.e., target gap).
  • Step S308 Control the robot to explore in the optimal gap direction.
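  • the four steps S302 to S308 can be strung together as follows; ray_clusters and largest_gap refer to the component sketches given earlier, and get_scan/send_heading are hypothetical robot interfaces for acquiring a scan and commanding a heading, so the whole function is a sketch of the flow rather than the disclosed implementation:

```python
import numpy as np

def explore_once(get_scan, send_heading, num_rays=72):
    """One pass of steps S302-S308: scan, cluster by angle, find the widest
    gap and steer towards the midpoint of that gap (sketch only)."""
    points = get_scan()                        # S302: (n, 2) points, robot at origin
    clusters = ray_clusters(points, num_rays)  # S304: angle-assisted clustering (sketch above)
    gap = largest_gap(clusters)                # S306: widest inter-class gap (sketch above)
    if gap is not None:
        left, right = gap                      # boundary points on either side of the gap
        mid = 0.5 * (left + right)             # aim at the midpoint of the gap
        send_heading(float(np.arctan2(mid[1], mid[0])))   # S308: move in that direction
```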
  • the methods according to the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • based on this understanding, the technical solution of the present disclosure, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium (such as a ROM (Read-Only Memory)/RAM (Random Access Memory), a magnetic disk or an optical disk) and includes a number of instructions to make a terminal device (which can be a mobile phone, a computer, a server, a network device, etc.) execute the methods described in each embodiment of the present disclosure.
  • FIG. 4 is a structural block diagram of an optional robot movement control device according to an embodiment of the present disclosure. As shown in Figure 4, the device may include:
  • the acquisition unit 402 is used to acquire point cloud data of the spatial environment where the target robot is located, and obtain target point cloud data, where the target point cloud data contains multiple target points;
  • the clustering unit 404 is connected to the acquisition unit 402 and is used to perform a clustering operation on the target point cloud data to obtain a set of clusters, wherein each cluster in the set of clusters contains at least one target point among the multiple target points;
  • the selection unit 406 is connected to the clustering unit 404 and is used to select a target gap from the gap between each adjacent cluster in a set of clusters, where the target gap is a gap that allows the target robot to pass;
  • the control unit 408 is connected to the selection unit 406 and is used to control the target robot to move in the direction of the target gap.
  • it should be noted that the acquisition unit 402 in this embodiment can be used to perform the above step S202, the clustering unit 404 in this embodiment can be used to perform the above step S204, the selection unit 406 in this embodiment can be used to perform the above step S206, and the control unit 408 in this embodiment can be used to perform the above step S208.
  • through the above modules, the point cloud data of the spatial environment where the target robot is located is obtained to obtain the target point cloud data, wherein the target point cloud data contains multiple target points; a clustering operation is performed on the target point cloud data to obtain a set of clusters, wherein each cluster in the set contains at least one of the multiple target points; a target gap is selected from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and the target robot is controlled to move in the direction of the target gap. This solves the time-consuming movement control problem caused by using full-map information to search for gaps in robot movement control methods in the related art, reduces the time consumed by movement control, and improves the efficiency of space exploration.
  • the acquisition unit includes:
  • the acquisition module is used to obtain the point cloud data of the spatial environment where the target robot is located through the laser sensor on the target robot during the process of controlling the target robot to rotate, and obtain the target point cloud data.
  • the clustering unit includes:
  • a first determination module configured to determine the distance between each target point in the plurality of target points and each reference ray in a set of reference rays, where each reference ray corresponds to a reference angle respectively;
  • the second determination module is used to determine the target point matching each reference ray based on the distance between each target point and each reference ray;
  • the third determination module is used to determine the target points matching each reference ray as a cluster to obtain a set of clusters.
  • the first determining module includes:
  • the projection submodule is used to project each target point onto each reference ray separately to obtain the distance between each target point and each reference ray.
  • the clustering unit includes:
  • the downsampling module is used to downsample the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each target point in multiple target points;
  • the clustering module is used to perform clustering operations on multiple target points based on the two-dimensional points corresponding to each target point to obtain a set of clusters.
  • the clustering module includes:
  • the determination submodule is used to determine the grid to which the two-dimensional point corresponding to each target point in the two-dimensional grid map belongs, and obtain a set of target rasters, where each target raster in the set of target rasters contains A two-dimensional point corresponding to at least one target point of the plurality of target points;
  • the execution submodule is used to perform clustering operations on multiple target points based on the two-dimensional points contained in each target raster and the neighborhood information of each target raster to obtain a set of clusters.
  • the selection unit includes:
  • the fourth determination module is used to determine the distance between each adjacent cluster as the size of the gap between each adjacent cluster
  • the selection module is used to select the gap with the largest size from the gaps between each adjacent cluster to obtain the target gap.
  • the above module as part of the device, can run in the hardware environment as shown in Figure 1, and can be implemented by software or hardware, where the hardware environment includes a network environment.
  • a storage medium is also provided.
  • the above-mentioned storage medium can be used to execute the program code of any one of the above-mentioned robot movement control methods in the embodiment of the present disclosure.
  • the above storage medium may be located on at least one network device among multiple network devices in the network shown in the above embodiment.
  • the storage medium is configured to store program codes for performing the following steps:
  • S1: Obtain the point cloud data of the spatial environment where the target robot is located, to obtain the target point cloud data, where the target point cloud data contains multiple target points;
  • S2: Perform a clustering operation on the target point cloud data to obtain a set of clusters, where each cluster in the set of clusters contains at least one target point among the multiple target points;
  • S3: Select a target gap from the gaps between adjacent clusters in the set of clusters, where the target gap is a gap that allows the target robot to pass;
  • S4: Control the target robot to move in the direction of the target gap.
  • the above storage medium may include but is not limited to: a USB flash drive, a ROM, a RAM, a removable hard disk, a magnetic disk, an optical disk, and other media that can store program code.
  • an electronic device for implementing the above-mentioned movement control method of a robot.
  • the electronic device may be a server, a terminal, or a combination thereof.
  • Figure 5 is a structural block diagram of an optional electronic device according to an embodiment of the present disclosure. As shown in Figure 5, it includes a processor 502, a communication interface 504, a memory 506 and a communication bus 508. The processor 502, the communication interface 504 and memory 506 complete communication with each other through communication bus 508, where,
  • the memory 506 is used to store a computer program;
  • the processor 502 is used to implement the following steps when executing the computer program stored on the memory 506:
  • S1: Obtain the point cloud data of the spatial environment where the target robot is located, to obtain the target point cloud data, where the target point cloud data contains multiple target points;
  • S2: Perform a clustering operation on the target point cloud data to obtain a set of clusters, where each cluster in the set of clusters contains at least one target point among the multiple target points;
  • S3: Select a target gap from the gaps between adjacent clusters in the set of clusters, where the target gap is a gap that allows the target robot to pass;
  • S4: Control the target robot to move in the direction of the target gap.
  • the communication bus may be a PCI (Peripheral Component Interconnect, Peripheral Component Interconnect Standard) bus, or an EISA (Extended Industry Standard Architecture, Extended Industry Standard Architecture) bus, etc.
  • the communication bus can be divided into address bus, data bus, control bus, etc. For ease of presentation, only one thick line is used in Figure 5, but it does not mean that there is only one bus or one type of bus.
  • the communication interface is used for communication between the above-mentioned electronic device and other equipment.
  • the above-mentioned memory may include a RAM or a non-volatile memory, for example, at least one disk memory.
  • the memory may also be at least one storage device located remotely from the aforementioned processor.
  • the above-mentioned memory 506 may include, but is not limited to, the acquisition unit 402, the clustering unit 404, the selection unit 406 and the control unit 408 in the above-mentioned movement control device of the robot.
  • it may also include, but is not limited to, other modular units in the above-mentioned movement control device, which will not be described again in this example.
  • the above-mentioned processor can be a general-purpose processor, which can include but is not limited to a CPU (Central Processing Unit), an NP (Network Processor), etc.; it can also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • the device for implementing the above-mentioned robot movement control method can be a terminal device, and the terminal device can be a smart phone (such as an Android phone or an iOS phone), a tablet computer, a handheld computer, a Mobile Internet Device (MID), a PAD or another terminal device.
  • FIG. 5 does not limit the structure of the above-mentioned electronic device.
  • the electronic device may also include more or fewer components (such as network interfaces, display devices, etc.) than shown in FIG. 5 , or have a different configuration than that shown in FIG. 5 .
  • the program can be stored in a computer-readable storage medium, and the storage medium can include: a flash disk, a ROM, a RAM, a magnetic disk, an optical disk, etc.
  • if the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they can be stored in the above computer-readable storage medium.
  • based on such understanding, the technical solution of the present disclosure, in essence or in the part contributing to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes a number of instructions to cause one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method described in each embodiment of the present disclosure.
  • the disclosed client can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the units or modules may be in electrical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution provided in this embodiment.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above integrated units can be implemented in the form of hardware or software functional units.

Abstract

A movement control method and apparatus for a robot, a storage medium and an electronic apparatus. The method includes: obtaining point cloud data of the spatial environment where a target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points (S202); performing a clustering operation on the target point cloud data to obtain a set of clusters, wherein each cluster in the set of clusters contains at least one of the multiple target points (S204); selecting a target gap from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass (S206); and controlling the target robot to move in the direction of the target gap (S208). The method solves the problem in the related art that robot movement control is time-consuming because full-map information is used to search for gaps.

Description

Robot movement control method and apparatus, storage medium and electronic apparatus
The present disclosure claims priority to the Chinese patent application filed with the China Patent Office on April 8, 2022, with application number 202210366407.0 and invention title "Robot movement control method and apparatus, storage medium and electronic apparatus"; the entire content of the above patent application is incorporated into the present disclosure by reference.
Technical Field
The present disclosure relates to the field of robots, and specifically to a movement control method and apparatus for a robot, a storage medium and an electronic apparatus.
Background Art
At present, when a robot is used to autonomously explore an unknown environment, environment data can first be acquired, a two-dimensional grid map can be generated from the environment data, gaps can be searched for through the boundary information of the constructed two-dimensional grid map, a movement path can be planned according to the found gaps, and the environment can then be explored.
However, for the above manner of controlling the movement of the robot by searching for gaps based on the boundary information of the two-dimensional grid map, since full-map information is used to search for gaps, the time consumed by exploration grows correspondingly as the two-dimensional grid map becomes larger.
It can thus be seen that the robot movement control method in the related art suffers from time-consuming movement control because full-map information is used to search for gaps.
Summary of the Invention
The purpose of the present disclosure is to provide a movement control method and apparatus for a robot, a storage medium and an electronic apparatus, so as to at least solve the problem in the related art that robot movement control is time-consuming because full-map information is used to search for gaps.
The purpose of the present disclosure is achieved through the following technical solutions:
According to one aspect of the embodiments of the present disclosure, a movement control method for a robot is provided, including: obtaining point cloud data of the spatial environment where a target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points; performing a clustering operation on the target point cloud data to obtain a set of clusters, wherein each cluster in the set of clusters contains at least one of the multiple target points; selecting a target gap from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and controlling the target robot to move in the direction of the target gap.
In an exemplary embodiment, obtaining the point cloud data of the spatial environment where the target robot is located to obtain the target point cloud data includes: in the process of controlling the target robot to rotate, obtaining the point cloud data of the spatial environment where the target robot is located through a laser sensor on the target robot, to obtain the target point cloud data.
In an exemplary embodiment, performing a clustering operation on the target point cloud data to obtain a set of clusters includes: determining the distance between each target point of the multiple target points and each reference ray in a set of reference rays, wherein each reference ray corresponds to a reference angle; determining, according to the distance between each target point and each reference ray, the target points matched by each reference ray; and determining the target points matched by each reference ray as one cluster respectively, to obtain the set of clusters.
In an exemplary embodiment, determining the distance between each target point of the multiple target points and each reference ray in the set of reference rays includes: projecting each target point onto each reference ray respectively, to obtain the distance between each target point and each reference ray.
In an exemplary embodiment, performing a clustering operation on the target point cloud data to obtain a set of clusters includes: downsampling the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each target point of the multiple target points; and performing a clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point, to obtain the set of clusters.
In an exemplary embodiment, performing a clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point to obtain the set of clusters includes: determining the grid in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a set of target grids, wherein each target grid in the set of target grids contains a two-dimensional point corresponding to at least one of the multiple target points; and performing a clustering operation on the multiple target points according to the two-dimensional points contained in each target grid and the neighborhood information of each target grid, to obtain the set of clusters.
In an exemplary embodiment, selecting a target gap from the gaps between adjacent clusters in the set of clusters includes: determining the distance between each pair of adjacent clusters as the size of the gap between those adjacent clusters; and selecting, from the gaps between adjacent clusters, the gap with the largest size to obtain the target gap.
According to another aspect of the embodiments of the present disclosure, a movement control apparatus for a robot is also provided, including: an acquisition unit, configured to acquire point cloud data of the spatial environment where a target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points; a clustering unit, configured to perform a clustering operation on the target point cloud data to obtain a set of clusters, wherein each cluster in the set of clusters contains at least one of the multiple target points; a selection unit, configured to select a target gap from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and a control unit, configured to control the target robot to move in the direction of the target gap.
In an exemplary embodiment, the acquisition unit includes: an acquisition module, configured to obtain, in the process of controlling the target robot to rotate, the point cloud data of the spatial environment where the target robot is located through a laser sensor on the target robot, to obtain the target point cloud data.
In an exemplary embodiment, the clustering unit includes: a first determination module, configured to determine the distance between each target point of the multiple target points and each reference ray in a set of reference rays, wherein each reference ray corresponds to a reference angle; a second determination module, configured to determine, according to the distance between each target point and each reference ray, the target points matched by each reference ray; and a third determination module, configured to determine the target points matched by each reference ray as one cluster respectively, to obtain the set of clusters.
In an exemplary embodiment, the first determination module includes: a projection submodule, configured to project each target point onto each reference ray respectively, to obtain the distance between each target point and each reference ray.
In an exemplary embodiment, the clustering unit includes: a downsampling module, configured to downsample the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each target point of the multiple target points; and a clustering module, configured to perform a clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point, to obtain the set of clusters.
In an exemplary embodiment, the clustering module includes: a determination submodule, configured to determine the grid in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a set of target grids, wherein each target grid in the set of target grids contains a two-dimensional point corresponding to at least one of the multiple target points; and an execution submodule, configured to perform a clustering operation on the multiple target points according to the two-dimensional points contained in each target grid and the neighborhood information of each target grid, to obtain the set of clusters.
In an exemplary embodiment, the selection unit includes: a fourth determination module, configured to determine the distance between each pair of adjacent clusters as the size of the gap between those adjacent clusters; and a selection module, configured to select, from the gaps between adjacent clusters, the gap with the largest size to obtain the target gap.
According to yet another aspect of the embodiments of the present disclosure, a computer-readable storage medium is also provided, in which a computer program is stored, wherein the computer program is configured to execute the above robot movement control method when run.
According to yet another aspect of the embodiments of the present disclosure, an electronic apparatus is also provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the above robot movement control method through the computer program.
In the embodiments of the present disclosure, a manner of clustering the point cloud data of the spatial environment where the robot is located and selecting, from the gaps between adjacent clusters, a gap that allows the robot to pass is adopted: the point cloud data of the spatial environment where the target robot is located is obtained to obtain the target point cloud data, wherein the target point cloud data contains multiple target points; a clustering operation is performed on the target point cloud data to obtain a set of clusters, wherein each cluster in the set contains at least one of the multiple target points; a target gap is selected from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and the target robot is controlled to move in the direction of the target gap. Since clustering is performed on the point cloud data of the spatial environment where the robot is located and the gap that allows the robot to pass is selected from the gaps between adjacent clusters, instead of searching for gaps with full-map information, the search range of the robot can be reduced, achieving the technical effect of reducing the time consumed by movement control and improving the efficiency of space exploration, thereby solving the problem in the related art that robot movement control is time-consuming because full-map information is used to search for gaps.
Brief Description of the Drawings
The accompanying drawings are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure, and are used together with the specification to explain the principles of the present disclosure.
In order to explain the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below; obviously, for those of ordinary skill in the art, other drawings can also be obtained from these drawings without creative effort.
Figure 1 is a schematic diagram of the hardware environment of an optional robot movement control method according to an embodiment of the present disclosure;
Figure 2 is a schematic flowchart of an optional robot movement control method according to an embodiment of the present disclosure;
Figure 3 is a schematic flowchart of another optional robot movement control method according to an embodiment of the present disclosure;
Figure 4 is a structural block diagram of an optional robot movement control apparatus according to an embodiment of the present disclosure;
Figure 5 is a structural block diagram of an optional electronic apparatus according to an embodiment of the present disclosure.
Detailed Description
The present disclosure will be described in detail below with reference to the accompanying drawings and in conjunction with the embodiments. It should be noted that the embodiments in the present disclosure and the features in the embodiments can be combined with each other as long as they do not conflict.
It should be noted that the terms "first", "second" and the like in the specification, claims and above drawings of the present disclosure are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence.
According to one aspect of the embodiments of the present disclosure, a movement control method for a robot is provided. Optionally, in this embodiment, the above robot movement control method can be applied to the hardware environment composed of the robot 102 and the server 104 as shown in Figure 1. As shown in Figure 1, the robot 102 can be connected to the server 104 (for example, an Internet of Things platform or a cloud server) through a network, so as to control the robot 102.
The above network may include but is not limited to at least one of the following: a wired network, a wireless network. The above wired network may include but is not limited to at least one of the following: a wide area network, a metropolitan area network, a local area network; the above wireless network may include but is not limited to at least one of the following: WIFI (Wireless Fidelity), Bluetooth, infrared. The robot 102 may include but is not limited to a cleaning robot, for example, a floor sweeping robot, a floor washing robot, an automatic mop washing robot, a self-cleaning robot, etc.; the server 104 may be a server of an Internet of Things platform.
The movement control method of the robot in the embodiments of the present disclosure can be executed by the robot 102 or the server 104 individually, or can be executed jointly by the robot 102 and the server 104. The movement control method of the robot in the embodiments of the present disclosure executed by the robot 102 may also be executed by a client installed on it.
Taking the robot 102 executing the movement control method of the robot in this embodiment as an example, Figure 2 is a schematic flowchart of an optional robot movement control method according to an embodiment of the present disclosure. As shown in Figure 2, the process of the method can include the following steps:
Step S202: Obtain point cloud data of the spatial environment where the target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points.
The movement control method of the robot in this embodiment can be applied to a scenario in which a brief exploration of an unknown environment is realized through movement control of the robot. The above robot can be a cleaning robot (that is, a robot with a cleaning function, which can be a floor sweeping robot or a floor washing robot), and its corresponding unknown environment can be an area to be cleaned; it can be a flying robot, whose corresponding unknown environment can be an area to be detected; or it can be another type of robot, which is not limited here. The above brief exploration can be a mapping operation performed by the robot on the spatial environment where it is located, so as to avoid obstacles and the like.
Optionally, the robot is provided with a lidar, which can be a radar system that detects characteristics such as the position and speed of a target by emitting laser beams. Its working principle is: emit a detection signal (laser beam) toward the target and receive the reflected signal (target echo) reflected back from the target; compare the received reflected signal with the transmitted signal, and after appropriate processing, relevant information about the target can be obtained, such as target distance, orientation, height, speed, attitude and even shape parameters, so as to detect, track and identify the target. The lidar can include a laser transmitter, an optical receiver, a turntable and an information processing system. The laser transmitter converts electrical pulses into light pulses and emits them; the optical receiver restores the light pulses reflected back from the target into electrical pulses. By processing the restored electrical pulses, the point cloud data of the target can be obtained, and the obtained point cloud data can be sent to a display for display (it can be displayed in the form of a point cloud).
In this embodiment, the target robot can emit laser beams toward the spatial environment where it is located, and the optical receiver on the target robot can restore the light pulses reflected back from objects into electrical pulses. By parsing the acquired electrical pulses, the point cloud data of the spatial environment where the robot is located, i.e., the target point cloud data, is obtained, and the above target point cloud data can contain multiple target points.
Step S204: Perform a clustering operation on the target point cloud data to obtain a set of clusters, wherein each cluster in the set of clusters contains at least one of the multiple target points.
In this embodiment, a clustering operation can be performed on the above target point cloud data to obtain a set of clusters, where each cluster in the set of clusters can contain at least one of the multiple target points. Optionally, the obtained set of clusters can contain at least two clusters, and each cluster can correspond to at least one obstacle in the spatial environment where the target robot is located.
The clustering operation on the target point cloud data can be: clustering the multiple target points based on the distances between different target points in the target point cloud data. One or more clustering methods can be used, which can include but are not limited to: clustering methods that specify the number of clusters (for example, K-means clustering) and clustering methods that do not specify the number of clusters (for example, hierarchical clustering).
For example, K-means clustering can select the number of clusters for the scattered point set of the target point cloud data, randomly initialize the center points, and iteratively reduce the distance between points within a cluster while increasing the distance between points of different clusters, thus forming a set of clusters.
As another example, hierarchical clustering can first treat each target point as a separate cluster; if the target point cloud data contains X target points, X clusters are obtained. Then, in each iteration, the two clusters that meet the merging condition are merged into one according to the distance between them. When the iteration termination condition is met, the resulting clusters form the set of clusters.
Optionally, the above clustering operation can be implemented based on a grid: the two-dimensional space can first be divided into grid cells, the target point cloud data can then be mapped into the grid, the point-set density in each grid cell can be calculated, the grid cells can be classified according to a preset threshold, and clusters can be formed with adjacent groups of grid cells.
Step S206: Select a target gap from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass.
In this embodiment, there are gaps between adjacent clusters in the set of clusters. The gap here may correspond to the distance between obstacles, and it may be the opening direction of an open space in the spatial environment, for example, the spacing between two obstacles, the open door of a room, and the like. To facilitate environment exploration, a gap that allows the robot to pass can be selected from the gaps between adjacent clusters in the set of clusters to obtain the target gap; the selected target gap is the gap that the target robot is to pass through. The above target gap may be the largest gap among all the gaps that the target robot can pass, it may be any gap among all the gaps that the target robot can pass, or it may be any gap among all the gaps that the target robot has not yet passed through, which is not limited in this embodiment.
For example, the robot can be located in room No. 1. After acquiring the point cloud data of the surrounding environment through its own lidar, it can cluster the acquired point cloud data to obtain a set of clusters, and select the largest gap among the gaps between any two adjacent clusters in the set. The selected gap can be the open door connecting room No. 1 and room No. 2.
Step S208: Control the target robot to move in the direction of the target gap.
In this embodiment, after the target gap is determined, the target robot can move in the direction of the target gap. Further, after the target robot moves to the target gap, the point cloud data of the spatial environment where the target robot is currently located can be reacquired, a clustering operation can be performed on the reacquired point cloud data, and a gap can be reselected to control the movement of the target robot, thereby realizing a step-by-step exploration of the global space.
For example, the robot is located in room No. 1. After determining that the direction of the target gap is the open door between room No. 1 and room No. 2, the robot can calculate a movement trajectory to the target gap and move in the direction of the target gap according to the calculated trajectory. After the robot moves to the door, it can determine whether to start exploring room No. 2 according to actual needs; if so, it can obtain the point cloud data of room No. 2 through the lidar and re-execute the gap selection operation, so as to explore room No. 2.
Through the above steps S202 to S208, the point cloud data of the spatial environment where the target robot is located is obtained to obtain the target point cloud data, wherein the target point cloud data contains multiple target points; a clustering operation is performed on the target point cloud data to obtain a set of clusters, wherein each cluster in the set contains at least one of the multiple target points; a target gap is selected from the gaps between adjacent clusters in the set of clusters, wherein the target gap is a gap that allows the target robot to pass; and the target robot is controlled to move in the direction of the target gap. This solves the time-consuming movement control problem caused by using full-map information to search for gaps in robot movement control methods in the related art, reduces the time consumed by movement control, and improves the efficiency of space exploration.
在一个示例性实施例中,获取目标机器人所处的空间环境的点云数据,得到目标点云数据,包括:
S11,在控制目标机器人进行旋转的过程中,通过目标机器人上的激光传感器获取目标机器人所处的空间环境的点云数据,得到目标点云数据。
在本实施例中,可以通过目标机器人上设置的激光传感器获取到目标机器人所处的空间环境的点云数据,上述激光传感器可以是激光雷达。获取点云数据的方式可以是:目标机器人在原地获得所处的空间环境的点云数据。在此情况下,目标机器人的移动速度可以为零,或者,以一个较低的速度进行移动,上述移动可以是前后方向或者左右方向的移动,也可以是原地转动。
目标机器人中对于激光传感器的保护装置可以是柱子,也可以是透明的挡板,在激光传感器采用柱子进行保护时,需要控制目标机器人旋转,在激光传感器采用挡板进行保护时,则不需要控制目标机器人旋转,便可以获取到目标机器人所在空间环境的所有点云数据。
可选地,对于激光传感器的保护装置为柱子的场景,在控制目标机器人进行旋转(例如,原地旋转至少一周,即,至少旋转360度)的过程中,可以按照与前述实施例中类似的方式,通过目标机器人上的激光传感器获取到所处的空间环境的点云数据,进而得到激光传感器在显示器上显示的目标点云数据。
通过本实施例,在机器人进行旋转的过程中,通过激光传感器获取机器人所处的空间环境的点云数据,可以保证点云数据获取的完整性,鸡儿提高对机器人所处空间环境进行探索的准确性。
在一个示例性实施例中,对目标点云数据执行聚类操作,得到一组类簇,包括:
S21,确定多个目标点中的每个目标点与一组参考射线中的每个参考射线之间的距离,其中,每个参考射线分别对应于一个参考角度;
S22,根据每个目标点与每个参考射线之间的距离,确定与每个参考射线匹配的目标点;
S23,将与每个参考射线匹配的目标点分别确定为一个类簇,得到一组类簇。
在本实施例中,可以预先设定一定角度的一组参考射线,不同参考射线可以对应于不同角度,这里,参考射线可以是在以目标机器人上的预设点为坐标原点、预设方向为坐标轴方向的参考坐标系中的射线。在对目标点云数据执行聚类操作时,可以将机器人所处环境的点云对应到每个角度的参考射线上,以角度信息辅助聚类。
可选地,可以先确定多个目标点中的每个目标点与一组参考射线中的每个参考射线之间的距离,这里每个参考射线分别对应于一个参考角度,一组参考射线对应于参考坐标系中的所有预设角度;再根据每个目标点与每个参考射线之间的距离,确定每个目标点匹配的参考射线(与每个目标点匹配的参考射线可以是与每个目标点距离最近的参考射线),进而确定出与每个参考射线匹配的目标点,并将与每个参考射线匹配的目标点分别确定为一个类簇,从而得到一组类簇。
例如,参考坐标系中共计有10个射线(即,10条参考射线),每个射线对应的一个角度,将机器人的激光传感器获取到的点云数据对应到每个角度的射线上,确定每个角度的射线所对应的点,从而确定出多个类簇。这里,如果一个射线没有匹配的点,可以将这类射线移除,不在本次聚类中使用。
通过本实施例,通过将周围点云对应到每个角度的射线上,以角度信息辅助聚类,可以提高对点云数据进行聚类的准确性。
在一个示例性实施例中,确定多个目标点中的每个目标点与一组参考射线中的每个参考射线之间的距离,包括:
S31,将每个目标点分别投射到每个参考射线上,得到每个目标点与每个参考射线之间的距离。
在本实施例中,在确定每个目标点与每个参考射线之间的距离时,可以首先将每个目标点分别投射到每个参考射线上,将每个目标点与对应投射点之间的距离(即,两者之间连线的长度),确定为每个目标点与每个射线之间的距离。
通过本实施例,通过将周围点云对应到每个角度的射线上,基于投射先确定点云数据中的各个点到各个射线的距离,可以提升对点云数据进行聚类的准确性。
在一个示例性实施例中,对目标点云数据执行聚类操作,得到一组类簇,包括:
S41,将目标点云数据降采样到二维栅格地图上,得到与多个目标点中的每个目标点对 应的二维点;
S42,根据每个目标点对应的二维点对多个目标点执行聚类操作,得到一组类簇。
一般而言,激光点云地图存储的是激光传感器对环境空间的原始扫描点云,其优点是保留信息完整,缺点是计算量大、不能直接用于导航避障等。激光点云栅格化的核心思想是:将激光雷达所扫描到的区域用网格进行处理,通过将点云数据降采样到二维栅格地图中,使得每个二维栅格点云代表空间的一小块区域,内含一部分点云,点云栅格化处理分为二维栅格化和三维栅格化,二维其实就是将三维点云进行一个投影。
在本实施例中,在对点云数据执行聚类操作时,可以将机器人所处环境的点云降采样到二维栅格地图中,基于二维栅格的信息进行聚类。对于目标点云数据,可以将目标点云数据降采样到二维栅格地图上,比如,将多个目标点中的每个目标点分别降采样到二维栅格地图中的某一栅格中,从而得到每个目标点对应的二维点,可以降低所需处理的点云数量。
在得到每个目标点对应的二维点,可以对每个目标点对应的二维点执行聚类操作,得到多个目标点对应的二维点的聚类结果,聚类结果可以是一组参考类簇,每个参考类簇包含多个目标点中的至少部分目标点对应的二维点,将每个参考类簇包含的二维点所对应的目标点确定为一个类簇,从而得到上述一组类簇。
通过本实施例,将点云数据降采样到二维栅格地图上,可以减少聚类操作所需处理的数据量,从而提高了对点云数据进行聚类的效率,进而提升了后续对机器人进行移动控制的效率。
In an exemplary embodiment, performing the clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point to obtain the group of clusters includes:
S51: determining the grid cell in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a group of target grid cells, where each target grid cell in the group contains a two-dimensional point corresponding to at least one of the multiple target points;
S52: performing the clustering operation on the multiple target points according to the two-dimensional points contained in each target grid cell and the neighborhood information of each target grid cell, to obtain the group of clusters.
The two-dimensional grid map may be divided into multiple grid cells, each of which may have adjacent cells; each cell may contain two-dimensional points corresponding to some of the target points, or may contain none. In this embodiment, when the clustering operation is performed on the multiple target points according to the two-dimensional point corresponding to each target point, the grid cell in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs may first be determined, so as to obtain a group of target grid cells, where each target grid cell in the group contains a two-dimensional point corresponding to at least one of the multiple target points.
After the group of target grid cells is obtained, the clustering operation may be performed on the multiple target points according to the two-dimensional points contained in each target grid cell and the neighborhood information of each target grid cell, to obtain the group of clusters. The neighborhood information may be: after the grid cell with the largest local density is determined from the two-dimensional points contained in each target grid cell, the number of two-dimensional points contained in its neighboring cells within a given neighborhood radius and their distances to that grid cell.
Illustratively, a neighborhood-grid clustering algorithm may be used to cluster the multiple target points. The raw data (i.e., the target point cloud data) may first be mapped into the grid subspace (i.e., the two-dimensional grid map) to obtain the group of target grid cells. Taking the target grid cell with the largest local density as the starting point, the target grid cells within its neighborhood of a given radius are searched and marked; based on the newly added target grid cells, the search continues to expand outward for further candidate target grid cells until no new target grid cell is added, so that the target grid cells belonging to the same cluster are determined and, in turn, the target points belonging to the same cluster. The remaining target grid cell with the largest local density is then selected and the above process is repeated, until the group of clusters is finally determined.
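The neighborhood-grid clustering described above might be sketched as follows, assuming cell_points is the per-cell point mapping produced by a down-sampling step like the one sketched earlier (the first value returned by downsample_to_grid) and using the per-cell point count as a stand-in for local density.

    from collections import deque

    def cluster_grid_cells(cell_points, radius=1):
        """Seed each cluster at the unvisited cell with the most points, expand
        through occupied cells within the neighbourhood radius until no new cell
        joins, then repeat with the next densest seed. cell_points maps (ix, iy)
        cell indices to the points they hold.
        """
        unvisited = set(cell_points)
        clusters = []
        while unvisited:
            seed = max(unvisited, key=lambda c: len(cell_points[c]))
            unvisited.discard(seed)
            cluster, queue = [], deque([seed])
            while queue:
                cx, cy = queue.popleft()
                cluster.extend(cell_points[(cx, cy)])
                for dx in range(-radius, radius + 1):
                    for dy in range(-radius, radius + 1):
                        nb = (cx + dx, cy + dy)
                        if nb in unvisited:  # occupied neighbour not yet claimed
                            unvisited.discard(nb)
                            queue.append(nb)
            clusters.append(cluster)
        return clusters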
Through this embodiment, by down-sampling the point cloud into grid cells and clustering with the grid neighborhood information, the efficiency of point cloud clustering can be improved, which in turn improves the efficiency of the subsequent movement control of the robot.
In an exemplary embodiment, selecting the target gap from the gaps between adjacent clusters in the group of clusters includes:
S61: determining the distance between each pair of adjacent clusters as the size of the gap between those adjacent clusters;
S62: selecting the gap with the largest size from the gaps between adjacent clusters, to obtain the target gap.
In this embodiment, after the group of clusters is obtained, the distance between each pair of adjacent clusters (i.e., the inter-cluster distance) may be determined as the size of the gap between them. After the size of the gap between each pair of adjacent clusters is determined, the largest gap (i.e., the gap between the pair of adjacent clusters with the largest inter-cluster distance) may be selected from the gaps between adjacent clusters, thereby obtaining the target gap.
Optionally, there may be one or more ways of computing the inter-cluster distance. For example, the Euclidean distance may be used to compute the inter-cluster distance of adjacent clusters, where the Euclidean distance may be the distance between the last point of one cluster and the first point of the adjacent cluster (which may be computed using the angle information).
For example, cluster 1 and cluster 2 are two adjacent clusters; the distance between the last point of cluster 1 and the first point of cluster 2 may be regarded as the distance between cluster 1 and cluster 2. If there are several clusters, the distances between adjacent clusters are computed in turn to obtain a group of inter-cluster distances (i.e., gap sizes); by sorting the group of inter-cluster distances, the largest inter-cluster distance, i.e., the gap with the largest size, can be determined, thereby obtaining the target gap.
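A sketch of this gap-size computation and selection follows, assuming the clusters are ordered by angle around the robot and that passability is approximated by comparing the gap size with the robot width (an assumed parameter, not a value from this disclosure).

    import math

    def select_target_gap(clusters, robot_width):
        """Measure each gap as the Euclidean distance between the last point of
        one cluster and the first point of the next, and return the widest gap
        that exceeds the robot width as (index, size).
        """
        best = None
        for i in range(len(clusters)):
            a = clusters[i][-1]                       # last point of this cluster
            b = clusters[(i + 1) % len(clusters)][0]  # first point of the next one
            size = math.hypot(a[0] - b[0], a[1] - b[1])
            if size >= robot_width and (best is None or size > best[1]):
                best = (i, size)
        return best  # None when no gap is wide enough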
Through this embodiment, the size of the gap between adjacent clusters is determined based on the distance between the adjacent clusters, and the gap with the largest size is selected as the gap to be passed through, which improves the feasibility of the movement control of the robot.
The movement control method of the robot in this embodiment is explained below with reference to an optional example. In this optional example, the robot is an LDS robot (Laser Direct Structuring).
This optional example provides a solution for a robot to quickly perform a brief exploration of an indoor environment using a local point cloud. As shown in FIG. 3, the flow of the movement control method of the robot in this optional example may include the following steps:
Step S302: acquiring the point cloud of the surrounding environment.
The LDS robot acquires, in place, the environmental point cloud (i.e., point cloud data) of its surroundings through the lidar (no rotation is needed if the LDS has no pillar; rotation is needed if it has a pillar).
Step S304: clustering the point cloud.
After the environmental point cloud is obtained, the LDS robot may cluster the obtained environmental point cloud. The clustering method used may be projecting the environmental point cloud onto the ray of each angle and using the angle information to assist the clustering, or may be first down-sampling the environmental point cloud into grid cells and then clustering with the grid neighborhood information.
Step S306: computing the inter-cluster distances between adjacent clusters after clustering.
The LDS robot may compute the inter-cluster distances between adjacent clusters after the clustering and identify gaps through these inter-cluster distances, i.e., determine the gap between the two adjacent clusters with the largest inter-cluster distance as the optimal gap (i.e., the target gap).
Step S308: controlling the robot to explore in the direction of the optimal gap.
The robot is controlled to move in the direction where the gap size is the largest (i.e., the direction of the optimal gap).
Through this example, by clustering the acquired point cloud of the surrounding environment to identify the gap direction in which the robot should move, the efficiency of a brief exploration by the robot in an unknown indoor environment can be improved.
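Putting the pieces together, one exploration step of this optional example might look like the following sketch; collect_point_cloud, cluster_by_reference_rays and select_target_gap are the hypothetical helpers sketched earlier, and robot.move_towards() is an assumed motion interface rather than anything defined in this disclosure.

    def explore_step(robot, num_rays=36, robot_width=0.35):
        """Run one scan-cluster-select-move cycle; return False when no
        passable gap is found from the current position.
        """
        points = collect_point_cloud(robot)                    # local scan only, no global map
        clusters = cluster_by_reference_rays(points, num_rays)
        gap = select_target_gap(clusters, robot_width)
        if gap is None:
            return False
        i, _ = gap
        a = clusters[i][-1]
        b = clusters[(i + 1) % len(clusters)][0]
        midpoint = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
        robot.move_towards(midpoint)                           # head for the widest gap, then repeat
        return True

Repeating this step after each move reproduces the progressive exploration described above: scan locally, cluster, head for the widest passable gap, and start over from the new position.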
It should be noted that, for simplicity of description, each of the foregoing method embodiments is expressed as a series of action combinations. However, those skilled in the art should understand that the present disclosure is not limited by the described order of actions, because according to the present disclosure, some steps may be performed in other orders or simultaneously. In addition, those skilled in the art should also understand that the embodiments described in this specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present disclosure.
Through the description of the above implementations, those skilled in the art can clearly understand that the method according to the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present disclosure, in essence or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM (Read-Only Memory)/RAM (Random Access Memory), a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to execute the method described in each embodiment of the present disclosure.
According to yet another aspect of the embodiments of the present disclosure, a movement control device of a robot for implementing the above movement control method of a robot is further provided. FIG. 4 is a structural block diagram of an optional movement control device of a robot according to an embodiment of the present disclosure. As shown in FIG. 4, the device may include:
an acquisition unit 402, configured to acquire the point cloud data of the spatial environment in which the target robot is located to obtain target point cloud data, where the target point cloud data contains multiple target points;
a clustering unit 404, connected to the acquisition unit 402 and configured to perform a clustering operation on the target point cloud data to obtain a group of clusters, where each cluster in the group contains at least one of the multiple target points;
a selection unit 406, connected to the clustering unit 404 and configured to select a target gap from the gaps between adjacent clusters in the group of clusters, where the target gap is a gap that allows the target robot to pass;
a control unit 408, connected to the selection unit 406 and configured to control the target robot to move in the direction of the target gap.
It should be noted that the acquisition unit 402 in this embodiment may be configured to execute the above step S202, the clustering unit 404 in this embodiment may be configured to execute the above step S204, the selection unit 406 in this embodiment may be configured to execute the above step S206, and the control unit 408 in this embodiment may be configured to execute the above step S208.
Through the above modules, the point cloud data of the spatial environment in which the target robot is located is acquired to obtain target point cloud data, where the target point cloud data contains multiple target points; a clustering operation is performed on the target point cloud data to obtain a group of clusters, where each cluster in the group contains at least one of the multiple target points; a target gap is selected from the gaps between adjacent clusters in the group, where the target gap is a gap that allows the target robot to pass; and the target robot is controlled to move in the direction of the target gap. This solves the problem in the related art that robot movement control is time-consuming because full map information is used to search for gaps, reduces the time consumed by movement control, and improves the efficiency of space exploration.
In an exemplary embodiment, the acquisition unit includes:
an acquisition module, configured to acquire, during the process of controlling the target robot to rotate, the point cloud data of the spatial environment in which the target robot is located through the laser sensor on the target robot, to obtain the target point cloud data.
In an exemplary embodiment, the clustering unit includes:
a first determination module, configured to determine the distance between each of the multiple target points and each reference ray in a group of reference rays, where each reference ray corresponds to one reference angle;
a second determination module, configured to determine, according to the distance between each target point and each reference ray, the target points matching each reference ray;
a third determination module, configured to determine the target points matching each reference ray as one cluster, to obtain the group of clusters.
In an exemplary embodiment, the first determination module includes:
a projection submodule, configured to project each target point onto each reference ray to obtain the distance between each target point and each reference ray.
In an exemplary embodiment, the clustering unit includes:
a down-sampling module, configured to down-sample the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each of the multiple target points;
a clustering module, configured to perform the clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point, to obtain the group of clusters.
In an exemplary embodiment, the clustering module includes:
a determination submodule, configured to determine the grid cell in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a group of target grid cells, where each target grid cell in the group contains a two-dimensional point corresponding to at least one of the multiple target points;
an execution submodule, configured to perform the clustering operation on the multiple target points according to the two-dimensional points contained in each target grid cell and the neighborhood information of each target grid cell, to obtain the group of clusters.
In an exemplary embodiment, the selection unit includes:
a fourth determination module, configured to determine the distance between each pair of adjacent clusters as the size of the gap between those adjacent clusters;
a selection module, configured to select the gap with the largest size from the gaps between adjacent clusters, to obtain the target gap.
It should be noted here that the examples and application scenarios implemented by the above modules and the corresponding steps are the same, but are not limited to the content disclosed in the above embodiments. It should also be noted that the above modules, as part of the device, may run in the hardware environment shown in FIG. 1, and may be implemented by software or by hardware, where the hardware environment includes a network environment.
According to yet another aspect of the embodiments of the present disclosure, a storage medium is further provided. Optionally, in this embodiment, the storage medium may be used to store the program code for executing any of the above movement control methods of a robot in the embodiments of the present disclosure.
Optionally, in this embodiment, the storage medium may be located on at least one of multiple network devices in the network shown in the above embodiments.
Optionally, in this embodiment, the storage medium is configured to store program code for executing the following steps:
S1: acquiring the point cloud data of the spatial environment in which the target robot is located to obtain target point cloud data, where the target point cloud data contains multiple target points;
S2: performing a clustering operation on the target point cloud data to obtain a group of clusters, where each cluster in the group contains at least one of the multiple target points;
S3: selecting a target gap from the gaps between adjacent clusters in the group of clusters, where the target gap is a gap that allows the target robot to pass;
S4: controlling the target robot to move in the direction of the target gap.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
Optionally, in this embodiment, the storage medium may include, but is not limited to, various media capable of storing program code, such as a USB flash drive, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disc.
According to yet another aspect of the embodiments of the present disclosure, an electronic device for implementing the above movement control method of a robot is further provided. The electronic device may be a server, a terminal, or a combination thereof.
FIG. 5 is a structural block diagram of an optional electronic device according to an embodiment of the present disclosure. As shown in FIG. 5, the electronic device includes a processor 502, a communication interface 504, a memory 506, and a communication bus 508, where the processor 502, the communication interface 504, and the memory 506 communicate with one another through the communication bus 508, where
the memory 506 is configured to store a computer program;
the processor 502 is configured to implement the following steps when executing the computer program stored in the memory 506:
S1: acquiring the point cloud data of the spatial environment in which the target robot is located to obtain target point cloud data, where the target point cloud data contains multiple target points;
S2: performing a clustering operation on the target point cloud data to obtain a group of clusters, where each cluster in the group contains at least one of the multiple target points;
S3: selecting a target gap from the gaps between adjacent clusters in the group of clusters, where the target gap is a gap that allows the target robot to pass;
S4: controlling the target robot to move in the direction of the target gap.
Optionally, in this embodiment, the communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in FIG. 5, but this does not mean that there is only one bus or one type of bus. The communication interface is used for communication between the above electronic device and other devices.
The above memory may include a RAM, and may also include a non-volatile memory, for example, at least one magnetic disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
As an example, the above memory 506 may include, but is not limited to, the acquisition unit 402, the clustering unit 404, the selection unit 406, and the control unit 408 of the above movement control device. In addition, it may also include, but is not limited to, other module units of the above movement control device, which are not repeated in this example.
The above processor may be a general-purpose processor, including but not limited to a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
Those of ordinary skill in the art can understand that the structure shown in FIG. 5 is only illustrative. The device implementing the above movement control method of a robot may be a terminal device, which may be a smartphone (such as an Android phone or an iOS phone), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), a PAD, or another terminal device. FIG. 5 does not limit the structure of the above electronic device. For example, the electronic device may further include more or fewer components than those shown in FIG. 5 (such as a network interface or a display device), or have a configuration different from that shown in FIG. 5.
Those of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments may be completed by instructing hardware related to the terminal device through a program, and the program may be stored in a computer-readable storage medium, which may include a flash drive, a ROM, a RAM, a magnetic disk, an optical disc, and the like.
The serial numbers of the above embodiments of the present disclosure are for description only and do not represent the superiority or inferiority of the embodiments.
If the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium. Based on such an understanding, the technical solution of the present disclosure, in essence, the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, or the like) to execute all or part of the steps of the method described in each embodiment of the present disclosure.
In the above embodiments of the present disclosure, the description of each embodiment has its own emphasis. For parts not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed client may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is merely a logical functional division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution provided in this embodiment.
In addition, the functional units in each embodiment of the present disclosure may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The above are only preferred implementations of the present disclosure. It should be noted that those of ordinary skill in the art may make several improvements and refinements without departing from the principles of the present disclosure, and these improvements and refinements shall also be regarded as falling within the protection scope of the present disclosure.

Claims (14)

  1. A movement control method of a robot, characterized by comprising:
    acquiring point cloud data of a spatial environment in which a target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points;
    performing a clustering operation on the target point cloud data to obtain a group of clusters, wherein each cluster in the group of clusters contains at least one target point among the multiple target points;
    selecting a target gap from gaps between adjacent clusters in the group of clusters, wherein the target gap is a gap that allows the target robot to pass;
    controlling the target robot to move in the direction of the target gap.
  2. The method according to claim 1, wherein the acquiring point cloud data of a spatial environment in which a target robot is located to obtain target point cloud data comprises:
    during the process of controlling the target robot to rotate, acquiring the point cloud data of the spatial environment in which the target robot is located through a laser sensor on the target robot, to obtain the target point cloud data.
  3. The method according to claim 1, wherein the group of clusters contains at least two clusters, and each cluster corresponds to at least one obstacle in the spatial environment in which the target robot is located.
  4. The method according to claim 1, wherein the performing a clustering operation on the target point cloud data comprises: clustering the multiple target points based on distances between different target points in the target point cloud data, wherein one or more clustering methods are used.
  5. The method according to claim 1, wherein the performing a clustering operation on the target point cloud data comprises:
    dividing a two-dimensional space into a grid and mapping the target point cloud data into grid cells;
    computing a point-set density in each grid cell, classifying the grid cells according to a preset threshold, and forming clusters with neighboring groups of grid cells.
  6. The method according to claim 1, wherein the performing a clustering operation on the target point cloud data to obtain a group of clusters comprises:
    determining a distance between each of the multiple target points and each reference ray in a group of reference rays, wherein each reference ray corresponds to one reference angle;
    determining, according to the distance between each target point and each reference ray, target points matching each reference ray;
    determining the target points matching each reference ray as one cluster, to obtain the group of clusters.
  7. The method according to claim 6, wherein the determining a distance between each of the multiple target points and each reference ray in a group of reference rays comprises:
    projecting each target point onto each reference ray to obtain the distance between each target point and each reference ray.
  8. The method according to claim 1, wherein the performing a clustering operation on the target point cloud data to obtain a group of clusters comprises:
    down-sampling the target point cloud data onto a two-dimensional grid map to obtain a two-dimensional point corresponding to each of the multiple target points;
    performing the clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point, to obtain the group of clusters.
  9. The method according to claim 8, wherein the performing the clustering operation on the multiple target points according to the two-dimensional point corresponding to each target point to obtain the group of clusters comprises:
    determining a grid cell in the two-dimensional grid map to which the two-dimensional point corresponding to each target point belongs, to obtain a group of target grid cells, wherein each target grid cell in the group of target grid cells contains a two-dimensional point corresponding to at least one of the multiple target points;
    performing the clustering operation on the multiple target points according to the two-dimensional points contained in each target grid cell and neighborhood information of each target grid cell, to obtain the group of clusters.
  10. The method according to any one of claims 1 to 9, wherein the selecting a target gap from gaps between adjacent clusters in the group of clusters comprises:
    determining a distance between each pair of adjacent clusters as a size of the gap between the adjacent clusters;
    selecting a gap with the largest size from the gaps between adjacent clusters, to obtain the target gap.
  11. The method according to claim 10, wherein the method further comprises:
    computing the inter-cluster distance of adjacent clusters using a Euclidean distance, wherein the Euclidean distance is the distance between the last point of one of the adjacent clusters and the first point of the other of the adjacent clusters.
  12. A movement control device of a robot, characterized by comprising:
    an acquisition unit, configured to acquire point cloud data of a spatial environment in which a target robot is located to obtain target point cloud data, wherein the target point cloud data contains multiple target points;
    a clustering unit, configured to perform a clustering operation on the target point cloud data to obtain a group of clusters, wherein each cluster in the group of clusters contains at least one target point among the multiple target points;
    a selection unit, configured to select a target gap from gaps between adjacent clusters in the group of clusters, wherein the target gap is a gap that allows the target robot to pass;
    a control unit, configured to control the target robot to move in the direction of the target gap.
  13. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein the program, when running, executes the method according to any one of claims 1 to 11.
  14. An electronic device, comprising a memory and a processor, characterized in that a computer program is stored in the memory, and the processor is configured to execute the method according to any one of claims 1 to 11 through the computer program.
PCT/CN2023/080705 2022-04-08 2023-03-10 机器人的移动控制方法和装置、存储介质及电子装置 WO2023193567A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210366407.0A CN116931557A (zh) 2022-04-08 2022-04-08 机器人的移动控制方法和装置、存储介质及电子装置
CN202210366407.0 2022-04-08

Publications (1)

Publication Number Publication Date
WO2023193567A1 true WO2023193567A1 (zh) 2023-10-12

Family

ID=88243951

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/080705 WO2023193567A1 (zh) 2022-04-08 2023-03-10 机器人的移动控制方法和装置、存储介质及电子装置

Country Status (2)

Country Link
CN (1) CN116931557A (zh)
WO (1) WO2023193567A1 (zh)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130051658A1 (en) * 2011-08-22 2013-02-28 Samsung Electronics Co., Ltd. Method of separating object in three dimension point cloud
CN108460779A (zh) * 2018-02-12 2018-08-28 浙江大学 一种动态环境下的移动机器人图像视觉定位方法
KR20190064311A (ko) * 2017-11-30 2019-06-10 주식회사 모빌테크 라이다를 이용한 지도 생성 방법 및 장치
CN109993192A (zh) * 2018-01-03 2019-07-09 北京京东尚科信息技术有限公司 目标对象识别方法及装置、电子设备、存储介质
CN110244743A (zh) * 2019-07-03 2019-09-17 浙江大学 一种融合多传感器信息的移动机器人自主脱困方法
CN110390346A (zh) * 2018-04-23 2019-10-29 北京京东尚科信息技术有限公司 目标对象识别方法、装置、电子设备及存储介质
CN111429574A (zh) * 2020-03-06 2020-07-17 上海交通大学 基于三维点云和视觉融合的移动机器人定位方法和系统
CN111723866A (zh) * 2020-06-19 2020-09-29 新石器慧通(北京)科技有限公司 点云聚类的方法及装置、无人车、可读存储介质
CN113925390A (zh) * 2021-10-19 2022-01-14 珠海一微半导体股份有限公司 一种基于地图图像的跨区域通道识别方法、机器人及芯片
CN114187425A (zh) * 2021-12-13 2022-03-15 河北工业大学 基于二进制占用网格的点云聚类与包围方法
CN114266801A (zh) * 2021-12-23 2022-04-01 内蒙古工业大学 基于三维激光雷达的移动机器人越野环境中地面分割方法


Also Published As

Publication number Publication date
CN116931557A (zh) 2023-10-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23784131

Country of ref document: EP

Kind code of ref document: A1