CN113741446A - Robot autonomous exploration method, terminal equipment and storage medium - Google Patents

Robot autonomous exploration method, terminal equipment and storage medium Download PDF

Info

Publication number
CN113741446A
Authority
CN
China
Prior art keywords: robot, distance, key points, key, adjacent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110998158.2A
Other languages
Chinese (zh)
Other versions
CN113741446B (en)
Inventor
黄高波
毕占甲
赵广超
黄祥斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202110998158.2A priority Critical patent/CN113741446B/en
Publication of CN113741446A publication Critical patent/CN113741446A/en
Priority to PCT/CN2021/139343 priority patent/WO2023024347A1/en
Application granted granted Critical
Publication of CN113741446B publication Critical patent/CN113741446B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The application is applicable to the technical field of robots, and provides a robot autonomous exploration method, a terminal device and a storage medium. The method comprises the following steps: acquiring first distances between the robot and first key points, wherein the first key points are position points of obstacles obtained when the robot scans the obstacles in a preset area; calculating a second distance between every two adjacent first key points based on the first distances between the robot and the first key points, wherein two adjacent first key points are two first key points with adjacent angular coordinates, and the angular coordinate of a first key point is the angular position relation of the first key point relative to the robot; and determining, based on the second distances, whether autonomous exploration by the robot is feasible.

Description

Robot autonomous exploration method, terminal equipment and storage medium
Technical Field
The application belongs to the technical field of robots, and particularly relates to a method for autonomous robot exploration, a terminal device and a storage medium.
Background
With the development of robots, new challenges are presented to the autonomous exploration of robots. The autonomous exploration of the robot means that the robot determines the next moving target of the robot through a sensor on the robot in an unknown environment.
Because environments vary, some environments are not suitable for autonomous exploration by a robot, and a robot that explores autonomously in an unsuitable environment is easily damaged by environmental influences. Therefore, after a robot enters an area, how to judge whether the robot can perform autonomous exploration in that area is a problem to be solved at present.
Disclosure of Invention
The embodiments of the application provide a method and an apparatus for robot autonomous exploration, a terminal device and a storage medium, which can solve the current inability to judge whether autonomous exploration by a robot is feasible.
In a first aspect, an embodiment of the present application provides a method for autonomous robot exploration, including:
acquiring a first distance between the robot and each first key point, wherein each first key point is a position point where an obstacle is located, and the position point is obtained when the robot scans the obstacle in a preset area;
calculating a second distance between every two adjacent first key points based on first distances between the robot and the first key points, wherein the two adjacent first key points are two first key points with adjacent angle coordinates, and the angle coordinates of the first key points are the angle position relation of the first key points relative to the robot;
and when the second distance meets a preset condition, controlling the robot to perform autonomous exploration.
In a second aspect, an embodiment of the present application provides an apparatus for autonomous robot exploration, including:
the data acquisition module is used for acquiring first distances between the robot and each first key point, wherein each first key point is a position point where an obstacle is located, and the position point is obtained when the robot scans the obstacle in a preset area;
the data calculation module is used for calculating a second distance between every two adjacent first key points based on first distances between the robot and the first key points, wherein the two adjacent first key points are two first key points with adjacent angle coordinates, and the angle coordinates of the first key points are the angle position relation of the first key points relative to the robot;
and the control module is used for controlling the robot to perform autonomous exploration when the second distance meets a preset condition.
In a third aspect, an embodiment of the present application provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the method for autonomous robot exploration according to any of the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, where the computer program, when executed by a processor, implements the method for autonomous robot exploration according to any one of the above first aspects.
In a fifth aspect, the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to execute the method for robot autonomous exploration according to any one of the above first aspects.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Compared with the prior art, the embodiments of the application have the following advantages: first distances between the robot and each first key point are obtained, a second distance between every two adjacent first key points is then calculated based on the first distances, and finally the robot is controlled to perform autonomous exploration when the second distances satisfy a preset condition. By calculating the distance between every two adjacent first key points, it can be determined whether an unclosed area exists in the area where the robot is located, and hence whether the robot can perform autonomous exploration. The method and the device thus realize a feasibility analysis for autonomous exploration, can ensure that the robot explores autonomously only in areas suitable for autonomous exploration, and reduce damage to the robot caused by environmental factors.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic view of an application scenario of a method for autonomous robot exploration according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a method for autonomous robot exploration according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a relationship between a robot and a first key point according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a first keypoint in a polar coordinate system according to an embodiment of the present disclosure;
FIG. 5 is a flowchart illustrating a method for determining feasibility of autonomous exploration based on second distance according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a second distance and a third distance among distances of a first keypoint provided by an embodiment of the present application;
fig. 7 is a flowchart illustrating a method for determining a first distance according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a first distance of a first keypoint set acquired by a first radar and a second radar according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of an apparatus for autonomous robot exploration according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in the specification of this application and the appended claims, the term "if" may be interpreted contextually as "when" or "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Disinfection robots are currently mostly used in closed indoor environments. However, because environments vary, many disinfection robots are also deployed in open environments such as squares and stores. Some open environments are not suitable for autonomous exploration by a disinfection robot: such a robot typically has a high center of gravity, a long inertial braking distance and large sensor blind areas, and cannot rely on collision-based exploration or explore near cliffs, so it is easily damaged when it explores autonomously in an unsuitable environment.
Fig. 1 is a schematic view of an application scenario of a method for robot autonomous exploration according to an embodiment of the present disclosure; the method is used for feasibility analysis of robot autonomous exploration. The robot 10 is used to detect the distance between itself and obstacles in the environment in which it is located. The processor 20 is configured to obtain the distances collected by the robot 10 and analyze them to determine whether the robot can explore autonomously in that environment.
The method for autonomous robot search according to the embodiment of the present application will be described in detail below with reference to fig. 1.
Fig. 2 shows a schematic flow chart of a method for autonomous robot exploration provided by the present application, which may be implemented in a processor in the robot or in a processor external to the robot. Referring to fig. 2, the method is detailed as follows:
s101, acquiring a first distance between the robot and each first key point, wherein each first key point is a position point where an obstacle is located, and the position point is obtained when the robot scans the obstacle in a preset area.
In this embodiment, the preset area may be set as required, and the preset area is provided with an obstacle, for example, the obstacle may include a wall, a table, a chair, and the like.
In this embodiment, after the robot enters the preset area, the ranging radar on the robot performs a 360-degree scan of the preset area to obtain the distance between the robot and each first key point, which is recorded as a first distance in this application.
Specifically, the ranging radar transmits ranging signals to its surroundings at preset angular intervals, for example one ranging signal every angle θ; a ranging signal is reflected when it reaches an obstacle, and the radar determines the distance between the radar and the obstacle, that is, the distance between the robot and the obstacle, from the time of the received reflected signal. The position on the obstacle at which a ranging signal transmitted by the radar is received is a key point, referred to in this application as a first key point. There may be multiple first key points on the same obstacle. The ranging radar can send out hundreds of ranging signals in one full revolution, and the specific number of ranging signals can be set as required.
By way of example, as shown in fig. 3, the robot transmits ranging signals at point A; a, b, c and d are first key points obtained by the robot through the ranging radar, and the distance from each first key point to point A is a first distance: the distance from a to A is l1, from b to A is l2, from c to A is l3, and from d to A is l4.
In this embodiment, if there is no obstacle at a certain position in the preset area, the ranging signal sent by the ranging radar reaches no obstacle and no reflected signal is generated, so no first key point and no first distance exist in the direction in which that ranging signal was transmitted.
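To make the scan-to-key-point step of S101 concrete, the following is a minimal Python sketch under assumed interfaces: the function name, the scan list format and the max_range cutoff are illustrative and not part of the original disclosure; the only behavior taken from the description above is that each reflected ranging signal yields a first key point with an angular coordinate and a first distance, and that directions with no reflection yield nothing.

```python
def collect_first_keypoints(scan, angle_step_deg, max_range=25.0):
    """Turn one 360-degree ranging scan into first key points.

    scan: one measured distance per ranging signal, ordered by emission angle;
    None (or a reading beyond max_range) means no reflection came back, so no
    first key point and no first distance exist in that direction.
    Returns (signal_number, angle_deg, first_distance) tuples; signal numbers
    start at 0 in this sketch."""
    keypoints = []
    for number, dist in enumerate(scan):
        if dist is None or dist > max_range:
            continue  # the ranging signal hit no obstacle
        angle_deg = number * angle_step_deg  # angular coordinate w.r.t. the reference direction
        keypoints.append((number, angle_deg, dist))
    return keypoints

# Illustrative scan: 8 signals at 45-degree intervals, two with no reflection.
example_scan = [2.0, 2.1, None, 2.3, 5.0, None, 4.8, 2.0]
print(collect_first_keypoints(example_scan, angle_step_deg=45.0))
```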
S102, calculating a second distance between every two adjacent first key points based on first distances between the robot and the first key points, wherein the two adjacent first key points are two first key points with adjacent angle coordinates, and the angle coordinates of the first key points are the angle position relation of the first key points relative to the robot.
In this embodiment, when the robot scans the obstacle in the preset area to obtain the first key point, the obtained first key point on the obstacle has different angular positions relative to the robot because the position of the obstacle relative to the robot is different.
Specifically, a reference direction may be set to establish a polar coordinate system, and an origin (0 point) of the polar coordinate system is a position where the robot is located. The angle coordinate of each first key point is an included angle between the direction of the first key point and the reference direction, that is, the angle coordinate of the first key point is an angle value in the polar coordinate of the first key point, and the first distance between the robot and the first key point is a distance value in the polar coordinate. Two adjacent first keypoints may be determined based on the angular coordinates of the respective first keypoints.
By way of example, fig. 4 shows a polar coordinate system established with the robot as the origin. Point Q is the origin and the horizontal axis R is the reference direction. The polar coordinates of the first key point S1 are (r1, θ1) and those of the first key point S2 are (r2, θ2). r1 represents the distance of the first key point S1 from the robot, and θ1 represents the included angle between the line connecting the first key point S1 to point Q and the reference direction, that is, the angular coordinate of the first key point S1. r2 represents the distance of the first key point S2 from the robot, and θ2 represents the included angle between the line connecting the first key point S2 to point Q and the reference direction, that is, the angular coordinate of the first key point S2. There are no other key points between the first key point S1 and the first key point S2, so the first key point S1 and the first key point S2 are two adjacent first key points.
And S103, controlling the robot to perform autonomous exploration when the second distance meets a preset condition.
In this embodiment, when the second distance does not satisfy the preset condition, the robot is controlled not to perform autonomous exploration.
Specifically, if the second distance is large, it is determined that no obstacle exists between the two first key points, that is, an unclosed area may exist at that position. An unclosed area is an area that the robot cannot detect during autonomous exploration and that may damage the robot, so the preset area is then unsuitable for autonomous exploration; otherwise, the robot can perform autonomous exploration in the preset area.
If the robot is determined to be unsuitable for autonomous exploration in the preset area, the robot can display preset information, and the preset information is used for prompting a user that the area is unsuitable for autonomous exploration of the robot.
According to the method, first distances between the robot and each first key point are obtained, second distances between every two adjacent first key points are then calculated based on the first distances, and finally whether autonomous exploration by the robot is feasible is determined based on the second distances. By calculating the distance between every two adjacent first key points, it can be determined whether an unclosed area exists in the area where the robot is located, and hence whether the robot can perform autonomous exploration. The method and the device thus realize a feasibility analysis for autonomous exploration, can ensure that the robot explores autonomously only in areas suitable for autonomous exploration, and reduce damage to the robot caused by environmental factors.
In one possible implementation manner, the implementation process of step S102 may include:
and S1021, acquiring a first included angle formed by connecting lines between each two adjacent first key points and the robot, wherein the two adjacent first key points are the first key points collected by the robot at the same position.
In this embodiment, the included angle between two adjacent first keypoints can be determined according to the difference between the angle coordinates of the two first keypoints.
In this embodiment, since the first keypoints may be acquired by the robot at different positions, in order to ensure the accuracy of the calculation, the adjacent first keypoints are the keypoints acquired by the robot at the same position. First key points collected by the robot at different positions cannot be recorded as adjacent first key points. For convenience of description, a first included angle formed by connecting lines between two adjacent first key points and the robot is referred to as a first included angle between the two adjacent first key points.
And S1022, calculating a second distance between the two adjacent first key points based on the first distances respectively corresponding to the two adjacent first key points and a first included angle formed by the connecting line between the two adjacent first key points and the robot.
In particular, the second distance can be calculated according to the formula

l(i,j) = sqrt(Di^2 + Dj^2 - 2·Di·Dj·cos γ)

where l(i,j) is the second distance between the adjacent first key point i and first key point j, Di is the first distance between the robot and the first key point i, Dj is the first distance between the robot and the first key point j, and γ is the first included angle between the first key point i and the first key point j.
Optionally, if the ranging radar sends the ranging signals at equal angular intervals, for example one ranging signal every 5 degrees, the ranging radar sends 72 ranging signals within 360 degrees. Each ranging signal is numbered along a fixed direction and each first key point corresponds to the number of one ranging signal, so the included angle between two adjacent first key points can be determined from the ranging-signal numbers. For example, if the ranging signals are numbered counterclockwise, the ranging signal transmitted at 0 degrees is numbered 1 and the ranging signal transmitted at 5 degrees is numbered 2.
Specifically, the first included angle between two adjacent first key points may be calculated according to the formula γ = (dj - di) × θ, where γ is the first included angle between the first key point i and the first key point j, dj is the number corresponding to the first key point j, di is the number corresponding to the first key point i, and θ is the angular interval of the ranging signals transmitted by the ranging radar, that is, the angular difference between two adjacent ranging signals.
According to the method and the device, because the included angle between two adjacent first key points can be determined, the distance between each first key point and the robot can be determined, and the distance between two adjacent first key points can be quickly, simply and accurately obtained based on the information.
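As an illustration of S102, the following Python sketch applies the law-of-cosines formula above to every pair of angularly adjacent first key points collected at the same robot position; the helper names are assumptions and the key points reuse the (signal_number, angle_deg, first_distance) tuples from the earlier sketch.

```python
import math

def second_distance(d_i, d_j, num_i, num_j, angle_step_deg):
    """Second distance between adjacent first key points i and j, from their
    first distances d_i, d_j and their ranging-signal numbers num_i, num_j.
    The first included angle is gamma = (num_j - num_i) * theta and the
    distance follows the law-of-cosines formula given above."""
    gamma = math.radians((num_j - num_i) * angle_step_deg)
    return math.sqrt(d_i ** 2 + d_j ** 2 - 2.0 * d_i * d_j * math.cos(gamma))

def all_second_distances(keypoints, angle_step_deg):
    """keypoints: (signal_number, angle_deg, first_distance) tuples collected
    at the same robot position, ordered by angular coordinate; returns the
    second distance for every pair of angularly adjacent first key points.
    (For a full 360-degree scan the last/first pair could be added as well.)"""
    return [
        second_distance(d_i, d_j, n_i, n_j, angle_step_deg)
        for (n_i, _, d_i), (n_j, _, d_j) in zip(keypoints, keypoints[1:])
    ]
```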
As shown in fig. 5, in a possible implementation manner, the implementation process of step S103 may include:
and S1031, determining whether a third distance exists in the second distances, wherein the third distance is a second distance larger than a first preset threshold value.
In this embodiment, the preset condition may include that the third distance is not present among the second distances.
In this embodiment, the first preset threshold may be set as needed.
In this embodiment, an unclosed region in the preset region is one of factors affecting autonomous exploration of the robot, and if a large unclosed region exists in the preset region, the safety of the robot may be affected, so that the robot is not suitable for autonomous exploration if a large unclosed region exists in the preset region, and otherwise, autonomous exploration may be performed.
Specifically, the method for determining whether a larger unclosed area exists in the preset area may be to determine whether a third distance exists in each second distance, and determine whether a larger unclosed area exists between two adjacent first key points by determining whether the third distance exists.
And S1032, if the third distance does not exist in the second distances, controlling the robot to perform autonomous exploration.
In this embodiment, if the third distance does not exist in the second distances, it is determined that no large unclosed area hindering autonomous exploration by the robot exists in the preset area, and the robot may perform autonomous exploration in the preset area.
S1033, if the third distance exists in the second distances, calculating a total length of the contour map composed of the first keypoints based on the second distances.
In this embodiment, if the third distance exists in the second distances, it is determined that a larger unclosed area exists in the preset area, and it is further determined whether the unclosed area affects the autonomous search of the robot.
Specifically, when the third distance exists in each second distance, the total length of the contour map composed of each first keypoint is one of the factors influencing the autonomous exploration feasibility judgment, and therefore, the total length of the contour map composed of each first keypoint needs to be calculated first.
Specifically, the total length of the contour map composed of the first keypoints may be the sum of the second distances.
S1034, when the third distance and the total length of the contour map meet preset requirements, controlling the robot to conduct autonomous exploration.
Specifically, whether the unclosed region at the third distance influences the autonomous exploration of the robot may be determined according to the proportion of each third distance in the contour map composed of each first key point, if so, the robot may not conduct the autonomous exploration, otherwise, the robot may conduct the autonomous exploration.
In this embodiment, the preset condition may include that a third distance exists in the second distances, and the third distance and the total length of the contour map satisfy a preset requirement.
Specifically, the implementation process of step S1034 may include:
calculating a ratio of each third distance to the total length of the contour map; if any ratio is greater than a second preset threshold, controlling the robot not to perform autonomous exploration; and if no ratio is greater than the second preset threshold, controlling the robot to perform autonomous exploration.
In this embodiment, the preset requirement may include that there is no ratio greater than a second preset threshold in the ratio of the third distance to the total length of the contour map.
In this embodiment, the second preset threshold may be set as needed. If the ratio of the third distance to the total length of the contour map is greater than a second preset threshold, it is determined that the unclosed area at the third distance is larger, the larger unclosed area affects the autonomous exploration of the robot, and the robot is not suitable for the autonomous exploration. If the ratio of the third distance to the total length of the contour map is not greater than a second preset threshold, it is determined that the unclosed area at the third distance is small, and the small unclosed area is not enough to influence the autonomous exploration of the robot, so that the robot can conduct the autonomous exploration.
For example, as shown in fig. 6, point O is the position of the robot; the lines connected to point O represent the ranging signals sent by the ranging radar, point O is the starting point of each ranging signal, and the end points of the ranging signals represent the first key points. The third distances are represented by the dashed segments L1 to L5 in the figure, and the sum of the solid and dashed segments represents the total length of the contour map formed by the first key points.
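The decision logic of S1031 to S1034 can be summarized in a short Python sketch; the function and threshold names are illustrative, and the threshold values themselves would be chosen per application as stated above.

```python
def exploration_feasible(second_dists, first_preset_threshold, second_preset_threshold):
    """Sketch of the decision in S1031-S1034.

    Third distances are the second distances larger than first_preset_threshold.
    If there are none, autonomous exploration is allowed. Otherwise the total
    length of the contour map is taken as the sum of all second distances, and
    exploration is allowed only if no third distance accounts for a share of
    that total length larger than second_preset_threshold."""
    third_dists = [d for d in second_dists if d > first_preset_threshold]
    if not third_dists:
        return True  # no large gap between adjacent first key points
    total_length = sum(second_dists)  # total length of the contour map
    return all(d / total_length <= second_preset_threshold for d in third_dists)
```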
As shown in fig. 7, in a possible implementation manner, the implementation process of step S101 may include:
s1011, acquiring first information collected by the robot, wherein the first information comprises second key points, the distance between the robot and the second key points, and second included angles formed by connecting lines between two adjacent second key points and the robot respectively, the second key points are positions where obstacles are located and detected by a first radar in a preset area, the two adjacent second key points are two second key points with adjacent angle coordinates, and the angle coordinates of the second key points are the angle position relationship of the second key points relative to the robot.
In this embodiment, because the top ranging radar is generally arranged at the central position of the robot and has a wider acquisition range, the robot can use the top ranging radar to collect the first information; the top ranging radar is recorded as the first radar in this application.
Specifically, the first radar transmits a ranging signal, the position of the obstacle where the ranging signal is received is recorded as a second key point, and then the distance between the robot and each second key point is determined through a reflection signal of the ranging signal.
In this embodiment, the determination method of two adjacent second keypoints is similar to the determination method of two adjacent first keypoints, please refer to the description of step S102, and details are not repeated here.
S1012, determining whether a region which is not detected by the first radar exists in the preset region or not based on a second included angle between the two adjacent second key points.
In this embodiment, since there may be a higher obstacle in the preset area, there may also be a lower obstacle. When the height of the obstacle is lower or higher than the height of the first radar, the ranging signal transmitted by the first radar will not be received by the obstacle, and the first radar cannot detect the obstacle, possibly resulting in the omission of the obstacle. Or, since the first radar is far away from the obstacle, the ranging signal transmitted by the first radar cannot be received by the obstacle at a far distance, so that the first radar cannot detect the obstacle. Therefore, after the first information is acquired by the first radar, whether an area which is not detected by the first radar exists in the preset area can be determined according to the first information.
Specifically, if an included angle between two adjacent second key points is greater than a preset included angle, it is determined that a region which is not detected by the first radar exists in the preset region. The undetected area is an area between two second key points corresponding to a second included angle larger than the preset included angle. The value of the preset included angle can be set as required.
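A minimal Python sketch of the gap test in S1012 follows, under the assumption that the second key points are given as (angle_deg, distance) pairs sorted by angular coordinate; the data layout and names are illustrative.

```python
def undetected_regions(second_keypoints, preset_angle_deg):
    """Sketch of S1012: return the pairs of adjacent second key points whose
    second included angle exceeds preset_angle_deg; the area between each such
    pair is treated as a region the first radar did not detect.

    second_keypoints: (angle_deg, distance) pairs sorted by angular coordinate,
    all collected by the first radar at the first position."""
    gaps = []
    for (angle_1, dist_1), (angle_2, dist_2) in zip(second_keypoints, second_keypoints[1:]):
        if (angle_2 - angle_1) > preset_angle_deg:
            gaps.append(((angle_1, dist_1), (angle_2, dist_2)))
    # the wrap-around gap between the last and first key points of a full
    # 360-degree scan could be checked in the same way if required
    return gaps
```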
And S1013, if an area which is not detected by the first radar exists in the preset area, acquiring second information collected by the robot, wherein the second information is information of the undetected area which is collected by the robot at a second position and/or by a second radar, and the second radar and the first radar are arranged at different heights of the robot.
In this embodiment, if there is an area undetected by the first radar in the preset area, the robot may be moved to shorten the distance between the robot and the area undetected by the first radar. And/or collecting information of the area which is not detected by the first radar by adopting a radar which is different from the first radar in height position, wherein the collected information of the area which is not detected by the first radar is recorded as second information. In the present application, a radar different in height position from the first radar is referred to as a second radar.
And S1014, determining a first distance between the robot and each first key point based on the first information and the second information.
Specifically, the implementation process of step S1014 may include:
s10141, judging whether the second information comprises a third key point, wherein the third key point is a position point where an obstacle in the area which is not detected by the first radar is located.
In this embodiment, it is necessary to determine whether the second information includes a key point of an obstacle in an area that is not detected by the first radar, which is referred to as a third key point in this application. It may be determined whether an obstacle is included in the undetected region according to the third key point.
S10142, if the second information includes the third key point, obtaining first distances between the robot and each first key point based on the distance between the robot and the second key point and the distance between the robot and the third key point, where the first key points include the second key point and the third key point, and the second information includes the distance between the robot and the third key point.
In this embodiment, if the third keypoint is present in the second information, it is determined that an obstacle in an area not detected by the first radar is detected using the second radar, and the second keypoint and the third keypoint may be collectively regarded as the first keypoint.
By way of example, as shown in fig. 8, O1 is the first position of the robot, and T1 through Tn are the second key points acquired by the robot at the first position using the first radar. O2 is the second position of the robot, and Y1 through Y4 are the third key points acquired by the robot at the second position using the second radar. When Y1 and T1 lie in the same plane they may coincide or be close to each other, and likewise for Y4 and Tn.
S10143, if the second information does not include the third keypoint, obtaining a first distance between the robot and each first keypoint based on a distance between the robot and the second keypoint, where the first keypoint includes the second keypoint.
In this embodiment, if the second information does not include the third key point, it is determined that the second radar does not detect the obstacle in the area not detected by the first radar, and it may be determined that the obstacle may not exist in the area not detected by the first radar.
S1015, if there is no area undetected by the first radar in the preset area, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, where the first key point includes the second key point.
In the embodiment of the application, the first key point of the position of the obstacle in the preset area is determined through the information acquired by the robot for multiple times, the distance from the robot to the first key point is further obtained, the obstacle in the preset area can be ensured to be detected through multiple acquisition, and the obtained first key point is ensured to be more comprehensive.
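The merging of S10142/S10143 might look like the following Python sketch; it assumes the second and third key points have already been expressed in one common coordinate frame around the robot (as noted above, they may be collected at different positions and by different radars), and all names are illustrative.

```python
def merge_keypoints(second_keypoints, third_keypoints):
    """Sketch of S10142/S10143: the first key points are the second key points
    plus any third key points the second radar found in the undetected regions.

    Both lists are assumed to hold (angle_deg, distance) pairs already
    transformed into a common frame centred on the robot; that transformation
    is an assumption of this sketch, not part of the original description."""
    first_keypoints = list(second_keypoints)
    if third_keypoints:  # obstacles were detected inside the undetected region
        first_keypoints.extend(third_keypoints)
    first_keypoints.sort(key=lambda kp: kp[0])  # keep angular ordering for adjacency
    return first_keypoints
```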
It should be noted that, in the determination process of the first key point, a depth camera and an ultrasonic sensor on the robot may also be used for jointly determining. The depth camera is used for acquiring a depth image of the preset area. The ultrasonic sensor is used for detecting the position of an obstacle in a preset area. And fusing the information collected by the first radar, the second radar, the depth camera and the ultrasonic sensor by using an information fusion technology to obtain first key points and first distances between the robot and the first key points.
In a possible implementation manner, the robot may further acquire an area of a preset region when acquiring the first distance, and the feasibility of autonomous exploration of the robot is determined according to the acquired area of the preset region.
Specifically, the method may further include:
s201, acquiring the area of the preset area acquired by the robot.
In this embodiment, in the process of acquiring the first distance, the robot may detect the area of the preset area through the ground area detection module.
S202, if the area of the preset area is larger than a third preset threshold value, determining that the autonomous exploration of the robot is not feasible.
In this embodiment, the third preset threshold may be set as needed, for example, the third preset threshold may be set to 800 square meters, 900 square meters, 950 square meters, and the like.
S203, if the area of the preset area is smaller than or equal to the third preset threshold, determining that the robot is feasible to independently explore.
In the embodiments of the present application, autonomous exploration by the robot in a very large area has very low exploration efficiency, so a large area is not favorable for autonomous exploration. The area of the preset region can be collected by the robot walking through the preset region after entering it. If the area of the preset region is larger than the third preset threshold, it is determined that autonomous exploration by the robot is not feasible; otherwise, it is determined that autonomous exploration by the robot is feasible. Analyzing the size of the preset region thus provides a simple and convenient judgment method for autonomous exploration by the robot.
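For completeness, the area check of S201 to S203 reduces to a single comparison; the sketch below uses one of the example threshold values mentioned above purely for illustration.

```python
def area_allows_exploration(preset_area_m2, third_preset_threshold_m2=900.0):
    """Sketch of S201-S203: exploration is considered infeasible when the
    collected area of the preset region exceeds the third preset threshold
    (900 square meters is just one of the example values given above)."""
    return preset_area_m2 <= third_preset_threshold_m2
```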
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 9 is a block diagram of a configuration of a device for determining feasibility of autonomous robot search according to an embodiment of the present application, which corresponds to the method for determining feasibility of autonomous robot search according to the foregoing embodiment, and only shows portions related to the embodiment of the present application for convenience of description.
Referring to fig. 9, the apparatus 300 may include: a data acquisition module 310, a data calculation module 320, and a control module 330.
The data obtaining module 310 is configured to obtain a first distance between the robot and each first key point, where each first key point is a position point where an obstacle is located, where the obstacle is obtained when the robot scans the obstacle in a preset area.
A data calculation module 320, configured to calculate a second distance between every two adjacent first key points based on the first distances between the robot and the first key points, where the two adjacent first key points are two first key points with adjacent angular coordinates, and the angular coordinate of a first key point is the angular position relation of the first key point relative to the robot.
And the control module 330 is configured to control the robot to perform autonomous exploration when the second distance meets a preset condition.
In a possible implementation manner, the data calculation module 320 may specifically include:
the included angle obtaining unit is used for obtaining a first included angle formed by a connecting line between each two adjacent first key points and the robot, wherein the two adjacent first key points are first key points collected by the robot at the same position;
and the calculating unit is used for calculating a second distance between the two adjacent first key points based on the first distances respectively corresponding to the two adjacent first key points and the first included angle formed by the connecting line between the two adjacent first key points and the robot.
In a possible implementation manner, the control module 330 may specifically include:
the judging unit is used for determining whether a third distance exists in each second distance, wherein the third distance is a second distance larger than a first preset threshold value;
and a first control unit, configured to control the robot to perform autonomous exploration if the third distance does not exist in the second distances.
In a possible implementation manner, the control module 330 may specifically further include:
a length calculating unit, configured to calculate, based on the second distances, a total length of the contour map formed by the first keypoints, if the third distance exists in the second distances;
and the second control unit is used for controlling the robot to perform autonomous exploration on the basis of the third distance and the total length of the contour map.
In a possible implementation manner, the second control unit may specifically be configured to:
calculating a ratio of the third distance to a total length of the contour map;
and if no ratio is greater than the second preset threshold, controlling the robot to perform autonomous exploration.
In a possible implementation manner, the data obtaining module 310 may specifically be configured to:
acquiring first information acquired by the robot, wherein the first information comprises second key points, the distance between the robot and the second key points and a second included angle formed by connecting lines between two adjacent second key points and the robot respectively, the second key points are positions where obstacles are located, detected by a first radar, of the robot at a first position in the preset area, the two adjacent second key points are two second key points with adjacent angle coordinates, and the angle coordinates of the second key points are the angle position relationship of the second key points relative to the robot;
determining whether a region which is not detected by the first radar exists in the preset region or not based on a second included angle between the two adjacent second key points;
if the preset area has an area which is not detected by the first radar, acquiring second information collected by the robot, wherein the second information is information of the undetected area which is collected by the robot at a second position and/or by a second radar which is arranged at a different height from the first radar;
determining a first distance between the robot and each first key point based on the first information and the second information;
if the preset area does not have the area which is not detected by the first radar, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key points comprise the second key points.
In a possible implementation manner, the data obtaining module 310 may be further specifically configured to:
judging whether the second information comprises a third key point, wherein the third key point is a position point where an obstacle in an area which is not detected by the first radar is located;
if the second information comprises the third key point, obtaining first distances between the robot and each first key point based on the distance between the robot and the second key point and the distance between the robot and the third key point, wherein the first key points comprise the second key point and the third key point, and the second information comprises the distance between the robot and the third key point;
if the second information does not include the third key point, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key points include the second key point.
In one possible implementation, the apparatus 300 further includes:
the area obtaining module is used for obtaining the area of the preset area collected by the robot;
the first result output module is used for controlling the robot not to perform autonomous exploration if the area of the preset area is larger than a third preset threshold;
and the second result output module is used for controlling the robot to perform autonomous exploration if the area of the preset area is smaller than or equal to the third preset threshold.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a terminal device. Referring to fig. 10, the terminal device 400 may include: at least one processor 410, a memory 420, and a computer program stored in the memory 420 and executable on the at least one processor 410, wherein the processor 410, when executing the computer program, implements the steps of any of the method embodiments described above, such as steps S101 to S103 in the embodiment shown in fig. 2. Alternatively, the processor 410, when executing the computer program, implements the functions of the modules/units in the above-described device embodiments, such as the functions of the modules 310 to 330 shown in fig. 9.
Illustratively, a computer program may be partitioned into one or more modules/units, which are stored in the memory 420 and executed by the processor 410 to accomplish the present application. The one or more modules/units may be a series of computer program segments capable of performing specific functions, which are used to describe the execution of the computer program in the terminal device 400.
Those skilled in the art will appreciate that fig. 10 is merely an example of a terminal device and is not limiting of terminal devices and may include more or fewer components than shown, or some components may be combined, or different components such as input output devices, network access devices, buses, etc.
The Processor 410 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 420 may be an internal storage unit of the terminal device, or may be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. The memory 420 is used for storing the computer programs and other programs and data required by the terminal device. The memory 420 may also be used to temporarily store data that has been output or is to be output.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The method for autonomously exploring the robot provided by the embodiment of the application can be applied to terminal equipment such as a computer, a tablet computer, a notebook computer, a netbook, a Personal Digital Assistant (PDA) and the like, and the embodiment of the application does not limit the specific type of the terminal equipment.
The present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the embodiments of the method for autonomous robot exploration.
The embodiment of the present application provides a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the embodiments of the method for autonomous robot exploration when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method for robot autonomous exploration, comprising:
acquiring a first distance between the robot and each first key point, wherein each first key point is a position point where an obstacle is located, and the position point is obtained when the robot scans the obstacle in a preset area;
calculating a second distance between every two adjacent first key points based on the first distances between the robot and the respective first key points, wherein the two adjacent first key points are two first key points with adjacent angular coordinates, and the angular coordinate of a first key point is the angular position relationship of the first key point relative to the robot;
and when the second distance meets a preset condition, controlling the robot to perform autonomous exploration.
2. The method for robot autonomous exploration according to claim 1, wherein said calculating a second distance between every two adjacent first key points based on the first distances between the robot and the respective first key points comprises:
acquiring a first included angle formed by the connecting lines between the two adjacent first key points and the robot respectively, wherein the two adjacent first key points are first key points acquired by the robot at the same position;
and calculating a second distance between the two adjacent first key points based on first distances respectively corresponding to the two adjacent first key points and a first included angle formed by connecting lines between the two adjacent first key points and the robot.
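By way of illustration only, the calculation in claims 1 and 2 can be read as an application of the law of cosines: the two first distances and the first included angle between the two robot-to-key-point connecting lines determine the second distance directly. The sketch below is not taken from the application; the function names, the radian convention and the pairing of adjacent key points are the editor's assumptions.

import math

def second_distance(d1, d2, included_angle_rad):
    # Distance between two angularly adjacent first key points, computed from
    # their first distances to the robot and the first included angle between
    # the two connecting lines (law of cosines); illustrative only.
    return math.sqrt(d1 * d1 + d2 * d2 - 2.0 * d1 * d2 * math.cos(included_angle_rad))

def all_second_distances(first_distances, included_angles):
    # first_distances[i] and first_distances[i + 1] are assumed to belong to
    # key points with adjacent angular coordinates; included_angles[i] is the
    # first included angle between them.
    return [second_distance(first_distances[i], first_distances[i + 1], included_angles[i])
            for i in range(len(first_distances) - 1)]

With the second distances in hand, the preset-condition checks described in claims 3 to 5 can then be applied to them.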
3. The method for robot autonomous exploration according to claim 1, wherein said controlling the robot to perform autonomous exploration when the second distance meets a preset condition comprises:
determining whether a third distance exists in the second distances, wherein the third distance is a second distance larger than a first preset threshold value;
and if the third distance does not exist in the second distances, controlling the robot to perform autonomous exploration.
4. The method for robot autonomous exploration according to claim 3, wherein after said determining whether a third distance exists in the second distances, the method further comprises:
if the third distance exists in the second distances, calculating the total length of the contour map formed by the first key points based on the second distances;
and when the third distance and the total length of the contour map meet preset requirements, controlling the robot to conduct autonomous exploration.
5. The method for robot autonomous exploration according to claim 4, wherein said controlling the robot to conduct autonomous exploration when the third distance and the total length of the contour map meet preset requirements comprises:
calculating the ratio of the third distance to the total length of the contour map;
and if no ratio larger than a second preset threshold exists among the ratios, controlling the robot to conduct autonomous exploration.
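As a non-limiting sketch of how the checks in claims 3 to 5 chain together, and assuming that the total length of the contour map is taken as the sum of the second distances (one possible reading of claim 4), the decision could look as follows; the function and parameter names are illustrative only:

def allow_autonomous_exploration(second_distances, first_threshold, second_threshold):
    # Claim 3: a "third distance" is any second distance larger than the first preset threshold.
    third_distances = [d for d in second_distances if d > first_threshold]
    if not third_distances:
        # No oversized gap between adjacent key points: explore autonomously.
        return True
    # Claim 4: total length of the contour map formed by the first key points,
    # assumed here to be the sum of all second distances.
    contour_length = sum(second_distances)
    # Claim 5: autonomous exploration is allowed only if no ratio of a third
    # distance to the total contour length exceeds the second preset threshold.
    return all(d / contour_length <= second_threshold for d in third_distances)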
6. The method for robot autonomous exploration according to claim 1, wherein said acquiring a first distance between the robot and each first key point comprises:
acquiring first information collected by the robot, wherein the first information comprises second key points, distances between the robot and the second key points, and a second included angle formed by the connecting lines between two adjacent second key points and the robot respectively; the second key points are position points of obstacles detected by a first radar of the robot at a first position in the preset area; the two adjacent second key points are two second key points with adjacent angular coordinates; and the angular coordinate of a second key point is the angular position relationship of the second key point relative to the robot;
determining, based on the second included angle between the two adjacent second key points, whether a region not detected by the first radar exists in the preset area;
if a region not detected by the first radar exists in the preset area, acquiring second information collected by the robot, wherein the second information is information of the undetected region collected by the robot at a second position and/or by a second radar arranged at a height different from that of the first radar;
determining a first distance between the robot and each first key point based on the first information and the second information;
if no region not detected by the first radar exists in the preset area, obtaining a first distance between the robot and each first key point based on the distances between the robot and the second key points, wherein the first key points comprise the second key points.
7. The method for robot autonomous exploration according to claim 6, wherein said determining a first distance between the robot and each first key point based on the first information and the second information comprises:
determining whether the second information comprises a third key point, wherein the third key point is a position point of an obstacle located in the region not detected by the first radar;
if the second information comprises the third key point, obtaining first distances between the robot and each first key point based on the distance between the robot and the second key point and the distance between the robot and the third key point, wherein the first key points comprise the second key point and the third key point, and the second information comprises the distance between the robot and the third key point;
if the second information does not include the third key point, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key points include the second key point.
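For claims 6 and 7, one possible reading is sketched below: an unusually large second included angle between angularly adjacent second key points signals a region the first radar did not cover, and the first key points are then the second key points plus any third key points contributed by the second information. The gap threshold, field names and data layout are the editor's assumptions, not limitations of the claims.

def has_undetected_region(second_included_angles, gap_threshold_rad):
    # Claim 6: flag a region not detected by the first radar when the included
    # angle between two adjacent second key points exceeds a chosen threshold.
    return any(angle > gap_threshold_rad for angle in second_included_angles)

def collect_first_key_points(second_key_points, second_information=None):
    # Claims 6 and 7: merge the second key points with any third key points
    # (obstacle points inside the previously undetected region) found in the
    # second information gathered at a second position and/or by a second radar.
    third_key_points = []
    if second_information is not None:
        third_key_points = second_information.get("third_key_points", [])
    return list(second_key_points) + list(third_key_points)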
8. The method for robot autonomous exploration according to claim 1, further comprising:
acquiring the area of the preset area acquired by the robot;
if the area of the preset area is larger than a third preset threshold value, controlling the robot to perform non-autonomous exploration;
and if the area of the preset area is smaller than or equal to the third preset threshold, controlling the robot to perform autonomous exploration.
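Claim 8 gates the behaviour on the size of the preset area; a short sketch with illustrative names:

def choose_exploration_mode(preset_area, third_threshold):
    # Claim 8: a preset area larger than the third preset threshold is handled
    # by non-autonomous exploration; otherwise the robot explores autonomously.
    return "non-autonomous" if preset_area > third_threshold else "autonomous"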
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the method for robot autonomous exploration according to any one of claims 1 to 8.
10. A computer-readable storage medium in which a computer program is stored, wherein the computer program, when executed by a processor, implements the method for robot autonomous exploration according to any one of claims 1 to 8.
CN202110998158.2A 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium Active CN113741446B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110998158.2A CN113741446B (en) 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium
PCT/CN2021/139343 WO2023024347A1 (en) 2021-08-27 2021-12-17 Autonomous exploration method for robot, and terminal device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110998158.2A CN113741446B (en) 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113741446A true CN113741446A (en) 2021-12-03
CN113741446B CN113741446B (en) 2024-04-16

Family

ID=78733543

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110998158.2A Active CN113741446B (en) 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium

Country Status (2)

Country Link
CN (1) CN113741446B (en)
WO (1) WO2023024347A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114847809A (en) * 2022-07-07 2022-08-05 深圳市云鼠科技开发有限公司 Environment exploration method and device for cleaning robot, cleaning robot and medium
WO2023024347A1 (en) * 2021-08-27 2023-03-02 深圳市优必选科技股份有限公司 Autonomous exploration method for robot, and terminal device and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109464074A (en) * 2018-11-29 2019-03-15 深圳市银星智能科技股份有限公司 Area division method, subarea cleaning method and robot thereof
CN109744945A (en) * 2017-11-08 2019-05-14 杭州萤石网络有限公司 A kind of area attribute determines method, apparatus, system and electronic equipment
CN110448241A (en) * 2019-07-18 2019-11-15 广东宝乐机器人股份有限公司 The stranded detection of robot and method of getting rid of poverty
CN111104933A (en) * 2020-03-20 2020-05-05 深圳飞科机器人有限公司 Map processing method, mobile robot, and computer-readable storage medium
CN111198378A (en) * 2019-12-27 2020-05-26 深圳市优必选科技股份有限公司 Boundary-based autonomous exploration method and device
CN111329398A (en) * 2020-03-27 2020-06-26 上海高仙自动化科技发展有限公司 Robot control method, robot, electronic device, and readable storage medium
CN111638526A (en) * 2020-05-20 2020-09-08 电子科技大学 Method for robot to automatically build graph in strange environment
CN111904346A (en) * 2020-07-09 2020-11-10 深圳拓邦股份有限公司 Method and device for getting rid of difficulties of sweeping robot, computer equipment and storage medium
US20210109537A1 (en) * 2019-10-09 2021-04-15 Wuhan University Autonomous exploration framework for indoor mobile robotics using reduced approximated generalized voronoi graph
US20210131822A1 (en) * 2017-09-12 2021-05-06 RobArt GmbH Exploration of an unknown environment by an autonomous mobile robot
WO2021114989A1 (en) * 2019-12-13 2021-06-17 苏州宝时得电动工具有限公司 Autonomous robot and control method thereof, and computer storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104375505B (en) * 2014-10-08 2017-02-15 北京联合大学 Robot automatic road finding method based on laser ranging
CN104460666B (en) * 2014-10-27 2017-05-10 上海理工大学 Robot autonomous obstacle avoidance moving control method based on distance vectors
DE102017121127A1 (en) * 2017-09-12 2019-03-14 RobArt GmbH Exploration of an unknown environment by an autonomous mobile robot
CN108983777B (en) * 2018-07-23 2021-04-06 浙江工业大学 Autonomous exploration and obstacle avoidance method based on self-adaptive front exploration target point selection
CN110936383B (en) * 2019-12-20 2022-11-18 上海有个机器人有限公司 Obstacle avoiding method, medium, terminal and device for robot
CN113741446B (en) * 2021-08-27 2024-04-16 深圳市优必选科技股份有限公司 Robot autonomous exploration method, terminal equipment and storage medium

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210131822A1 (en) * 2017-09-12 2021-05-06 RobArt GmbH Exploration of an unknown environment by an autonomous mobile robot
CN109744945A (en) * 2017-11-08 2019-05-14 杭州萤石网络有限公司 A kind of area attribute determines method, apparatus, system and electronic equipment
CN109464074A (en) * 2018-11-29 2019-03-15 深圳市银星智能科技股份有限公司 Area division method, subarea cleaning method and robot thereof
CN110448241A (en) * 2019-07-18 2019-11-15 广东宝乐机器人股份有限公司 The stranded detection of robot and method of getting rid of poverty
US20210109537A1 (en) * 2019-10-09 2021-04-15 Wuhan University Autonomous exploration framework for indoor mobile robotics using reduced approximated generalized voronoi graph
WO2021114989A1 (en) * 2019-12-13 2021-06-17 苏州宝时得电动工具有限公司 Autonomous robot and control method thereof, and computer storage medium
CN113064408A (en) * 2019-12-13 2021-07-02 苏州宝时得电动工具有限公司 Autonomous robot, control method thereof, and computer storage medium
CN111198378A (en) * 2019-12-27 2020-05-26 深圳市优必选科技股份有限公司 Boundary-based autonomous exploration method and device
CN111104933A (en) * 2020-03-20 2020-05-05 深圳飞科机器人有限公司 Map processing method, mobile robot, and computer-readable storage medium
CN111329398A (en) * 2020-03-27 2020-06-26 上海高仙自动化科技发展有限公司 Robot control method, robot, electronic device, and readable storage medium
CN111638526A (en) * 2020-05-20 2020-09-08 电子科技大学 Method for robot to automatically build graph in strange environment
CN111904346A (en) * 2020-07-09 2020-11-10 深圳拓邦股份有限公司 Method and device for getting rid of difficulties of sweeping robot, computer equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023024347A1 (en) * 2021-08-27 2023-03-02 深圳市优必选科技股份有限公司 Autonomous exploration method for robot, and terminal device and storage medium
CN114847809A (en) * 2022-07-07 2022-08-05 深圳市云鼠科技开发有限公司 Environment exploration method and device for cleaning robot, cleaning robot and medium
CN114847809B (en) * 2022-07-07 2022-09-20 深圳市云鼠科技开发有限公司 Environment exploration method and device for cleaning robot, cleaning robot and medium

Also Published As

Publication number Publication date
WO2023024347A1 (en) 2023-03-02
CN113741446B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
US11422261B2 (en) Robot relocalization method and apparatus and robot using the same
CN109974727B (en) Robot charging method and device and robot
WO2020207190A1 (en) Three-dimensional information determination method, three-dimensional information determination device, and terminal apparatus
CN109978925B (en) Robot pose recognition method and robot thereof
CN110442120B (en) Method for controlling robot to move in different scenes, robot and terminal equipment
CN110471409B (en) Robot inspection method and device, computer readable storage medium and robot
CN113741446B (en) Robot autonomous exploration method, terminal equipment and storage medium
CN112528831B (en) Multi-target attitude estimation method, multi-target attitude estimation device and terminal equipment
CN109739223B (en) Robot obstacle avoidance control method and device, terminal device and storage medium
CN111814752B (en) Indoor positioning realization method, server, intelligent mobile device and storage medium
WO2022073427A1 (en) Visual positioning method and apparatus for object grabbing point, and storage medium and electronic device
JP6562530B1 (en) Charging stand recognition method, apparatus and robot
CN109884639B (en) Obstacle detection method and device for mobile robot
CN109828250B (en) Radar calibration method, calibration device and terminal equipment
CN110597249B (en) Robot and recharging positioning method and device thereof
WO2021254376A1 (en) Transport robot control method and device, transport robot, and storage medium
CN110263713A (en) Method for detecting lane lines, device, electronic equipment and storage medium
WO2021103065A1 (en) Charging pile positioning method and apparatus for sweeping robot
CN109685764B (en) Product positioning method and device and terminal equipment
CN112198878A (en) Instant map construction method and device, robot and storage medium
CN113095227A (en) Robot positioning method and device, electronic equipment and storage medium
CN111220988B (en) Map data processing method, device, equipment and medium
CN110062169A (en) Image pickup method, filming apparatus, terminal device and computer readable storage medium
CN102833671A (en) Method and system for positioning robot vision
JP2023503750A (en) ROBOT POSITIONING METHOD AND DEVICE, DEVICE, STORAGE MEDIUM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant