CN113741446B - Robot autonomous exploration method, terminal equipment and storage medium - Google Patents


Info

Publication number
CN113741446B
Authority
CN
China
Prior art keywords
robot
distance
key point
key
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110998158.2A
Other languages
Chinese (zh)
Other versions
CN113741446A (en)
Inventor
黄高波
毕占甲
赵广超
黄祥斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ubtech Technology Co ltd
Original Assignee
Shenzhen Ubtech Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ubtech Technology Co ltd filed Critical Shenzhen Ubtech Technology Co ltd
Priority to CN202110998158.2A priority Critical patent/CN113741446B/en
Publication of CN113741446A publication Critical patent/CN113741446A/en
Priority to PCT/CN2021/139343 priority patent/WO2023024347A1/en
Application granted granted Critical
Publication of CN113741446B publication Critical patent/CN113741446B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231: using optical position detecting means
    • G05D1/0246: using a video camera in combination with image processing means
    • G05D1/0212: with means for defining a desired trajectory
    • G05D1/0214: with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221: with means for defining a desired trajectory involving a learning process
    • G05D1/0255: using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257: using a radar
    • G05D1/0276: using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Acoustics & Sound (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application is applicable to the technical field of robots and provides a robot autonomous exploration method, terminal equipment and a storage medium. The method comprises: acquiring first distances between a robot and each first key point, each first key point being a position point of an obstacle obtained when the robot scans obstacles in a preset area; calculating second distances between every two adjacent first key points based on the first distances between the robot and each first key point, two adjacent first key points being two first key points with adjacent angular coordinates, where the angular coordinate of a first key point is the angular position of that first key point relative to the robot; and determining, based on the second distances, whether autonomous exploration by the robot is feasible.

Description

Robot autonomous exploration method, terminal equipment and storage medium
Technical Field
The application belongs to the technical field of robots, and particularly relates to a method for autonomous exploration of a robot, terminal equipment and a storage medium.
Background
As robotics develops, autonomous exploration by robots faces new challenges. Autonomous exploration means that a robot, placed in an unknown environment, determines its next movement target through its on-board sensors.
Because environments vary widely, some are unsuitable for autonomous exploration, and a robot that explores autonomously in an unsuitable environment is easily damaged by environmental factors. How to determine, after a robot enters an area, whether it can explore that area autonomously is therefore a problem to be solved at present.
Disclosure of Invention
The embodiments of the present application provide a robot autonomous exploration method, apparatus, terminal equipment and storage medium, which address the current inability to judge whether autonomous exploration by a robot is feasible.
In a first aspect, an embodiment of the present application provides a method for autonomous exploration by a robot, including:
acquiring a first distance between a robot and each first key point, wherein each first key point is a position point of an obstacle obtained when the robot scans the obstacle in a preset area;
calculating a second distance between every two adjacent first key points based on the first distance between the robot and each first key point, wherein the two adjacent first key points are two first key points with adjacent angle coordinates, and the angle coordinates of the first key points are the angular position relation of the first key points relative to the robot;
And when the second distance meets the preset condition, controlling the robot to perform autonomous exploration.
In a second aspect, an embodiment of the present application provides an apparatus for autonomous exploration by a robot, including:
the data acquisition module is used for acquiring first distances between the robot and each first key point, wherein each first key point is a position point where an obstacle is located, wherein the position point is obtained when the robot scans the obstacle in a preset area;
the data calculation module is used for calculating a second distance between every two adjacent first key points based on the first distance between the robot and each first key point, wherein the two adjacent first key points are two first key points with adjacent angle coordinates, and the angle coordinates of the first key points are the angular position relation of the first key points relative to the robot;
and the control module is used for controlling the robot to perform autonomous exploration when the second distance meets the preset condition.
In a third aspect, an embodiment of the present application provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method of autonomous exploration of the robot of any of the above first aspects when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program, where the computer program when executed by a processor implements the method for autonomous exploration by a robot according to any of the first aspects above.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on a terminal device, causes the terminal device to perform the method of autonomous exploration by a robot according to any of the first aspects above.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Compared with the prior art, the embodiments of the present application have the following beneficial effects: first distances between the robot and each first key point are acquired, second distances between every two adjacent first key points are then calculated from the first distances, and the robot is controlled to perform autonomous exploration when the second distances satisfy a preset condition. By calculating the distance between every two adjacent first key points, the application can determine whether the area where the robot is located contains an unsealed region, and hence whether the robot can explore autonomously. This provides a feasibility analysis for autonomous exploration, ensures that the robot explores only in areas suited to autonomous exploration, and reduces damage to the robot caused by environmental factors.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is an application scenario schematic diagram of a method for autonomous exploration by a robot according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for autonomous exploration by a robot according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a relationship between a robot and a first key point according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a first key point in a polar coordinate system according to an embodiment of the present application;
FIG. 5 is a flowchart of a method for determining autonomous exploration feasibility based on a second distance according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a second distance and a third distance among the distances of the first keypoints provided in an embodiment of the application;
FIG. 7 is a flowchart illustrating a method for determining a first distance according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of acquiring a first distance from a first set of key points using a first radar and a second radar according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a device for autonomous exploration by a robot according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted in context as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
With the adoption of disinfection robots, most are currently used in closed indoor environments. Owing to the variety of environments, however, many disinfection robots are also deployed in open environments such as squares and venues. Some open environments are not suitable for autonomous exploration by a disinfection robot, for example environments containing downward steps. Because of its high center of gravity, long inertial braking distance and large sensor blind zones, the disinfection robot cannot rely on collision-based exploration or cliff-edge detection, so it may be damaged if it performs autonomous exploration in an unsuitable environment.
Fig. 1 is an application scenario schematic diagram of a method for autonomous exploration of a robot, which is provided in an embodiment of the present application, and the method for determining feasibility of autonomous exploration of a robot may be used for feasibility analysis of autonomous exploration of a robot. Wherein the robot 10 is used for detecting the distance of the robot from an obstacle in the environment. The processor 20 is configured to acquire each distance acquired by the robot 10, and analyze each distance to determine whether the robot can perform autonomous exploration in the environment.
The method for autonomous exploration of a robot according to the embodiment of the present application will be described in detail with reference to fig. 1.
Fig. 2 shows a schematic flow chart of a method for autonomous exploration of a robot provided by the present application, which may be implemented in a processor in the robot, or in a processor external to the robot. Referring to fig. 2, the method is described in detail as follows:
s101, obtaining first distances between the robot and each first key point, wherein each first key point is a position point of an obstacle obtained when the robot scans the obstacle in a preset area.
In this embodiment, the preset area may be set as needed, and an obstacle is provided in the preset area, for example, the obstacle may include a wall, a table, a chair, and the like.
In this embodiment, after the robot enters the preset area, the range radar on the robot scans the preset area for 360 degrees to obtain the distance between the robot and each first key point, which is referred to as the first distance in this application.
Specifically, the ranging radar transmits ranging signals to the periphery at preset angle intervals, for example, each interval angle θ transmits a ranging signal, the ranging signal is reflected back after reaching an obstacle, and the radar determines the distance between the radar and the obstacle, that is, the distance between the robot and the obstacle according to the time of the received reflected signal. The location on the obstacle where the radar-transmitted ranging signal is received is the key point, denoted herein as the first key point. There may be multiple first keypoints on the same obstacle. Hundreds of ranging signals can be sent out by the ranging radar in one circle of scanning, and the number of specific ranging signals can be set according to the needs.
As an example, as shown in fig. 3, the robot transmits ranging signals at point A; a, b, c and d are first key points obtained by the robot through the ranging radar. The distance from each first key point to point A is a first distance: from a to A it is l1, from b to A it is l2, from c to A it is l3, and from d to A it is l4.
In this embodiment, if there is no obstacle in a certain direction within the preset area, the ranging signal sent by the ranging radar never reaches an obstacle and no reflected signal is generated; that direction therefore yields neither a first key point nor a first distance.
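The scanning procedure of step S101 (one ranging signal per angular interval, with missing echoes yielding no key point) can be sketched as follows. The function name, the 5-degree interval and the 10 m maximum range are illustrative assumptions, not values fixed by the patent:

```python
def scan_first_keypoints(ranges, angle_step_deg=5.0, max_range=10.0):
    """Convert one 360-degree sweep of the ranging radar into first key points.

    ranges: one measured distance per ranging signal; None (or a value beyond
    max_range) means no echo was received, so no first key point exists at
    that angle.
    Returns a list of (signal_number, angle_deg, first_distance) tuples.
    """
    keypoints = []
    for i, r in enumerate(ranges):
        if r is None or r > max_range:
            continue  # no obstacle hit: no first key point in this direction
        keypoints.append((i, i * angle_step_deg, r))
    return keypoints

# 72 signals at 5-degree intervals; two directions receive no echo
ranges = [3.0] * 72
ranges[10] = None
ranges[11] = None
pts = scan_first_keypoints(ranges)
print(len(pts))  # 70: the two open directions produce no key point
```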
S102, calculating a second distance between every two adjacent first key points based on the first distance between the robot and each first key point, wherein the two adjacent first key points are two first key points with adjacent angle coordinates, and the angle coordinates of the first key points are the angle position relation of the first key points relative to the robot.
In this embodiment, when the robot scans the obstacle in the preset area to obtain the first key point, the first key point on the obtained obstacle has different angular positions with respect to the robot because the positions of the obstacle with respect to the robot are different.
Specifically, a reference direction can be set to establish a polar coordinate system, and the origin (0 point) of the polar coordinate system is the position of the robot. The angle coordinates of the first key points are included angles between the direction in which the first key points are located and the reference direction, namely, the angle coordinates of the first key points are angle values in the polar coordinates of the first key points, and the first distance between the robot and the first key points is a distance value in the polar coordinates. Two adjacent first keypoints may be determined based on the angular coordinates of the respective first keypoints.
By way of example, a polar coordinate system with the robot at the origin is shown in FIG. 4. The point Q is the origin and the horizontal axis R is the reference direction. The polar coordinate of the first key point S1 is (r1, θ1) and that of the first key point S2 is (r2, θ2). Here r1 is the distance between the first key point S1 and the robot, and θ1 is the included angle between the line connecting S1 with the point Q and the reference direction, i.e. the angular coordinate of S1. Likewise, r2 is the distance between the first key point S2 and the robot, and θ2 is the included angle between the line connecting S2 with the point Q and the reference direction, i.e. the angular coordinate of S2. No other key point lies between S1 and S2; therefore, S1 and S2 are two adjacent first key points.
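The polar-coordinate bookkeeping described above can be sketched in code. `polar_coordinate` and `adjacent_pairs` are hypothetical helper names, and the reference direction is assumed to be the positive x-axis:

```python
import math

def polar_coordinate(robot_xy, point_xy):
    """Polar coordinate (r, theta) of a key point relative to the robot,
    with theta measured from the positive x-axis (assumed reference
    direction) and normalised to [0, 360)."""
    dx = point_xy[0] - robot_xy[0]
    dy = point_xy[1] - robot_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx)) % 360.0

def adjacent_pairs(keypoints):
    """Sort (r, theta) key points by angular coordinate and pair each
    with its angular neighbour, wrapping from the last back to the first."""
    ordered = sorted(keypoints, key=lambda p: p[1])
    return [(ordered[i], ordered[(i + 1) % len(ordered)])
            for i in range(len(ordered))]

# The two points with neighbouring angular coordinates end up paired
pairs = adjacent_pairs([(2.0, 30.0), (2.5, 35.0), (1.0, 350.0)])
```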
And S103, controlling the robot to perform autonomous exploration when the second distance meets a preset condition.
In this embodiment, when the second distance does not satisfy the preset condition, the robot is controlled not to perform autonomous exploration.
Specifically, if a second distance is large, it is determined that there is no obstacle between the two corresponding first key points, i.e. there may be an unsealed area at that position. An unsealed area is an area the robot cannot detect during autonomous exploration, and it may damage the robot. Therefore, to ensure the robot's safety, if an unsealed area exists in the preset area, the robot may be unsuitable for autonomous exploration there; otherwise, it is determined that the robot can perform autonomous exploration in the preset area.
If the robot is determined to be unsuitable for autonomous exploration in the preset area, the robot can display preset information, wherein the preset information is used for prompting a user that the area is unsuitable for autonomous exploration of the robot.
The method comprises the steps of firstly obtaining first distances between a robot and each first key point, then calculating second distances between every two adjacent first key points based on the first distances, and finally determining whether the robot can explore autonomously based on the second distances; according to the method and the device, whether the area where the robot is located is an unsealed area or not can be determined by calculating the distance between every two adjacent first key points, and whether the robot can perform autonomous exploration or not can be further determined; the method and the device realize the feasible analysis of autonomous exploration of the robot, can ensure that the robot performs autonomous exploration in an area suitable for autonomous exploration, and reduce the damage of the robot caused by environmental factors.
In one possible implementation, the implementation procedure of step S102 may include:
s1021, obtaining a first included angle formed by connecting each two adjacent first key points with the robot respectively, wherein the two adjacent first key points are first key points acquired by the robot at the same position.
In this embodiment, the included angle between two adjacent first key points may be determined according to the difference between the angular coordinates of the two first key points.
In this embodiment, since the first key points may be collected by the robot at different positions, in order to ensure accuracy of calculation, adjacent first key points are key points collected by the robot at the same position. The first key points collected by the robot at different positions cannot be recorded as adjacent first key points. For convenience of description, a first included angle formed by connecting two adjacent first key points with the robot is hereinafter referred to as a first included angle between two adjacent first key points.
S1022, calculating a second distance between the two adjacent first key points based on the first distances respectively corresponding to the two adjacent first key points and a first included angle formed by connecting the two adjacent first key points with the robot.
Specifically, the second distance can be calculated by the law of cosines: l(i,j) = √(Di² + Dj² − 2·Di·Dj·cos γ), where l(i,j) is the second distance between the adjacent first key points i and j, Di is the first distance between the robot and the first key point i, Dj is the first distance between the robot and the first key point j, and γ is the first included angle between the first key points i and j.
Optionally, if the ranging radar sends its ranging signals at an equidistant angle, for example one ranging signal every 5 degrees, the ranging radar sends 72 ranging signals over 360 degrees. Each ranging signal is numbered in a fixed direction, each first key point corresponds to the number of one ranging signal, and the included angle between two adjacent first key points can be determined from the signal numbers. For example, the ranging signals may be numbered counterclockwise: the ranging signal emitted at 0 degrees is numbered 1, the one emitted at 5 degrees is numbered 2, and so on.
Specifically, the first included angle between two adjacent first key points may be calculated as γ = (dj − di) × θ, where γ is the first included angle between the adjacent first key points i and j, dj is the number corresponding to the first key point j, di is the number corresponding to the first key point i, and θ is the angular interval of the ranging signals transmitted by the ranging radar, that is, the angular difference between two adjacent ranging signals.
According to the method and the device, the included angle between the two adjacent first key points can be determined, the distance between each first key point and the robot can be determined, and the distance between the two adjacent first key points can be obtained rapidly, simply, conveniently and accurately based on the information.
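Under the assumption that the second distance follows from the two first distances and the first included angle by the law of cosines, as the variable definitions above suggest, a minimal sketch is:

```python
import math

def second_distance(d_i, d_j, gamma_deg):
    """Distance between two adjacent first key points i and j, from their
    first distances to the robot and the first included angle gamma
    (law of cosines)."""
    g = math.radians(gamma_deg)
    return math.sqrt(d_i * d_i + d_j * d_j - 2.0 * d_i * d_j * math.cos(g))

def included_angle(num_i, num_j, theta_deg):
    """First included angle from the ranging-signal numbers and the fixed
    angular interval theta between consecutive signals."""
    return (num_j - num_i) * theta_deg

# Two key points 2 m from the robot, one signal (5 degrees) apart
gamma = included_angle(1, 2, 5.0)
print(round(second_distance(2.0, 2.0, gamma), 4))  # 0.1745
```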
As shown in fig. 5, in one possible implementation, the implementation procedure of step S103 may include:
s1031, determining whether third distances exist in the second distances, wherein the third distances are second distances larger than a first preset threshold value.
In this embodiment, the preset condition may include that the third distance does not exist in the second distance.
In this embodiment, the first preset threshold may be set as needed.
In this embodiment, the non-closed area in the preset area is one of factors affecting the autonomous exploration of the robot, and if there is a larger non-closed area in the preset area, the safety of the robot may be affected, so if there is a larger non-closed area in the preset area, the robot is not suitable for autonomous exploration, otherwise, autonomous exploration may be performed.
Specifically, the method for determining whether the larger unsealed area exists in the preset area may be to determine whether a third distance exists in each second distance, and whether the larger unsealed area exists between two adjacent first key points may be determined by determining whether the third distance exists.
S1032, if the third distance does not exist in the second distances, controlling the robot to perform autonomous exploration.
In this embodiment, if the third distance does not exist in the second distances, it is determined that a larger area which is unfavorable for the robot to perform autonomous exploration does not exist in the preset area, and the robot may perform autonomous exploration in the preset area.
And S1033, if the third distances exist in the second distances, calculating the total length of the profile formed by the first key points based on the second distances.
In this embodiment, if the third distance exists in each second distance, it is determined that a larger unsealed area exists in the preset area, and it is further necessary to determine whether the unsealed area affects autonomous exploration of the robot.
Specifically, when the third distance exists in each second distance, the total length of the profile composed of each first key point is one of factors affecting the determination of the autonomous exploration feasibility, so the total length of the profile composed of each first key point needs to be calculated first.
Specifically, the total length of the profile formed by the first key points may be the sum of the second distances.
And S1034, controlling the robot to perform autonomous exploration when the third distance and the total length of the profile diagram meet preset requirements.
Specifically, whether the unsealed area at the third distance has an influence on the autonomous exploration of the robot can be determined according to the ratio of the third distance in the profile formed by the first key points, if so, the autonomous exploration of the robot cannot be performed, otherwise, the autonomous exploration can be performed.
In this embodiment, the preset condition may include that a third distance exists in the second distance, and the third distance and the total length of the profile meet preset requirements.
Specifically, the implementation procedure of step S1034 may include:
a ratio of the third distance to the total length of the profile is calculated. And if the ratio is larger than a second preset threshold value, controlling the robot to perform non-autonomous exploration. And if the ratio greater than the second preset threshold value does not exist in the ratios, determining that the robot performs autonomous exploration.
In this embodiment, the preset requirement may include that the ratio of the third distance to the total length of the profile is not greater than the second preset threshold.
In this embodiment, the second preset threshold may be set as needed. If the ratio of a third distance to the total length of the profile is greater than the second preset threshold, it is determined that the unsealed area at that third distance is large; such a large unsealed area can affect autonomous exploration, so the robot is not suitable for autonomous exploration. If the ratio of the third distance to the total length of the profile is not greater than the second preset threshold, it is determined that the unsealed area at the third distance is small; such a small unsealed area is insufficient to affect autonomous exploration, so the robot can perform autonomous exploration.
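As an illustrative sketch (not the patent's actual implementation, with function names and threshold values chosen for illustration), steps S1031 to S1034 can be summarized as follows: the third distances are the second distances exceeding a first preset threshold, the profile length is the sum of all second distances, and autonomous exploration is feasible only if every third-distance-to-profile-length ratio stays at or below a second preset threshold.

```python
def exploration_feasible(second_distances, first_threshold=1.0, second_threshold=0.2):
    """Decide feasibility from the gaps (third distances) in the scanned profile.

    second_distances: distances between adjacent first key points (metres).
    first_threshold:  gap length above which a segment counts as a third distance.
    second_threshold: maximum allowed ratio of one gap to the total profile length.
    """
    third_distances = [d for d in second_distances if d > first_threshold]
    if not third_distances:                 # S1032: no large unsealed segment
        return True
    total_length = sum(second_distances)    # S1033: profile length = sum of second distances
    # S1034: infeasible if any single gap takes up too much of the profile
    return all(d / total_length <= second_threshold for d in third_distances)

# Profile of mostly short segments plus one 1.5 m gap out of 11.5 m total:
# the single ratio 1.5 / 11.5 is about 0.13, below the 0.2 threshold
print(exploration_feasible([0.5] * 20 + [1.5]))  # → True
```

A profile dominated by one long gap (e.g. a 5 m opening on a 5.5 m profile) would instead return False, matching the "large unsealed area" branch above.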
As an example, as shown in fig. 6, point O is the position of the robot, each line connected to point O represents a ranging signal sent by the ranging radar, point O is the starting point of each ranging signal, and the end point of each ranging signal represents a first key point. The dashed-line portions L1 to L5 in the figure represent the third distances. The sum of the solid-line portions and the dashed-line portions represents the total length of the profile composed of the first key points.
As shown in fig. 7, in one possible implementation, the implementation procedure of step S101 may include:
S1011, acquiring first information acquired by the robot, wherein the first information comprises second key points, a distance between the robot and each second key point, and a second included angle formed by the connecting lines between two adjacent second key points and the robot, the second key points are position points where obstacles detected by a first radar are located when the robot is at a first position in the preset area, the two adjacent second key points are two second key points with adjacent angle coordinates, and the angle coordinate of a second key point is the angular position relation of the second key point relative to the robot.
In this embodiment, since the top radar is generally disposed at the center of the robot and has a wider detection range, the robot can collect the first information using the ranging radar on the top; this top ranging radar is referred to as the first radar in this application.
Specifically, the first radar transmits ranging signals, marks the position of each obstacle that reflects a ranging signal as a second key point, and then determines the distance between the robot and each second key point from the reflected signal.
In this embodiment, the method for determining the two adjacent second key points is similar to the method for determining the two adjacent first key points, please refer to the description of step S102, and the description is omitted here.
And S1012, determining whether an area which is not detected by the first radar exists in the preset area or not based on a second included angle between the two adjacent second key points.
In this embodiment, the preset area may contain both taller and shorter obstacles. When the height of an obstacle is lower or higher than that of the first radar, the ranging signal transmitted by the first radar will not reach the obstacle, so the first radar cannot detect it and the obstacle may be missed. Alternatively, when the first radar is far from an obstacle, the ranging signal it transmits cannot reach the distant obstacle, so the first radar likewise cannot detect it. Therefore, after the first information is acquired by the first radar, whether an area not detected by the first radar exists in the preset area can be determined according to the first information.
Specifically, if the second included angle between two adjacent second key points is larger than a preset included angle, it is determined that an area not detected by the first radar exists in the preset area. The undetected area is the area between the two second key points corresponding to the second included angle larger than the preset included angle. The value of the preset included angle can be set as needed.
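A minimal sketch of this angle-gap test in step S1012 (the angle coordinates, the 10° preset included angle, and the return format are illustrative assumptions, not prescribed by the patent):

```python
def undetected_sectors(angle_coords_deg, preset_angle_deg=10.0):
    """Return (start, end) angle pairs between which the first radar saw nothing.

    angle_coords_deg: angle coordinates of the second key points, in degrees.
    preset_angle_deg: the preset included angle; a larger second included angle
                      between two adjacent second key points marks an undetected area.
    """
    angles = sorted(a % 360.0 for a in angle_coords_deg)
    gaps = []
    # Pair each key point with its angular neighbour, wrapping past 360°
    for a, b in zip(angles, angles[1:] + [angles[0] + 360.0]):
        if b - a > preset_angle_deg:
            gaps.append((a, b % 360.0))
    return gaps

# Key points every 5° except a 30° hole between 90° and 120°
coords = [a for a in range(0, 360, 5) if not 90 < a < 120]
print(undetected_sectors(coords))  # → [(90.0, 120.0)]
```

The returned sectors are exactly the areas between the two second key points whose second included angle exceeds the preset included angle, which the robot then re-examines in step S1013.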
S1013, if an area which is not detected by the first radar exists in the preset area, acquiring second information acquired by the robot, wherein the second information is information of the area which is not detected and is acquired by the robot at a second position and/or by adopting a second radar, and the second radar and the first radar are arranged at different heights of the robot.
In this embodiment, if an area not detected by the first radar exists in the preset area, the robot may be moved to shorten the distance between the robot and the undetected area, and/or a radar at a height position different from that of the first radar may be used to collect information of the undetected area; the collected information is recorded as the second information. The radar whose height position differs from that of the first radar is referred to herein as the second radar.
S1014, determining a first distance between the robot and each first key point based on the first information and the second information.
Specifically, the implementation procedure of step S1014 may include:
S10141, judging whether the second information includes a third key point, wherein the third key point is a position point where an obstacle in the area not detected by the first radar is located.
In this embodiment, it is necessary to determine whether the second information includes a key point of an obstacle in the area not detected by the first radar; such a key point is referred to as a third key point in this application. Whether an obstacle exists in the undetected area can then be determined according to whether a third key point is present.
S10142, if the second information includes the third key point, obtaining a first distance between the robot and each first key point based on a distance between the robot and the second key point and a distance between the robot and the third key point, wherein the first key point includes the second key point and the third key point, and the second information includes a distance between the robot and the third key point.
In this embodiment, if a third key point exists in the second information, it is determined that an obstacle in the area not detected by the first radar has been detected using the second radar, and the second key points and the third key points may be taken together as the first key points.
As an example, as shown in fig. 8, position O1 is the first position of the robot, and T1 to Tn are the second key points acquired by the robot at the first position using the first radar. O2 is the second position of the robot, and Y1 to Y4 are the third key points acquired by the robot at the second position using the second radar. When projected onto the same plane, Y1 and T1 may be coincident or closely spaced points, as may Y4 and Tn.
S10143, if the third key point is not included in the second information, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key point includes the second key point.
In this embodiment, if the third key point is not included in the second information, it is determined that the second radar did not detect any obstacle in the area the first radar missed. It can therefore be concluded that no obstacle exists in that area, and the first key points and the first distances between the robot and the first key points may be obtained directly from the information detected by the first radar.
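The branch logic of steps S10141 to S10143 can be sketched as follows (the dictionary representation mapping angle coordinates to distances is an illustrative assumption; the patent does not prescribe a data structure):

```python
def merge_key_points(second_kps, third_kps):
    """Form the first key points from the two radar scans (S10141-S10143).

    second_kps: {angle: distance} pairs detected by the first (top) radar.
    third_kps:  {angle: distance} pairs the second radar detected inside the
                area the first radar missed; empty if no obstacle was found there.
    Returns the merged {angle: distance} map used as the first key points.
    """
    if third_kps:                      # S10142: blind-area obstacles exist,
        merged = dict(second_kps)      # so second and third key points are
        merged.update(third_kps)       # taken together as the first key points
        return merged
    return dict(second_kps)            # S10143: first radar data alone suffices

t = {0: 2.0, 45: 2.5, 180: 3.0}        # second key points (first radar)
y = {90: 1.8, 100: 1.7}                # third key points (second radar)
print(sorted(merge_key_points(t, y)))  # → [0, 45, 90, 100, 180]
```

When `third_kps` is empty, the function simply returns the second key points, matching step S10143 (and step S1015 when no undetected area exists at all).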
S1015, if the area which is not detected by the first radar does not exist in the preset area, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key points comprise the second key points.
In the embodiment of the application, the first key points at the positions of obstacles in the preset area are determined from information acquired by the robot multiple times, so as to obtain the distances between the robot and the first key points. Through repeated acquisition, the obstacles in the preset area can be detected more thoroughly, ensuring that the obtained first key points are more comprehensive.
It should be noted that, in the process of determining the first key points, the depth camera and the ultrasonic sensor on the robot may also be used. The depth camera is used for acquiring a depth image of the preset area, and the ultrasonic sensor is used for detecting the positions of obstacles in the preset area. All information acquired by the first radar, the second radar, the depth camera and the ultrasonic sensor is fused using an information fusion technology to obtain the first key points and the first distances between the robot and the first key points.
In one possible implementation manner, the robot may further collect an area of the preset area when collecting the first distance, and determine feasibility of autonomous exploration of the robot according to the collected area of the preset area.
Specifically, the method may further include:
s201, acquiring the area of the preset area acquired by the robot.
In this embodiment, the robot may detect the area of the preset area through the ground area detection module in the process of acquiring the first distance.
S202, if the area of the preset area is larger than a third preset threshold value, determining that the autonomous exploration of the robot is not feasible.
In this embodiment, the third preset threshold may be set as needed, for example, the third preset threshold may be set to 800 square meters, 900 square meters, 950 square meters, or the like.
And S203, if the area of the preset area is smaller than or equal to the third preset threshold value, determining that the autonomous exploration of the robot is feasible.
In the embodiment of the application, the exploration efficiency of autonomous exploration in a large area is very low, so a large area is unfavorable for autonomous exploration of the robot. After the robot enters the preset area, the area of the preset area can be acquired by walking within it. If the area of the preset area is larger than the third preset threshold, it is determined that autonomous exploration of the robot is not feasible; otherwise, it is determined that autonomous exploration of the robot is feasible. Analyzing the area of the preset area provides a simple and convenient judging method for autonomous exploration of the robot.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Corresponding to the method for determining the autonomous exploration feasibility of the robot described in the above embodiments, fig. 9 shows a block diagram of the apparatus for determining the autonomous exploration feasibility of the robot provided in the embodiment of the present application, and for convenience of explanation, only the portions related to the embodiment of the present application are shown.
Referring to fig. 9, the apparatus 300 may include: a data acquisition module 310, a data calculation module 320, and a control module 330.
The data obtaining module 310 is configured to obtain a first distance between the robot and each first key point, where each first key point is a position point of an obstacle obtained when the robot scans for obstacles in a preset area.
The data calculation module 320 is configured to calculate, based on the first distances between the robot and the first key points, a second distance between every two adjacent first key points, where the two adjacent first key points are two first key points with adjacent angular coordinates, and the angular coordinate of a first key point is the angular positional relation of the first key point relative to the robot.
And the control module 330 is configured to control the robot to perform autonomous exploration when the second distance meets a preset condition.
In one possible implementation, the data calculation module 320 may specifically include:
the included angle obtaining unit is used for obtaining a first included angle formed by connecting each two adjacent first key points with the robot respectively, wherein the two adjacent first key points are first key points acquired by the robot at the same position;
and the calculating unit is used for calculating a second distance between two adjacent first key points based on the first distances respectively corresponding to the two adjacent first key points and a first included angle formed by connecting lines between the two adjacent first key points and the robot.
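The calculating unit's geometry can be sketched as follows. The patent excerpt does not spell out the formula, but given the two first distances and the first included angle between the two robot-to-key-point lines, the natural relation is the law of cosines; this formulation is an assumption for illustration only.

```python
import math

def second_distance(d1, d2, included_angle_deg):
    """Distance between two adjacent first key points, from their first
    distances to the robot and the first included angle between the two
    connecting lines (law of cosines; assumed formulation)."""
    theta = math.radians(included_angle_deg)
    return math.sqrt(d1 * d1 + d2 * d2 - 2.0 * d1 * d2 * math.cos(theta))

# Two key points 3 m and 4 m from the robot, 90° apart: a 3-4-5 triangle
print(second_distance(3.0, 4.0, 90.0))  # → 5.0
```

Applied to every pair of adjacent first key points, this yields the second distances that the control module 330 then tests against the preset condition.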
In one possible implementation, the control module 330 may specifically include:
the judging unit is used for determining whether third distances exist in the second distances or not, and the third distances are second distances larger than a first preset threshold value;
and the first control unit is used for controlling the robot to perform autonomous exploration if the third distance does not exist in the second distances.
In one possible implementation, the control module 330 may specifically further include:
A length calculating unit, configured to calculate, based on the respective second distances, a total length of the profile formed by the respective first key points if the third distances exist in the respective second distances;
and the second control unit is used for controlling the robot to perform autonomous exploration based on the third distance and the total length of the profile.
In one possible implementation, the second control unit may be specifically configured to:
calculating the ratio of the third distance to the total length of the profile;
and if no ratio greater than the second preset threshold exists among the ratios, controlling the robot to perform autonomous exploration.
In one possible implementation, the data acquisition module 310 may be specifically configured to:
acquiring first information acquired by the robot, wherein the first information comprises second key points, a distance between the robot and each second key point, and a second included angle formed by the connecting lines between two adjacent second key points and the robot, the second key points are position points where obstacles detected by a first radar are located when the robot is at a first position in the preset area, the two adjacent second key points are two second key points with adjacent angle coordinates, and the angle coordinate of a second key point is the angular position relation of the second key point relative to the robot;
Determining whether an area which is not detected by the first radar exists in the preset area or not based on a second included angle between the two adjacent second key points;
if the area which is not detected by the first radar exists in the preset area, acquiring second information acquired by the robot, wherein the second information is information of the area which is not detected and acquired by the robot at a second position and/or by adopting a second radar, and the second radar and the first radar are arranged at different heights of the robot;
determining a first distance between the robot and each first key point based on the first information and the second information;
and if the area which is not detected by the first radar does not exist in the preset area, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key points comprise the second key points.
In one possible implementation, the data acquisition module 310 may be further configured to:
judging whether the second information comprises a third key point or not, wherein the third key point is a position point of an obstacle in an area which is not detected by the first radar;
If the second information includes the third key point, obtaining a first distance between the robot and each first key point based on a distance between the robot and the second key point and a distance between the robot and the third key point, wherein the first key point includes the second key point and the third key point, and the second information includes a distance between the robot and the third key point;
and if the third key point is not included in the second information, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key point comprises the second key point.
In one possible implementation, the apparatus 300 further includes:
the area obtaining module is used for obtaining the area of the preset area acquired by the robot;
the first result output module is used for controlling the robot not to perform autonomous exploration if the area of the preset area is larger than a third preset threshold value;
and the second result output module is used for controlling the robot to perform autonomous exploration if the area of the preset area is smaller than or equal to the third preset threshold value.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The embodiment of the present application further provides a terminal device. Referring to fig. 10, the terminal device 400 may include: at least one processor 410, a memory 420, and a computer program stored in the memory 420 and executable on the at least one processor 410. When executing the computer program, the processor 410 performs the steps of any of the method embodiments described above, such as steps S101 to S103 in the embodiment shown in fig. 2. Alternatively, when executing the computer program, the processor 410 may perform the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules 310 to 330 shown in fig. 9.
By way of example, a computer program may be partitioned into one or more modules/units that are stored in memory 420 and executed by processor 410 to complete the present application. The one or more modules/units may be a series of computer program segments capable of performing specific functions for describing the execution of the computer program in the terminal device 400.
It will be appreciated by those skilled in the art that fig. 10 is merely an example of a terminal device and is not limiting of the terminal device and may include more or fewer components than shown, or may combine certain components, or different components, such as input-output devices, network access devices, buses, etc.
The processor 410 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 420 may be an internal storage unit of the terminal device, or may be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like. The memory 420 is used for storing the computer program as well as other programs and data required by the terminal device. The memory 420 may also be used to temporarily store data that has been output or is to be output.
The bus may be an industry standard architecture (Industry Standard Architecture, ISA) bus, an external device interconnect (Peripheral Component, PCI) bus, or an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, among others. The buses may be divided into address buses, data buses, control buses, etc. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
The method for autonomous exploration of the robot provided by the embodiment of the application can be applied to terminal equipment such as computers, tablet computers, notebook computers, netbooks, personal digital assistants (personal digital assistant, PDA) and the like, and the embodiment of the application does not limit the specific types of the terminal equipment.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the embodiments of the method for autonomous exploration by a robot described above.
Embodiments of the present application provide a computer program product that, when run on a mobile terminal, causes the mobile terminal to perform the steps in the various embodiments of the method that enable autonomous exploration by a robot as described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application implements all or part of the flow of the method of the above embodiments, and may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, where the computer program, when executed by a processor, may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing device/terminal apparatus, recording medium, computer Memory, read-Only Memory (ROM), random access Memory (RAM, random Access Memory), electrical carrier signals, telecommunications signals, and software distribution media. Such as a U-disk, removable hard disk, magnetic or optical disk, etc. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. A method for autonomous exploration by a robot, comprising:
acquiring a first distance between a robot and each first key point, wherein each first key point is a position point of an obstacle obtained when the robot scans the obstacle in a preset area;
Calculating a second distance between every two adjacent first key points based on the first distance between the robot and each first key point, wherein the two adjacent first key points are two first key points with adjacent angle coordinates, and the angle coordinates of the first key points are the angular position relation of the first key points relative to the robot;
when the second distance meets a preset condition, controlling the robot to perform autonomous exploration;
when the second distance meets a preset condition, controlling the robot to perform autonomous exploration, including:
determining whether third distances exist in the second distances or not, wherein the third distances are the second distances larger than a first preset threshold value;
and if the third distance does not exist in the second distances, controlling the robot to perform autonomous exploration.
2. The method of autonomous exploration by a robot of claim 1, wherein said calculating a second distance between each adjacent two first keypoints based on first distances of said robot from said respective first keypoints comprises:
acquiring a first included angle formed by connecting each two adjacent first key points with the robot respectively, wherein the two adjacent first key points are first key points acquired by the robot at the same position;
And calculating a second distance between two adjacent first key points based on the first distances respectively corresponding to the two adjacent first key points and a first included angle formed by connecting lines between the two adjacent first key points and the robot.
3. The method of autonomous exploration of a robot of claim 1, wherein after the determining whether a third distance exists among the respective second distances, the method further comprises:
if the third distances exist in the second distances, calculating the total length of the profile formed by the first key points based on the second distances;
and when the third distance and the total length of the profile diagram meet preset requirements, controlling the robot to perform autonomous exploration.
4. The method for autonomous exploration by a robot according to claim 3, wherein controlling the robot to perform autonomous exploration when the third distance and the total length of the profile meet preset requirements comprises:
calculating the ratio of the third distance to the total length of the profile;
and if no ratio greater than the second preset threshold exists among the ratios, controlling the robot to perform autonomous exploration.
5. The method for autonomous exploration by a robot of claim 1, wherein said obtaining a first distance of the robot from each first keypoint comprises:
acquiring first information acquired by the robot, wherein the first information comprises second key points, a distance between the robot and each second key point, and a second included angle formed by the connecting lines between two adjacent second key points and the robot, the second key points are position points where obstacles detected by a first radar are located when the robot is at a first position in the preset area, the two adjacent second key points are two second key points with adjacent angle coordinates, and the angle coordinate of a second key point is the angular position relation of the second key point relative to the robot;
determining whether an area which is not detected by the first radar exists in the preset area or not based on a second included angle between the two adjacent second key points;
if the area which is not detected by the first radar exists in the preset area, acquiring second information acquired by the robot, wherein the second information is information of the area which is not detected and acquired by the robot at a second position and/or by adopting a second radar, and the second radar and the first radar are arranged at different heights of the robot;
Determining a first distance between the robot and each first key point based on the first information and the second information;
and if the area which is not detected by the first radar does not exist in the preset area, obtaining a first distance between the robot and each first key point based on the distance between the robot and the second key point, wherein the first key points comprise the second key points.
6. The robot autonomous exploration method of claim 5, wherein said determining a first distance between the robot and each first key point based on the first information and the second information comprises:
judging whether the second information includes a third key point, the third key point being a position point of an obstacle in the area not detected by the first radar;
if the second information includes the third key point, obtaining the first distance between the robot and each first key point from the distance between the robot and each second key point and the distance between the robot and the third key point, the first key points comprising the second key points and the third key point, and the second information comprising the distance between the robot and the third key point;
and if the second information does not include the third key point, obtaining the first distance between the robot and each first key point from the distance between the robot and each second key point, the first key points comprising the second key points.
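The branching in this claim amounts to merging the two distance sets when third key points exist, and passing the second-key-point distances through otherwise. A minimal sketch, assuming key points are identified by ids mapped to distances (all names are hypothetical):

```python
def merge_first_distances(second_kp_distances, third_kp_distances=None):
    """Claim 6 (sketch): the first key points are the second key points plus,
    when the second radar sees obstacles in the undetected area, the third
    key points; their distance maps are merged accordingly."""
    first = dict(second_kp_distances)
    if third_kp_distances:  # second information contains third key points
        first.update(third_kp_distances)
    return first
```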
7. The robot autonomous exploration method of claim 1, further comprising:
acquiring the area of the preset area as measured by the robot;
if the area of the preset area is larger than a third preset threshold, controlling the robot to perform non-autonomous exploration;
and if the area of the preset area is smaller than or equal to the third preset threshold, controlling the robot to perform autonomous exploration.
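The area-based mode selection in this claim reduces to a single threshold comparison; a sketch (function name and string labels are assumptions, not from the patent):

```python
def choose_exploration_mode(preset_area, area_threshold):
    """Claim 7 (sketch): fall back to non-autonomous exploration when the
    measured area of the preset region exceeds the third preset threshold."""
    return "non-autonomous" if preset_area > area_threshold else "autonomous"
```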
8. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the robot autonomous exploration method according to any one of claims 1 to 7.
9. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the robot autonomous exploration method according to any one of claims 1 to 7.
CN202110998158.2A 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium Active CN113741446B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110998158.2A CN113741446B (en) 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium
PCT/CN2021/139343 WO2023024347A1 (en) 2021-08-27 2021-12-17 Autonomous exploration method for robot, and terminal device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110998158.2A CN113741446B (en) 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113741446A CN113741446A (en) 2021-12-03
CN113741446B true CN113741446B (en) 2024-04-16

Family

ID=78733543

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110998158.2A Active CN113741446B (en) 2021-08-27 2021-08-27 Robot autonomous exploration method, terminal equipment and storage medium

Country Status (2)

Country Link
CN (1) CN113741446B (en)
WO (1) WO2023024347A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113741446B (en) * 2021-08-27 2024-04-16 深圳市优必选科技股份有限公司 Robot autonomous exploration method, terminal equipment and storage medium
CN114847809B (en) * 2022-07-07 2022-09-20 深圳市云鼠科技开发有限公司 Environment exploration method and device for cleaning robot, cleaning robot and medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109464074A (en) * 2018-11-29 2019-03-15 深圳市银星智能科技股份有限公司 Area division method, subarea cleaning method and robot thereof
CN109744945A (en) * 2017-11-08 2019-05-14 杭州萤石网络有限公司 A kind of area attribute determines method, apparatus, system and electronic equipment
CN110448241A (en) * 2019-07-18 2019-11-15 广东宝乐机器人股份有限公司 The stranded detection of robot and method of getting rid of poverty
CN111104933A (en) * 2020-03-20 2020-05-05 深圳飞科机器人有限公司 Map processing method, mobile robot, and computer-readable storage medium
CN111198378A (en) * 2019-12-27 2020-05-26 深圳市优必选科技股份有限公司 Boundary-based autonomous exploration method and device
CN111329398A (en) * 2020-03-27 2020-06-26 上海高仙自动化科技发展有限公司 Robot control method, robot, electronic device, and readable storage medium
CN111638526A (en) * 2020-05-20 2020-09-08 电子科技大学 Method for robot to automatically build graph in strange environment
CN111904346A (en) * 2020-07-09 2020-11-10 深圳拓邦股份有限公司 Method and device for getting rid of difficulties of sweeping robot, computer equipment and storage medium
WO2021114989A1 (en) * 2019-12-13 2021-06-17 苏州宝时得电动工具有限公司 Autonomous robot and control method thereof, and computer storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104375505B (en) * 2014-10-08 2017-02-15 北京联合大学 Robot automatic road finding method based on laser ranging
CN104460666B (en) * 2014-10-27 2017-05-10 上海理工大学 Robot autonomous obstacle avoidance moving control method based on distance vectors
DE102017121127A1 (en) * 2017-09-12 2019-03-14 RobArt GmbH Exploration of an unknown environment by an autonomous mobile robot
JP2020533720A (en) * 2017-09-12 2020-11-19 ロブアート ゲーエムベーハーROBART GmbH Exploring an unknown environment with an autonomous mobile robot
CN108983777B (en) * 2018-07-23 2021-04-06 浙江工业大学 Autonomous exploration and obstacle avoidance method based on self-adaptive front exploration target point selection
CN110703747B (en) * 2019-10-09 2021-08-03 武汉大学 Robot autonomous exploration method based on simplified generalized Voronoi diagram
CN110936383B (en) * 2019-12-20 2022-11-18 上海有个机器人有限公司 Obstacle avoiding method, medium, terminal and device for robot
CN113741446B (en) * 2021-08-27 2024-04-16 深圳市优必选科技股份有限公司 Robot autonomous exploration method, terminal equipment and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109744945A (en) * 2017-11-08 2019-05-14 杭州萤石网络有限公司 A kind of area attribute determines method, apparatus, system and electronic equipment
CN109464074A (en) * 2018-11-29 2019-03-15 深圳市银星智能科技股份有限公司 Area division method, subarea cleaning method and robot thereof
CN110448241A (en) * 2019-07-18 2019-11-15 广东宝乐机器人股份有限公司 The stranded detection of robot and method of getting rid of poverty
WO2021114989A1 (en) * 2019-12-13 2021-06-17 苏州宝时得电动工具有限公司 Autonomous robot and control method thereof, and computer storage medium
CN113064408A (en) * 2019-12-13 2021-07-02 苏州宝时得电动工具有限公司 Autonomous robot, control method thereof, and computer storage medium
CN111198378A (en) * 2019-12-27 2020-05-26 深圳市优必选科技股份有限公司 Boundary-based autonomous exploration method and device
CN111104933A (en) * 2020-03-20 2020-05-05 深圳飞科机器人有限公司 Map processing method, mobile robot, and computer-readable storage medium
CN111329398A (en) * 2020-03-27 2020-06-26 上海高仙自动化科技发展有限公司 Robot control method, robot, electronic device, and readable storage medium
CN111638526A (en) * 2020-05-20 2020-09-08 电子科技大学 Method for robot to automatically build graph in strange environment
CN111904346A (en) * 2020-07-09 2020-11-10 深圳拓邦股份有限公司 Method and device for getting rid of difficulties of sweeping robot, computer equipment and storage medium

Also Published As

Publication number Publication date
WO2023024347A1 (en) 2023-03-02
CN113741446A (en) 2021-12-03

Similar Documents

Publication Publication Date Title
US20210063577A1 (en) Robot relocalization method and apparatus and robot using the same
CN113741446B (en) Robot autonomous exploration method, terminal equipment and storage medium
CN110442120B (en) Method for controlling robot to move in different scenes, robot and terminal equipment
WO2020207190A1 (en) Three-dimensional information determination method, three-dimensional information determination device, and terminal apparatus
CN110363076B (en) Personnel information association method and device and terminal equipment
CN111814752B (en) Indoor positioning realization method, server, intelligent mobile device and storage medium
CN112528831B (en) Multi-target attitude estimation method, multi-target attitude estimation device and terminal equipment
CN110471409B (en) Robot inspection method and device, computer readable storage medium and robot
CN109828250B (en) Radar calibration method, calibration device and terminal equipment
CN109884639B (en) Obstacle detection method and device for mobile robot
CN111612841A (en) Target positioning method and device, mobile robot and readable storage medium
CN111381586A (en) Robot and movement control method and device thereof
US20200209365A1 (en) Laser data calibration method and robot using the same
CN110850859B (en) Robot and obstacle avoidance method and obstacle avoidance system thereof
US11002842B2 (en) Method and apparatus for determining the location of a static object
CN112802358B (en) Parking space navigation method and device based on artificial intelligence, terminal equipment and medium
KR102441276B1 (en) Method, device and system for managing and providing safety information of structure based on mobile terminal
CN112198878B (en) Instant map construction method and device, robot and storage medium
CN113095227A (en) Robot positioning method and device, electronic equipment and storage medium
US20210008730A1 (en) Pose determining method for mobile robot and apparatus and mobile robot thereof
CN110062169A (en) Image pickup method, filming apparatus, terminal device and computer readable storage medium
CN112689842B (en) Target detection method and device
CN109788431B (en) Bluetooth positioning method, device, equipment and system based on adjacent node group
CN105653101B (en) Touch point sensing method and optical touch system
CN109831737B (en) Bluetooth positioning method, device, equipment and system based on confidence degree

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant