CN114153200A - Trajectory prediction and self-moving equipment control method - Google Patents

Trajectory prediction and self-moving equipment control method

Info

Publication number
CN114153200A
Authority
CN
China
Prior art keywords
moving
moving object
target
motion
self
Prior art date
Legal status
Pending
Application number
CN202111248186.9A
Other languages
Chinese (zh)
Inventor
郭振兴
Current Assignee
Ecovacs Robotics Suzhou Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to CN202111248186.9A
Publication of CN114153200A

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas

Abstract

Embodiments of the present application provide a trajectory prediction and self-moving device control method. In some embodiments of the present application, a self-moving device detects a moving object present in its surrounding environment and the motion trajectory of that object, and predicts the object's target motion direction within a certain future time in combination with the detected trajectory. If the self-moving device is located in that target motion direction, it would obstruct the moving object; the self-moving device therefore actively avoids the object moving toward it, which improves the working performance and degree of intelligence of the self-moving device.

Description

Trajectory prediction and self-moving equipment control method
Statement
This application is a divisional application of application No. 201811261238.4, filed on 2018.10.26 and entitled "Method, equipment and storage medium for avoiding collision".
Technical Field
The present application relates to the technical field of artificial intelligence, and in particular to a trajectory prediction and self-moving device control method.
Background
When an obstacle lies on the traveling path of a robot during its work, the robot should actively avoid the obstacle so as to better perform its task.
At present, the obstacle avoidance function of a sweeping robot is generally implemented by distance sensors, such as infrared, laser, and ultrasonic sensors, working together with a spring bumper: after the distance sensors detect an obstacle ahead, or the spring bumper touches the obstacle, the robot turns back or goes around according to an obstacle-avoidance control instruction.
Disclosure of Invention
Various aspects of the present application provide trajectory prediction and self-moving device control methods, which are used to actively avoid objects moving toward the self-moving device and to improve the working performance and degree of intelligence of the self-moving device.
An embodiment of the present application provides an avoidance method applicable to a self-moving device, the method comprising the following steps:
detecting a moving object present in the surrounding environment of the self-moving device and the motion trajectory of the moving object;
predicting a target motion direction of the moving object within a certain future time in combination with the detected motion trajectory of the moving object;
and, if the self-moving device is located in the target motion direction, performing avoidance processing for the moving object.
An embodiment of the present application further provides a self-moving device, including: a machine body, one or more processors, and one or more memories storing computer programs;
the one or more processors being configured to execute the computer programs to:
detect a moving object present in the surrounding environment of the self-moving device and the motion trajectory of the moving object;
predict a target motion direction of the moving object within a certain future time in combination with the detected motion trajectory of the moving object;
and, if the self-moving device is located in the target motion direction, perform avoidance processing for the moving object.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the one or more processors to perform actions comprising:
detecting a moving object present in the surrounding environment of the self-moving device and the motion trajectory of the moving object;
predicting a target motion direction of the moving object within a certain future time in combination with the detected motion trajectory of the moving object;
and, if the self-moving device is located in the target motion direction, performing avoidance processing for the moving object.
In some embodiments of the present application, the self-moving device detects a moving object present in its surrounding environment and the motion trajectory of that object, and predicts the object's target motion direction within a certain future time in combination with the detected trajectory. If the self-moving device is located in that target motion direction, it would obstruct the moving object; the self-moving device therefore actively avoids the object moving toward it, which improves the working performance and degree of intelligence of the self-moving device.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
Fig. 1 is a flowchart of an avoidance method according to an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of a target movement direction of a person at a future time and a relationship with a self-moving device provided by an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a target movement direction of a person at a future time in relation to a self-moving device according to yet another exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of a target movement direction of a person at a future time in relation to a self-moving device according to yet another exemplary embodiment of the present application;
FIG. 5 is a schematic flow chart diagram illustrating a method of avoidance provided in accordance with yet another exemplary embodiment of the present application;
FIG. 6 is a schematic flow chart diagram illustrating a method of avoidance provided in accordance with yet another exemplary embodiment of the present application;
FIG. 7 is an illustration of an application scenario of the avoidance method of the present application;
FIG. 8 is an illustration of another application scenario of the method of avoidance of the present application;
FIG. 9 is an illustration of another application scenario of the method of avoidance of the present application;
fig. 10 is a block diagram of a self-moving device according to an exemplary embodiment of the present application;
fig. 11 is a block diagram of a robot according to an exemplary embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
At present, a robot usually performs tasks in a working environment with a fixed layout and can avoid obstacles according to a constructed map; however, the robot may stop in a place that obstructs people and thereby interfere with their walking.
In view of the above problems, in some embodiments of the present application, a self-moving device detects a moving object present in its surrounding environment and the motion trajectory of that object, and predicts the object's target motion direction within a certain future time in combination with the detected trajectory. If the self-moving device is located in that target motion direction, it would obstruct the moving object; the self-moving device therefore actively avoids the object moving toward it, which improves the working performance and degree of intelligence of the self-moving device.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the embodiments of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and plural terms generally mean at least two but do not exclude the case of at least one, unless the context clearly indicates otherwise.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a product or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a product or system. Without further limitation, an element preceded by "comprising a ..." does not preclude the presence of other identical elements in the product or system that comprises that element.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an avoidance method according to an exemplary embodiment of the present application. As shown in fig. 1, the method includes:
S101: detecting a moving object present in the surrounding environment of the self-moving device and the motion trajectory of the moving object;
S102: predicting a target motion direction of the moving object within a certain future time in combination with the detected motion trajectory of the moving object;
S103: if the self-moving device is located in the target motion direction, performing avoidance processing for the moving object.
In this embodiment, the type of the self-moving device is not limited; it may be an unmanned aerial vehicle, an unmanned vehicle, a robot, or the like, and the specific type of robot, unmanned vehicle, or unmanned aerial vehicle is likewise not limited. Besides its basic service function, the self-moving device has computing, communication, and networking capabilities. When the self-moving device is a robot, its basic service function may differ according to the application scenario, and the robot may be a sweeping robot, a following robot, a greeting robot, and so on. For example, for a sweeping robot applied to scenes such as homes, office buildings, and shopping malls, the basic service function is to sweep the floor in the scene; for a following robot, the basic service function is to follow a target object; and for a greeting robot, the basic service function is to welcome customers and guide them to their destination.
In this embodiment, various types of sensors, for example vision, laser, infrared, and ultrasonic sensors, are mounted on the self-moving device to collect information about the surrounding environment. The self-moving device collects surrounding environment information through these sensors and, based on the collected information and in combination with object recognition technology, can detect moving objects present in its surroundings and their motion trajectories. Generally, a moving object detected here is at some distance from the self-moving device.
In this embodiment, after the self-moving device detects that a moving object exists in its surroundings, it further predicts the object's motion direction within a certain future time from the object's motion trajectory and judges whether the self-moving device is located in that predicted direction. If it is, the self-moving device would obstruct the moving object, so it actively avoids the object to ensure that the object can continue moving normally along its own trajectory. Conversely, if the self-moving device is not located in the predicted direction, it does not obstruct the moving object, no active avoidance is needed, the self-moving device can continue performing its task at its current position, and the moving object can move normally along its own trajectory.
It should be noted that, in this embodiment, the target motion direction of the moving object within a certain future time represents the object's motion trend over the next period, that is, mainly its overall motion direction during that period. Of course, if the moving object's direction of motion is the same at every location point in the next period, the predicted direction may also be its instantaneous direction of motion.
In addition, in this embodiment, the length of the "certain future time" is not specifically limited. It may be a fixed value preset according to application requirements, for example 5 seconds, 10 seconds, 20 seconds, or 1 minute. Alternatively, it may be the time the moving object is estimated to need to reach the self-moving device, calculated from the moving object's speed and the distance between the moving object and the self-moving device when the object is detected. The self-moving device may measure that distance with a laser radar, or may calculate it from images acquired by a vision sensor.
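As an illustration of the second option, the look-ahead window can be derived from the moving object's observed speed and its current distance to the self-moving device. The following is a minimal Python sketch; the function name, the default window, and the upper cap are assumptions made for illustration and are not taken from the patent.

```python
def estimate_lookahead_seconds(distance_m: float, speed_mps: float,
                               default_s: float = 10.0, max_s: float = 60.0) -> float:
    """Estimate how far into the future to predict: roughly the time the
    moving object would need to reach the self-moving device."""
    if speed_mps <= 0.0:
        # Object is stationary or receding; fall back to a fixed window.
        return default_s
    return min(distance_m / speed_mps, max_s)
```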
In the above or following embodiments, optionally, a vision sensor such as a camera is provided on the self-moving device. In an optional embodiment, the self-moving device uses the vision sensor to detect moving objects in its surroundings as follows: in a coordinate system established by the self-moving device, the device first continuously acquires environment images of its surroundings through the vision sensor and, in combination with image recognition technology, identifies an object appearing in the successively acquired images as a target object present in its surroundings. It can then analyze whether the target object is in a moving state, for example whether it is a walking person.
Optionally, if the field of view of the vision sensor is fixed, whether the target object is a moving object can be determined by analyzing the image position of the target object in the successively acquired environment images, which also achieves continuous tracking of the target object. If the position coordinates of the target object change across the successively acquired images, the target object is determined to be a moving object; otherwise, it is determined to be a non-moving object.
Optionally, if the field of view of the vision sensor changes, the self-moving device may record the field of view used when each environment image was acquired and, combining it with the image position of the target object in the corresponding image, calculate the target object's position coordinates in the environment around the self-moving device, which likewise achieves continuous tracking. If the position of the target object in the surrounding environment changes, the target object is determined to be a moving object; otherwise, it is determined to be a non-moving object.
Further, once the target object is determined to be a moving object, a motion trajectory comprising the coordinates of the position points the moving object passes in sequence can be determined for it from the changes in its position coordinates.
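A minimal sketch of this tracking logic is given below, assuming the target's positions have already been converted into the coordinate frame of the self-moving device. The class name and the small noise threshold used to separate genuine motion from detection jitter are assumptions, not details taken from the patent.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in the self-moving device's coordinate frame


class TargetTracker:
    """Accumulates position fixes of one detected target and decides whether
    the target is a moving object; the fixes themselves form its trajectory."""

    def __init__(self, noise_threshold_m: float = 0.05):
        self.noise_threshold_m = noise_threshold_m  # assumed jitter tolerance
        self.trajectory: List[Point] = []

    def add_position(self, point: Point) -> None:
        self.trajectory.append(point)

    def is_moving(self) -> bool:
        if len(self.trajectory) < 2:
            return False
        (x0, y0), (x1, y1) = self.trajectory[0], self.trajectory[-1]
        return math.hypot(x1 - x0, y1 - y0) > self.noise_threshold_m
```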
Optionally, before the target motion direction of the moving object within a certain future time is predicted, it is judged from the detected motion trajectory whether the moving object has entered a preset distance range of the self-moving device. If so, the motion trajectory of the moving object within that preset distance range is acquired and the object's target motion direction within a certain future time is predicted; otherwise, no prediction is performed. The preset distance range is set in advance by the user, is not limited here, and can be adjusted to suit the specific scene.
For example, suppose it is preset that once a moving object comes within 2 meters of the self-moving device, its motion trajectory is collected and its target motion direction within a certain future time is predicted from the collected trajectory. The self-moving device acquires environment images of its surroundings through its vision sensor; when a moving object is detected, its position coordinates in the surrounding environment are calculated from its position coordinates in the image, and the distance between the moving object and the self-moving device is then computed. Only when the moving object is detected to have entered the 2-meter range does the device start collecting the object's subsequent trajectory. This improves the utilization of the device's CPU resources, raises its data processing speed, and allows more accurate and reasonable results to be output quickly.
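The 2-meter gate in this example might look like the sketch below; the threshold value comes from the example above, while the function and parameter names are illustrative assumptions.

```python
import math
from typing import Tuple

PRESET_RANGE_M = 2.0  # example value from the text; configurable in practice


def within_preset_range(object_xy: Tuple[float, float],
                        device_xy: Tuple[float, float] = (0.0, 0.0),
                        preset_range_m: float = PRESET_RANGE_M) -> bool:
    """Return True once the moving object has entered the preset range,
    at which point trajectory collection and prediction start."""
    dx = object_xy[0] - device_xy[0]
    dy = object_xy[1] - device_xy[1]
    return math.hypot(dx, dy) <= preset_range_m
```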
in the above or below-described embodiment, optionally, the detected motion trajectory of the moving object includes coordinates of a plurality of position points through which the moving object passes in sequence. Based on this, an optional way of predicting the target movement direction of the moving object within a certain time in the future by combining the detected movement trajectory of the moving object includes that an angle direction calculated according to coordinates of adjacent position points in a plurality of position points through which the moving object sequentially passes is taken as the movement direction of the moving object sequentially passing through each position point; and predicting the target motion direction of the moving object within a certain time in the future according to the motion direction of the moving object passing through each position point in sequence.
In this embodiment, the number of the position points and the intervals between the position points included in the detected motion trajectory of the moving object are not limited, and may be adaptively set according to application requirements. Several ways of setting the parameters associated with the location points are described below:
the method comprises the steps of setting time and time intervals from the mobile device according to time, collecting an environment image of the surrounding environment, calculating position coordinates of the mobile object in the surrounding environment of the mobile device according to the position coordinates of the mobile object in the environment image, using the position coordinates as a position point on a motion track of the mobile object, and forming the motion track of the mobile object by a plurality of sequentially detected position points. It is obvious that the acquisition of the environment images may also be performed at different time intervals, so that the intervals between the detected position points of the moving object may possibly be different.
In the second way, the position points are set by the moving object's travel distance: environment images of the surroundings are acquired continuously and, using the mapping between the moving object's pixel displacement across two images and its actual travel distance, a number of images in which the moving object has moved by a preset distance are selected; the moving object's position coordinates in the environment around the self-moving device are then calculated for each selected image and taken as position points that form its motion trajectory. Obviously, the images may also be selected at different travel distances, in which case the spacing between the detected position points may differ.
In the coordinate system established by the robot, the motion direction of the moving object at a given position point can be calculated from the coordinates of two adjacent position points. Here, the motion direction when the moving object passes each position point is an instantaneous motion direction, and it may be computed using either the preceding or the following position point. Taking the nth position point as an example, the position coordinates (xn, yn) of the nth point and the position coordinates (xn-1, yn-1) of the (n-1)th point can be used to calculate the motion direction of the moving object when passing the nth point, with the formula tn = atan2(yn - yn-1, xn - xn-1), where tn is the motion direction at the nth position point. Alternatively, the position coordinates (xn, yn) of the nth point and the position coordinates (xn+1, yn+1) of the (n+1)th point can be used, with the formula tn = atan2(yn+1 - yn, xn+1 - xn), where tn is again the motion direction at the nth position point.
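The per-point direction therefore reduces to one atan2 call per pair of adjacent points. The sketch below implements the backward-difference form of the formula above; the function name is an assumption.

```python
import math
from typing import List, Tuple


def headings_from_trajectory(points: List[Tuple[float, float]]) -> List[float]:
    """Motion direction t_n (radians) at each point, computed from its predecessor:
    t_n = atan2(y_n - y_{n-1}, x_n - x_{n-1}).
    (The forward variant would use point n+1 instead of n-1.)"""
    headings = []
    for (x_prev, y_prev), (x_n, y_n) in zip(points, points[1:]):
        headings.append(math.atan2(y_n - y_prev, x_n - x_prev))
    return headings
```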
In the above embodiment, the target moving direction of the moving object within a certain time in the future is predicted according to the moving direction of the moving object passing through each position point in sequence, which may include, but is not limited to, the following two implementation manners:
mode A: and predicting the target motion direction of the moving object within a certain time in the future by the motion direction of the moving object passing through each position point in sequence.
Mode B: and predicting the target motion direction of the moving object within a certain time in the future by combining the target object to which the moving object needs to go and the motion direction of the moving object when the moving object sequentially passes through each position point. In this embodiment, the target motion direction of the moving object in a future period is predicted by combining the target object to which the moving object needs to go, so that the accuracy and reliability of the prediction of the target motion direction of the moving object are improved.
In the above mode a, the target movement direction of the moving object within a certain time in the future is predicted mainly according to the movement direction of the moving object passing through each position point in sequence, so that the implementation is relatively simple, and the method is suitable for the situation that some surrounding environments are relatively simple. For example, a person freely walks in an open field, the robot starts to collect a section of walking track of the person when the person is at a certain distance from the robot, and the target movement direction of the person in a certain time in the future is predicted according to the collected walking track of the person; the sweeping robot cleans the ground according to a set route, when the accompanying robot keeps a certain distance away from the sweeping robot, a section of walking track of the sweeping robot starts to be collected, and the target movement direction of the sweeping robot within a certain time in the future is predicted according to the collected walking track of the sweeping robot.
Fig. 2, fig. 3 and fig. 4 are schematic diagrams, provided by exemplary embodiments of the present application, of the target movement direction of a person within a certain future time and its relation to the self-moving device. As the figures show, in the cases of fig. 2 and fig. 4 the person's future movement direction is toward the self-moving device, so the device needs to actively avoid; in the case of fig. 3 the person is moving toward one side rather than toward the self-moving device, so no active avoidance is needed.
As for mode B above, in other scenarios the environment around the self-moving device contains many objects besides the moving object, and these objects may be related to the moving object's purpose. For example, if a person walks in a hall and an elevator is detected in the hall, the person may be walking toward the elevator; if a person walks in an office containing a printer and a filing cabinet, the person may be walking toward the printer or the filing cabinet. Therefore, when the target motion direction of the moving object within a certain future time is predicted from its motion direction at each successive position point, judging, in combination with whether such a target object exists in the surroundings of the self-moving device, which object the moving object is heading for improves the accuracy of the prediction. First it is judged whether a target object the moving object may head for exists around the self-moving device; if one exists, the target motion direction of the moving object within a certain future time is predicted from its motion direction at each successive position point in combination with the position of the target object.
For example, a person walks in an elevator hall and an elevator is detected there; from the person's detected walking trajectory, the position of the elevator, and the person's direction of motion along the trajectory, it is judged whether the person's target motion direction within a certain future time is toward taking the elevator, and if the person is most likely going to take the elevator, the target motion direction is adjusted to the direction facing the elevator. Likewise, a person walks in an office where a filing cabinet is detected; from the position of the filing cabinet and the person's direction of motion along the detected trajectory, it is judged whether the person's target motion direction within a certain future time is toward the filing cabinet, and if so, the target motion direction is adjusted to the direction facing the filing cabinet. Taking into account the target object the moving object may be heading for in the environment improves the accuracy of predicting the object's target motion direction within a certain future time.
After the target motion direction of the moving object within a certain future time has been predicted, it is further judged whether the self-moving device is located in that direction. Optionally, for each of the position points the moving object passes in sequence, the angle formed by the coordinates of that position point and the coordinates of the self-moving device is calculated and taken as the motion direction of the moving object relative to the self-moving device when passing that point. If the motion directions of the moving object at the successive position points and its motion directions relative to the self-moving device at those points satisfy a set direction matching condition, the self-moving device is determined to be located in the target motion direction.
In an optional embodiment, for each of the position points, the direction angle between the motion direction of the moving object at that point and its motion direction relative to the self-moving device at that point is calculated, and the position points whose direction angle is smaller than or equal to an angle threshold are counted. If the direction angles corresponding to the last specified number of position points the moving object passed are all smaller than or equal to the angle threshold, and the counted number of position points is larger than a point-count threshold, the self-moving device is determined to be located in the target motion direction. For example: 100 position points are extracted from the moving object's trajectory; for each point, a first motion direction (the object's direction when passing the point) and a second motion direction (its direction relative to the self-moving device when passing the point) are calculated, together with the angle between them. If 80 points have an angle smaller than the 2-degree angle threshold, which is larger than the point-count threshold of 60, and the angles at the last 20 specified points are all smaller than 2 degrees, the self-moving device is determined to be located in the target motion direction and performs active avoidance, reducing the probability that the moving object collides with it; otherwise, the self-moving device does not avoid.
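A sketch of this matching rule, using the thresholds from the numerical example above (2-degree angle threshold, point-count threshold of 60, last 20 points); everything apart from those thresholds is an assumption made for illustration.

```python
import math
from typing import List, Tuple


def device_in_target_direction(points: List[Tuple[float, float]],
                               headings: List[float],  # headings at points[1:], see earlier sketch
                               device_xy: Tuple[float, float],
                               angle_threshold_deg: float = 2.0,
                               count_threshold: int = 60,
                               last_n: int = 20) -> bool:
    """Return True if the moving object keeps heading toward the device:
    enough points agree overall, and the most recent `last_n` points all agree."""
    agrees = []
    for (x, y), heading in zip(points[1:], headings):
        to_device = math.atan2(device_xy[1] - y, device_xy[0] - x)
        diff = abs(math.degrees(heading - to_device))
        diff = min(diff, 360.0 - diff)  # wrap to [0, 180] degrees
        agrees.append(diff <= angle_threshold_deg)
    enough_overall = sum(agrees) > count_threshold
    recent_all_agree = len(agrees) >= last_n and all(agrees[-last_n:])
    return enough_overall and recent_all_agree
```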
In the above embodiment, identifying whether a target object the moving object needs to head for exists in the surroundings of the self-moving device may optionally be done as follows: determine the moving purpose of the moving object according to its category; judge whether an object suited to that moving purpose exists in the surroundings of the self-moving device; and, if such an object exists, take the object suited to the moving purpose as the target object. For example, in an elevator hall it is detected whether there is an elevator in the surroundings that a person would walk toward.
As for predicting the target motion direction of the moving object within a certain future time from its motion direction at each successive position point in combination with the position of the target object, an optional embodiment is as follows: predict an initial motion direction of the moving object within a certain future time from its motion direction at the successive position points; judge whether the target object is located in that initial motion direction; if it is, take the initial motion direction as the target motion direction. If the target object is not located in the initial motion direction, adjust the initial motion direction so that the target object lies in the adjusted direction, and take the adjusted direction as the target motion direction. Predicting the target motion direction of the moving object over a future period in combination with the target object improves the accuracy and reliability of the prediction. For example, when an elevator that a person may head for is detected in an elevator hall, if the initial motion direction predicted from the person's direction at each position point does not face the elevator, the initial direction is adjusted to the direction facing the elevator and taken as the target motion direction.
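A minimal sketch of this adjustment, under two assumptions not stated in the patent: the initial direction is taken as the circular mean of the recent per-point headings, and "the target object is located in the direction" is tested with an angular tolerance.

```python
import math
from typing import List, Optional, Tuple


def predict_target_direction(recent_headings: List[float],
                             object_xy: Tuple[float, float],
                             target_xy: Optional[Tuple[float, float]] = None,
                             tolerance_deg: float = 15.0) -> float:
    """Predict the overall motion direction; if a target object (e.g. an elevator)
    exists but does not lie in the initial direction, snap to its bearing."""
    # Initial direction: circular mean of the recent per-point headings (an assumption).
    initial = math.atan2(sum(math.sin(h) for h in recent_headings),
                         sum(math.cos(h) for h in recent_headings))
    if target_xy is None:
        return initial
    bearing = math.atan2(target_xy[1] - object_xy[1], target_xy[0] - object_xy[0])
    diff = abs(math.degrees(initial - bearing))
    diff = min(diff, 360.0 - diff)
    return initial if diff <= tolerance_deg else bearing
```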
After the target motion direction of the moving object within a certain future time has been predicted, it is judged whether the self-moving device is located in that direction; if it is, avoidance processing is performed for the moving object, and if it is not, the self-moving device stays in place. The avoidance processing may be to control the self-moving device to move in a direction away from the target motion direction; the following cases describe this, and a sketch combining them follows case 2.
Case 1: controlling the self-moving device to move in a direction perpendicular to and away from the target motion direction.
Case 2: detecting whether an open area capable of accommodating the self-moving device exists in a direction away from the target motion direction; and, if so, controlling the self-moving device to move into that open area in the direction away from the target motion direction.
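A sketch combining the two cases above: prefer an already-detected open area (case 2), otherwise step sideways, perpendicular to and away from the target motion direction (case 1). The step size and the choice of side are assumptions.

```python
import math
from typing import Optional, Tuple


def avoidance_goal(device_xy: Tuple[float, float],
                   target_direction: float,
                   step_m: float = 0.5,
                   open_area: Optional[Tuple[float, float]] = None) -> Tuple[float, float]:
    """Pick a retreat goal away from the predicted target motion direction."""
    if open_area is not None:
        return open_area  # case 2: move into the detected open area
    # Case 1: move perpendicular to the target direction; the side chosen here is arbitrary.
    perp = target_direction + math.pi / 2.0
    return (device_xy[0] + step_m * math.cos(perp),
            device_xy[1] + step_m * math.sin(perp))
```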
Based on the foregoing embodiment, fig. 5 is a schematic flowchart of an avoidance method according to another exemplary embodiment of the present application, and as shown in fig. 5, the method includes:
S501: detecting a moving object present in the surrounding environment of the self-moving device and the motion trajectory of the moving object;
S502: predicting a target motion direction of the moving object within a certain future time in combination with the detected motion trajectory of the moving object;
S503: judging whether the self-moving device is located in the target motion direction; if yes, executing step S504; if no, executing step S505;
S504: the self-moving device performs avoidance processing for the moving object;
S505: the self-moving device remains in place.
Based on the foregoing embodiments, fig. 6 is a schematic flowchart of a more detailed avoidance method provided in an exemplary embodiment of the present application, and as shown in fig. 6, the method includes:
S601: detecting a moving object present in the surrounding environment of the self-moving device and the motion trajectory of the moving object;
S602: judging whether the moving object has entered a preset distance range of the self-moving device; if yes, executing step S603; otherwise, continuing to execute step S601;
S603: acquiring the motion trajectory of the moving object within the preset distance range of the self-moving device;
S604: predicting a target motion direction of the moving object within a certain future time in combination with the detected motion trajectory of the moving object;
S605: judging whether the self-moving device is located in the target motion direction; if yes, executing step S606; if no, executing step S607;
S606: performing avoidance processing for the moving object;
S607: the self-moving device remains in place.
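Tying the steps of fig. 6 together, a high-level loop might read as follows. It reuses the helper sketches introduced earlier and assumes simple `device` and `detector` interfaces (`is_working`, `position`, `move_to`, `hold_position`, `detect_moving_object`) that are not part of the patent; it is an illustration, not the patent's implementation.

```python
def avoidance_loop(device, detector, tracker, preset_range_m: float = 2.0) -> None:
    """High-level sketch of steps S601-S607: detect, gate on distance, predict, decide."""
    while device.is_working():
        observation = detector.detect_moving_object()            # S601 (assumed interface)
        if observation is None:
            continue
        tracker.add_position(observation.position)
        if not within_preset_range(observation.position,          # S602
                                   device.position(), preset_range_m):
            continue
        points = tracker.trajectory                                # S603: trajectory inside the range
        headings = headings_from_trajectory(points)                # S604: per-point directions
        if headings and device_in_target_direction(points, headings,
                                                   device.position()):  # S605
            device.move_to(avoidance_goal(device.position(), headings[-1]))  # S606: avoid
        else:
            device.hold_position()                                 # S607: stay in place
```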
In the method embodiments of the avoidance method of the present application, the self-moving device detects a moving object present in its surrounding environment and the motion trajectory of that object, predicts the object's target motion direction within a certain future time in combination with the detected trajectory, and, if the self-moving device is located in that direction, actively avoids the object moving toward it, thereby improving the working performance and degree of intelligence of the self-moving device.
The avoidance method of the present application is explained below with reference to embodiments of different scenarios.
Application scenario 1: as shown in fig. 7, the robot is in a corridor. Based on its vision sensor, the robot detects whether a moving object is present in the corridor; it detects a person moving and obtains the person's motion trajectory. When the person enters the preset distance range of the robot, the robot predicts the person's next motion direction in combination with the detected trajectory and judges whether that direction is toward the robot. If it is, the robot actively moves toward the wall on the side perpendicular to and away from the person's motion direction, actively avoiding the person moving toward it, so as to leave enough space for the person, ensure the person passes smoothly, and avoid blocking the person.
Application scenario 2: as shown in fig. 8, the robot is in a bedroom. Based on its vision sensor, the robot detects whether a moving object is present in the bedroom; it detects a person moving and obtains the person's motion trajectory. When the person enters the preset distance range of the robot, the robot predicts the person's next motion direction in combination with the detected trajectory and judges whether that direction is toward the robot; at the same time, it recognizes that the person may be walking to the wardrobe in the bedroom to fetch clothes and judges that the wardrobe lies in the person's motion direction. If the robot is located in the target motion direction between the wardrobe and the person, it would obstruct the person, so it performs avoidance processing: the robot detects that an open area exists above it, moves into that open area, and actively avoids the person moving toward the wardrobe, leaving enough space for the person to pass smoothly and avoiding blocking the person from reaching the wardrobe to fetch clothes.
Application scenario 3: as shown in fig. 9, the robot is in an elevator hall. Based on its vision sensor, the robot detects whether a moving object is present in the elevator hall; it detects a person moving and obtains the person's motion trajectory. When the person enters the preset distance range of the robot, the robot predicts the person's next motion direction in combination with the detected trajectory and judges whether that direction is toward the robot; at the same time, it recognizes the elevator the person needs to walk to and judges that the elevator lies in the target motion direction of the person's movement. If the robot is located in the target motion direction between the elevator and the person, it would obstruct the person, so it performs avoidance processing: the robot detects that open areas exist both above and below it, randomly chooses to avoid upward, leaves enough space for the person to pass smoothly, and avoids blocking the person from entering the elevator.
Fig. 10 is a block diagram of a self-moving device according to an exemplary embodiment of the present application. The self-moving device includes one or more processors 1002 and one or more memories 1003 storing computer programs. Necessary components such as audio components 1001, power components 1004, sensors 1005, etc. may also be included.
One or more processors 1002 for executing computer programs for: detecting a moving object existing in the surrounding environment of the mobile equipment and a motion track of the moving object; predicting the target motion direction of the moving object within a certain time in the future by combining the detected motion trail of the moving object; and if the self-moving equipment is positioned in the target moving direction, carrying out avoidance processing on the moving object.
Optionally, the one or more processors 1002 detect moving objects present in the environment surrounding the mobile device and motion trajectories of the moving objects for: continuously acquiring environmental images from the environment around the mobile device; identifying a target object existing in the surrounding environment of the mobile device and position coordinates of the target object based on the continuously acquired environment images; and if the position coordinates of the target object change, determining that the target object is a moving object, and determining the motion track of the moving object according to the change of the position coordinates.
Optionally, the one or more processors 1002 may predict a target motion direction of the mobile object in a future time by combining the detected motion trajectory of the mobile object, where the detected motion trajectory of the mobile object includes coordinates of a plurality of location points through which the mobile object passes in sequence, and the predicted target motion direction is used for: calculating an angle direction obtained according to coordinates of adjacent position points in a plurality of position points through which the moving object sequentially passes, wherein the angle direction is used as a motion direction of the moving object when the moving object sequentially passes through each position point; and predicting the target motion direction of the moving object within a certain time in the future according to the motion direction of the moving object passing through each position point in sequence.
Optionally, the one or more processors 1002 predict a target moving direction of the moving object in a future time according to the moving direction of the moving object passing through each location point in sequence, so as to: identifying whether a target object to which a moving object needs to go exists in the surrounding environment of the mobile equipment; and if the target object exists, predicting the target motion direction of the moving object within a certain time in the future according to the motion direction of the moving object passing through each position point in sequence by combining the position of the target object.
Optionally, the one or more processors 1002, identifying whether there are target objects to which a moving object needs to go from the mobile device's surroundings, may be configured to: determining the moving purpose of the moving object according to the category of the moving object; judging whether an object adaptive to the moving purpose of the moving object exists in the surrounding environment of the mobile equipment or not; if there is any object, an object adapted to the movement purpose of the moving object is set as the target object.
Optionally, the one or more processors 1002, in combination with the position of the target object, predict a target moving direction of the moving object within a certain time in the future according to the moving direction of the moving object passing through each position point in turn, so as to: predicting the initial motion direction of the moving object within a certain time in the future according to the motion directions of the moving object passing through the position points in sequence; judging whether the target object is positioned in the initial motion direction; and if the target object is positioned in the initial motion direction, taking the initial motion direction as the target motion direction.
Optionally, the one or more processors 1002 may be further configured to: if the target object is not positioned in the initial movement direction, adjusting the initial movement direction to enable the target object to be positioned in the adjusted movement direction; and taking the adjusted movement direction as a target movement direction.
Optionally, the one or more processors 1002 may be further configured to: calculating an included angle formed by the coordinates of each position point in the plurality of position points which the moving object sequentially passes through and the coordinates of the self-moving equipment, and taking the included angle as the motion direction of the moving object relative to the self-moving equipment when the moving object sequentially passes through each position point; and if the moving direction of the moving object passing through the position points in sequence and the moving direction of the moving object relative to the self-moving equipment when the moving object passes through the position points in sequence meet the set direction matching conditions, determining that the self-moving equipment is positioned in the target moving direction.
Optionally, the one or more processors 1002 may be further configured to: for each position point in the plurality of position points, calculating a direction included angle between the movement direction of the moving object passing through the position point and the movement direction of the moving object relative to the self-moving equipment when the moving object passes through the position point; counting the position points with the included angle of the direction smaller than or equal to an angle threshold value and the number of the position points; and if the direction included angles corresponding to the specified number of position points passed by the moving object last are all smaller than or equal to the angle threshold value, and the counted number of the position points is larger than the point number threshold value, determining that the self-moving equipment is positioned in the target movement direction.
Optionally, the one or more processors 1002, before predicting the target moving direction of the moving object within a certain time in the future in combination with the detected motion trajectory of the moving object, may further be configured to: judging whether the moving object enters a preset distance range of the mobile equipment or not; if the judgment result is yes, acquiring the motion track of the moving object within the preset distance range of the mobile device, and predicting the target motion direction of the moving object within a certain time in the future.
Optionally, the one or more processors 1002 perform avoidance processing for the moving object for: and controlling the mobile equipment to move to the direction far away from the target motion direction.
Optionally, the one or more processors 1002 control movement from the mobile device away from the target motion direction for: and controlling the self-moving equipment to move towards a direction which is perpendicular to the target movement direction and deviates from the target movement direction.
Optionally, when controlling the self-moving device to move in a direction away from the target motion direction, the one or more processors 1002 may alternatively be configured to: detect whether an open area capable of accommodating the self-moving device exists in a direction away from the target motion direction; and, if so, control the self-moving device to move into that open area in the direction away from the target motion direction.
Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by the one or more processors 1002, causes the one or more processors 1002 to perform the steps of the method embodiment illustrated in fig. 1.
In an embodiment of the self-moving device of the present application, the self-moving device detects a moving object existing in its surrounding environment and a motion trajectory of the moving object; predicting the target motion direction of the moving object within a certain time in the future by combining the detected motion trail of the moving object; if the self-moving equipment is located in the target movement direction, the object moving towards the self-moving equipment is actively avoided, and the working performance and the intelligent degree of the self-moving equipment are improved.
The self-moving equipment can be a robot, an unmanned vehicle and the like. Fig. 11 is a block diagram of a robot according to an exemplary embodiment of the present disclosure. As shown in fig. 11, the robot includes: a machine body 1101; the machine body 1101 is provided with one or more processors 1103 and one or more memories 1104 storing computer instructions. In addition, the machine body 1101 may further be provided with a sensor 1102, which is used for acquiring an environmental image of the surrounding environment during the operation of the robot. The sensor 1102 may be a vision sensor 1102, such as a camera, etc., or a distance sensor 1102, such as a lidar.
In addition to the one or more processors 1103 and the one or more memories 1104, some basic components of the robot, such as an audio component, a power supply component, an odometer, and a driving component, are disposed on the machine body 1101. The audio component may be configured to output and/or input audio signals. For example, the audio component includes a microphone (MIC) configured to receive external audio signals when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode; the received audio signals may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio component further comprises a speaker for outputting audio signals. The sensors 1102 may also include a lidar sensor, a humidity sensor, and the like. Optionally, the driving component may include driving wheels, a driving motor, universal wheels, and the like. Optionally, for a sweeping robot, the sweeping component may include a sweeping motor, a sweeping brush, a dusting brush, a dust suction fan, and the like. The basic components included in different robots, and their structures, differ; the embodiments of the present application are only some examples.
It is noted that the audio component, the sensor 1102, the one or more processors 1103, and the one or more memories 1104 may be disposed inside the machine body 1101, or may be disposed on a surface of the machine body 1101.
The machine body 1101 is the execution mechanism with which the robot performs its tasks, and it can carry out operations specified by the processor 1103 in a given environment. The machine body reflects, to some extent, the robot's outward appearance. In this embodiment, the outward form of the robot is not limited and may be, for example, circular, elliptical, triangular, convex polygonal, or the like.
The one or more memories 1104 are primarily for storing computer programs that are executable by the one or more processors 1103, so as to cause the one or more processors 1103 to perform the avoidance operations. In addition to storing computer programs, the one or more memories 1104 may also be configured to store various other data to support operations on the robot.
One or more processors 1103, which may be considered control systems for the robot, may be configured to execute computer programs stored in one or more memories 1104 to perform avoidance operations on the robot.
For example, the one or more memories 1104 store computer programs that the one or more processors 1103 may execute to: detect a moving object present in the surrounding environment of the self-moving device and the motion trajectory of the moving object; predict a target motion direction of the moving object within a certain future time in combination with the detected motion trajectory of the moving object; and, if the self-moving device is located in the target motion direction, perform avoidance processing for the moving object.
Optionally, when detecting a moving object existing in the environment surrounding the self-moving device and the motion trajectory of the moving object, the one or more processors 1103 are configured to: continuously acquire environment images of the environment around the self-moving device; identify, based on the continuously acquired environment images, a target object existing in the surrounding environment and the position coordinates of the target object; and, if the position coordinates of the target object change, determine that the target object is a moving object and determine the motion trajectory of the moving object from the change of the position coordinates.
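A minimal sketch of the coordinate-change test described above, assuming positions are 2-D coordinates extracted from consecutive environment images; the noise threshold `eps` is an assumption, not a value from this application.

```python
import math

def update_trajectory(prev_xy, new_xy, trajectory, eps=0.05):
    """Mark the target object as moving when its position coordinates change
    between consecutive environment images, and record the position point.
    `eps` is an assumed noise threshold in metres."""
    moved = math.hypot(new_xy[0] - prev_xy[0], new_xy[1] - prev_xy[1]) > eps
    if moved:
        trajectory.append(new_xy)  # position point passed by the moving object
    return moved
```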
Optionally, the detected motion trajectory of the moving object includes the coordinates of a plurality of position points that the moving object passes through in sequence. When predicting the target motion direction of the moving object within a certain time in the future in combination with the detected motion trajectory, the one or more processors 1103 are configured to: take the angle direction calculated from the coordinates of adjacent position points, among the plurality of position points the moving object passes through in sequence, as the motion direction of the moving object when it passes through each position point; and predict the target motion direction of the moving object within a certain time in the future according to the motion directions of the moving object when it passes through the position points in sequence.
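The per-point angle direction can be illustrated with `atan2` over adjacent position points, as in the sketch below; the circular-mean window used to extrapolate a near-future direction is an assumption, the application only requires that the per-point directions be used.

```python
import math

def per_point_directions(points):
    """Angle direction from each position point to the next, taken as the
    motion direction of the moving object when it passes that point."""
    return [math.atan2(q[1] - p[1], q[0] - p[0])
            for p, q in zip(points, points[1:])]

def predict_target_direction(points, window=3):
    """Predict the near-future motion direction from the last few per-point
    directions via a circular mean; the window size is an assumption."""
    recent = per_point_directions(points)[-window:]
    if not recent:
        return None  # not enough position points yet
    sx = sum(math.cos(a) for a in recent)
    sy = sum(math.sin(a) for a in recent)
    return math.atan2(sy, sx)
```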
Optionally, when predicting the target motion direction of the moving object within a certain time in the future according to the motion directions of the moving object passing through the position points in sequence, the one or more processors 1103 are configured to: identify whether a target object to which the moving object needs to go exists in the environment surrounding the self-moving device; and, if such a target object exists, predict the target motion direction of the moving object within a certain time in the future according to the motion directions of the moving object passing through the position points in sequence, in combination with the position of the target object.
Optionally, when identifying whether a target object to which the moving object needs to go exists in the environment surrounding the self-moving device, the one or more processors 1103 may be configured to: determine the movement purpose of the moving object according to the category of the moving object; judge whether an object adapted to the movement purpose of the moving object exists in the surrounding environment; and, if such an object exists, take the object adapted to the movement purpose of the moving object as the target object.
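An illustrative sketch of the category-to-purpose lookup described above; the concrete categories, purposes, and object labels in the tables are assumptions for the example only.

```python
# Illustrative category-to-purpose and purpose-to-object tables; the concrete
# entries are assumptions, not definitions from this application.
PURPOSE_BY_CATEGORY = {"person": "use_office_equipment", "pet": "reach_food_bowl"}
OBJECTS_FOR_PURPOSE = {
    "use_office_equipment": {"printer", "file_cabinet"},
    "reach_food_bowl": {"food_bowl"},
}

def find_target_object(category, surrounding_objects):
    """Return (label, position) of an object adapted to the moving object's
    movement purpose, or None if no such object exists nearby."""
    purpose = PURPOSE_BY_CATEGORY.get(category)
    if purpose is None:
        return None
    candidates = OBJECTS_FOR_PURPOSE.get(purpose, set())
    for label, position in surrounding_objects:  # e.g. [("printer", (3.0, 1.2))]
        if label in candidates:
            return label, position
    return None
```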
Optionally, when predicting the target motion direction of the moving object within a certain time in the future according to the motion directions of the moving object passing through the position points in sequence, in combination with the position of the target object, the one or more processors 1103 are configured to: predict an initial motion direction of the moving object within a certain time in the future according to the motion directions of the moving object passing through the position points in sequence; judge whether the target object is located in the initial motion direction; and, if the target object is located in the initial motion direction, take the initial motion direction as the target motion direction.
Optionally, the one or more processors 1103 may also be configured to: if the target object is not located in the initial motion direction, adjust the initial motion direction so that the target object is located in the adjusted motion direction; and take the adjusted motion direction as the target motion direction.
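A sketch of the check-and-adjust step above, assuming directions are expressed as angles in radians; the angular tolerance used to decide whether the target object already lies in the initial direction is an assumed value.

```python
import math

def refine_direction(initial_direction, last_point, target_xy,
                     tolerance=math.radians(20)):
    """Keep the initially predicted direction if the target object already lies
    in it; otherwise adjust the prediction so the target lies in the adjusted
    direction. `tolerance` is an assumed angular margin."""
    to_target = math.atan2(target_xy[1] - last_point[1],
                           target_xy[0] - last_point[0])
    # Smallest signed difference between the two angles.
    diff = math.atan2(math.sin(to_target - initial_direction),
                      math.cos(to_target - initial_direction))
    return initial_direction if abs(diff) <= tolerance else to_target
```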
Optionally, the one or more processors 1103 may also be configured to: calculate, for each of the plurality of position points the moving object passes through in sequence, the angle formed between the coordinates of that position point and the coordinates of the self-moving device, and take the angle as the motion direction of the moving object relative to the self-moving device when it passes through that position point; and, if the motion directions of the moving object when passing through the position points in sequence and the motion directions of the moving object relative to the self-moving device when passing through those position points satisfy a set direction matching condition, determine that the self-moving device is located in the target motion direction.
Optionally, the one or more processors 1103 may also be configured to: for each of the plurality of position points, calculate the direction included angle between the motion direction of the moving object when passing through that position point and the motion direction of the moving object relative to the self-moving device when passing through that position point; count the position points whose direction included angle is less than or equal to an angle threshold; and, if the direction included angles corresponding to the last specified number of position points passed by the moving object are all less than or equal to the angle threshold, and the counted number of position points is greater than a point-number threshold, determine that the self-moving device is located in the target motion direction.
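The direction-matching condition above can be sketched as follows; the angle threshold, the number of most recent points checked, and the point-number threshold are assumed parameters, not values fixed by this application.

```python
import math

def device_in_target_direction(points, directions, device_xy,
                               angle_threshold=math.radians(30),
                               last_n=3, count_threshold=5):
    """Direction-matching test sketched above. `directions` holds the motion
    direction at each position point (one fewer entry than `points`); the
    threshold values are assumptions."""
    toward_device = [math.atan2(device_xy[1] - p[1], device_xy[0] - p[0])
                     for p in points[:-1]]
    # Direction included angle at each position point, wrapped to [0, pi].
    included = [abs(math.atan2(math.sin(d - t), math.cos(d - t)))
                for d, t in zip(directions, toward_device)]
    below = [a <= angle_threshold for a in included]
    recent_ok = len(below) >= last_n and all(below[-last_n:])
    return recent_ok and sum(below) > count_threshold
```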
Optionally, before predicting the target motion direction of the moving object within a certain time in the future in combination with the detected motion trajectory of the moving object, the one or more processors 1103 may further be configured to: judge whether the moving object has entered a preset distance range of the self-moving device; and, if so, acquire the motion trajectory of the moving object within the preset distance range and predict the target motion direction of the moving object within a certain time in the future.
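A minimal sketch of the preset-distance check described above; 2.0 m is an assumed range, not a value given in this application.

```python
import math

def within_preset_range(object_xy, device_xy, preset_range=2.0):
    """Gate the prediction on the moving object having entered the preset
    distance range of the self-moving device."""
    return math.hypot(object_xy[0] - device_xy[0],
                      object_xy[1] - device_xy[1]) <= preset_range
```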
Optionally, when performing avoidance processing for the moving object, the one or more processors 1103 are configured to: control the self-moving device to move in a direction away from the target motion direction.
Optionally, when controlling the self-moving device to move in a direction away from the target motion direction, the one or more processors 1103 are configured to: control the self-moving device to move in a direction that is perpendicular to the target motion direction and deviates from the target motion direction.
Optionally, when controlling the self-moving device to move in a direction away from the target motion direction, the one or more processors 1103 are configured to: detect whether an open area capable of accommodating the self-moving device exists in a direction away from the target motion direction; and, if so, control the self-moving device to move to the open area in the direction away from the target motion direction.
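An illustrative sketch of the avoidance movement described above, combining the perpendicular-deviation choice with the open-area check; `area_is_open` is a hypothetical map query, and the clearance and probe step are assumed values.

```python
import math

def avoidance_heading(target_direction, device_xy, area_is_open,
                      clearance=0.5, step=1.0):
    """Pick a heading perpendicular to, and deviating from, the target motion
    direction, preferring a side with an open area able to hold the device.
    `area_is_open(x, y, r)` is a hypothetical map query returning True when a
    disc of radius r at (x, y) is free."""
    for offset in (math.pi / 2, -math.pi / 2):  # try both perpendicular sides
        heading = target_direction + offset
        px = device_xy[0] + step * math.cos(heading)
        py = device_xy[1] + step * math.sin(heading)
        if area_is_open(px, py, clearance):
            return heading          # open area found in this direction
    return None                     # no open area; caller decides what to do
```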
Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by the one or more processors 1103, causes the one or more processors 1103 to perform the steps in the corresponding method embodiment illustrated in fig. 1.
In the self-moving device embodiment of the present application, the self-moving device detects a moving object existing in its surrounding environment and the motion trajectory of the moving object; predicts the target motion direction of the moving object within a certain time in the future in combination with the detected motion trajectory; and, if the self-moving device is located in the target motion direction, actively avoids the object moving toward it, which improves the working performance and degree of intelligence of the self-moving device.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (11)

1. A trajectory prediction method, suitable for a self-moving device, the method comprising:
detecting a moving object existing in an elevator lobby where the self-moving device is located and a motion trajectory of the moving object;
determining a position of at least one elevator in the elevator lobby;
determining the motion direction of the moving object when it passes through each position point in sequence; and
predicting, in combination with the position of the elevator, the motion direction of the moving object toward the target object within a certain time in the future according to the motion direction of the moving object when it passes through each position point in sequence.
2. The method of claim 1, wherein the motion trajectory includes position points that the moving object passes through in sequence, and predicting the motion direction of the moving object toward the target object within a certain time in the future according to the motion direction of the moving object passing through the position points in sequence comprises:
taking the angle direction calculated from the coordinates of adjacent position points, among the plurality of position points that the moving object passes through in sequence, as the motion direction of the moving object when it passes through each position point.
3. A trajectory prediction method, suitable for a self-moving device, the method comprising:
detecting a moving object existing in an office where the self-moving device is located and a motion trajectory of the moving object;
determining at least one target object in the office, the target object comprising at least one of a printer and a file cabinet;
determining the motion direction of the moving object when it passes through each position point in sequence; and
predicting, in combination with the position of the target object, the motion direction of the moving object toward the target object within a certain time in the future according to the motion direction of the moving object when it passes through each position point in sequence.
4. The method of claim 3, wherein the motion trajectory includes position points that the moving object passes through in sequence, and predicting the motion direction of the moving object toward the target object within a certain time in the future according to the motion direction of the moving object passing through the position points in sequence comprises:
taking the angle direction calculated from the coordinates of adjacent position points, among the plurality of position points that the moving object passes through in sequence, as the motion direction of the moving object when it passes through each position point.
5. A trajectory prediction method, suitable for a self-moving device, the method comprising:
detecting a moving object existing in a bedroom where the self-moving device is located and a motion trajectory of the moving object;
determining a position of at least one wardrobe in the bedroom;
determining the motion direction of the moving object when it passes through each position point in sequence; and
predicting, in combination with the position of the wardrobe, the motion direction of the moving object toward the target object within a certain time in the future according to the motion direction of the moving object when it passes through each position point in sequence.
6. A method of controlling a self-moving device, the method comprising:
detecting, by the self-moving device, a moving object existing in an elevator lobby and a motion trajectory of the moving object;
identifying, by the self-moving device, an elevator in the elevator lobby;
predicting a target motion direction of the moving object within a certain time in the future according to the position of the elevator and the motion trajectory of the moving object, wherein the target motion direction is a direction toward the elevator; and
if the self-moving device is located in the target motion direction, performing avoidance processing on the moving object.
7. A method of controlling a self-moving device, the method comprising:
detecting, by the self-moving device, a moving object existing in an office and a motion trajectory of the moving object;
identifying, by the self-moving device, a target object in the office, the target object comprising at least one of a printer and a file cabinet;
predicting a target motion direction of the moving object within a certain time in the future according to the position of the target object and the motion trajectory of the moving object, wherein the target motion direction is a direction toward the target object; and
if the self-moving device is located in the target motion direction, performing avoidance processing on the moving object.
8. A method of controlling a self-moving device, the method comprising:
detecting, by the self-moving device, a moving object existing in a bedroom and a motion trajectory of the moving object;
identifying, by the self-moving device, a wardrobe in the bedroom;
predicting a target motion direction of the moving object within a certain time in the future according to the position of the wardrobe and the motion trajectory of the moving object, wherein the target motion direction is a direction toward the wardrobe; and
if the self-moving device is located in the target motion direction, performing avoidance processing on the moving object.
9. A method of controlling a self-moving device, the method comprising:
detecting, by the self-moving device, a moving object existing in an elevator lobby and a motion trajectory of the moving object;
identifying, by the self-moving device, an elevator in the elevator lobby;
predicting a target motion direction of the moving object within a certain time in the future according to the position of the elevator and the motion trajectory of the moving object, wherein the target motion direction is a direction toward the elevator; and
adjusting the self-moving device to approach the target motion direction.
10. A method of controlling a self-moving device, the method comprising:
detecting, by the self-moving device, a moving object existing in an office and a motion trajectory of the moving object;
identifying, by the self-moving device, a target object in the office, the target object comprising at least one of a printer and a file cabinet;
predicting a target motion direction of the moving object within a certain time in the future according to the position of the target object and the motion trajectory of the moving object, wherein the target motion direction is a direction toward the target object; and
adjusting the self-moving device to approach the target motion direction.
11. A method of controlling a self-moving device, the method comprising:
detecting, by the self-moving device, a moving object existing in a bedroom and a motion trajectory of the moving object;
identifying, by the self-moving device, a wardrobe in the bedroom;
predicting a target motion direction of the moving object within a certain time in the future according to the position of the wardrobe and the motion trajectory of the moving object, wherein the target motion direction is a direction toward the wardrobe; and
adjusting the self-moving device to approach the target motion direction.
CN202111248186.9A 2018-10-26 2018-10-26 Trajectory prediction and self-moving equipment control method Pending CN114153200A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111248186.9A CN114153200A (en) 2018-10-26 2018-10-26 Trajectory prediction and self-moving equipment control method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111248186.9A CN114153200A (en) 2018-10-26 2018-10-26 Trajectory prediction and self-moving equipment control method
CN201811261238.4A CN111103875B (en) 2018-10-26 2018-10-26 Method, apparatus and storage medium for avoiding

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201811261238.4A Division CN111103875B (en) 2018-10-26 2018-10-26 Method, apparatus and storage medium for avoiding

Publications (1)

Publication Number Publication Date
CN114153200A true CN114153200A (en) 2022-03-08

Family

ID=70331842

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111248186.9A Pending CN114153200A (en) 2018-10-26 2018-10-26 Trajectory prediction and self-moving equipment control method
CN201811261238.4A Active CN111103875B (en) 2018-10-26 2018-10-26 Method, apparatus and storage medium for avoiding

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201811261238.4A Active CN111103875B (en) 2018-10-26 2018-10-26 Method, apparatus and storage medium for avoiding

Country Status (2)

Country Link
CN (2) CN114153200A (en)
WO (1) WO2020083069A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111984008A (en) * 2020-07-30 2020-11-24 深圳优地科技有限公司 Robot control method, device, terminal and storage medium
CN114153197B (en) * 2020-08-17 2023-08-18 速感科技(北京)有限公司 Method and device for getting rid of poverty of autonomous mobile equipment
CN113359753B (en) * 2021-06-24 2023-09-08 深圳市普渡科技有限公司 Robot, robot welcome movement method and readable storage medium
CN113598651A (en) * 2021-07-23 2021-11-05 深圳市云鼠科技开发有限公司 Cleaning robot control method and device based on microwaves, computer equipment and memory
CN113951767A (en) * 2021-11-08 2022-01-21 珠海格力电器股份有限公司 Control method and device for movable equipment

Family Cites Families (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4699426B2 (en) * 2006-08-08 2011-06-08 パナソニック株式会社 Obstacle avoidance method and obstacle avoidance moving device
JP4576445B2 (en) * 2007-04-12 2010-11-10 パナソニック株式会社 Autonomous mobile device and program for autonomous mobile device
JP5212939B2 (en) * 2008-07-17 2013-06-19 パナソニック株式会社 Autonomous mobile device
CN100570523C (en) * 2008-08-18 2009-12-16 浙江大学 A kind of mobile robot's barrier-avoiding method based on the barrier motion prediction
JP5774552B2 (en) * 2012-07-09 2015-09-09 株式会社東芝 Robot controller
JP2014186588A (en) * 2013-03-25 2014-10-02 Seiko Epson Corp Simulation apparatus, program, and image generating method
KR102071947B1 (en) * 2013-05-10 2020-01-31 삼성전자주식회사 Cleaning robot and controlling method thereof
KR102094347B1 (en) * 2013-07-29 2020-03-30 삼성전자주식회사 Auto-cleaning system, cleaning robot and controlling method thereof
CN103558856A (en) * 2013-11-21 2014-02-05 东南大学 Service mobile robot navigation method in dynamic environment
US9563205B2 (en) * 2014-02-10 2017-02-07 Savioke, Inc. Sensor configurations and methods for mobile robot
JP6572205B2 (en) * 2014-04-03 2019-09-04 株式会社日立製作所 Autonomous mobile
DE102014017863A1 (en) * 2014-12-03 2016-06-09 Daimler Ag Method for carrying out a parking operation of a vehicle, device for carrying out the method and vehicle with such a device
US10331140B2 (en) * 2014-12-25 2019-06-25 Equos Research Co., Ltd. Moving body
JP6455234B2 (en) * 2015-03-03 2019-01-23 富士通株式会社 Action detection method and action detection apparatus
CN104680559B (en) * 2015-03-20 2017-08-04 青岛科技大学 The indoor pedestrian tracting method of various visual angles based on motor behavior pattern
CN106610664A (en) * 2015-10-22 2017-05-03 沈阳新松机器人自动化股份有限公司 Movement obstacle avoidance device and control method
JP6638348B2 (en) * 2015-11-20 2020-01-29 日本精工株式会社 Mobile robot system
WO2017144350A1 (en) * 2016-02-25 2017-08-31 Nec Europe Ltd. Method for motion planning for autonomous moving objects
CN105739500B (en) * 2016-03-29 2020-06-30 海尔优家智能科技(北京)有限公司 Interactive control method and device of intelligent sweeping robot
CN106023244A (en) * 2016-04-13 2016-10-12 南京邮电大学 Pedestrian tracking method based on least square locus prediction and intelligent obstacle avoidance model
CN105807773A (en) * 2016-05-13 2016-07-27 南京工程学院 Restaurant service robot system based on iGPS and internal communication
CN107788913A (en) * 2016-08-31 2018-03-13 科沃斯机器人股份有限公司 Clean robot and its control method
CN107797550A (en) * 2016-09-01 2018-03-13 松下电器(美国)知识产权公司 Autonomous formula robot, method and non-transient recording medium
CN107977986B (en) * 2016-10-21 2021-11-23 北京君正集成电路股份有限公司 Method and device for predicting motion trail
CN106598046B (en) * 2016-11-29 2020-07-10 北京儒博科技有限公司 Robot avoidance control method and device
CN106774314A (en) * 2016-12-11 2017-05-31 北京联合大学 A kind of home-services robot paths planning method based on run trace
WO2018143620A2 (en) * 2017-02-03 2018-08-09 Samsung Electronics Co., Ltd. Robot cleaner and method of controlling the same
CN107046711B (en) * 2017-02-21 2020-06-23 沈晓龙 Database establishment method for indoor positioning and indoor positioning method and device
CN107643752B (en) * 2017-05-09 2020-12-08 清研华宇智能机器人(天津)有限责任公司 Omnidirectional mobile robot path planning algorithm based on pedestrian trajectory prediction
CN107168342B (en) * 2017-07-12 2020-04-07 哈尔滨工大智慧工厂有限公司 Pedestrian trajectory prediction method for robot path planning
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment
CN107390721B (en) * 2017-07-26 2021-05-18 歌尔科技有限公司 Robot following control method and device and robot
CN107741745B (en) * 2017-09-19 2019-10-22 浙江大学 A method of realizing mobile robot autonomous positioning and map structuring
CN107544507A (en) * 2017-09-28 2018-01-05 速感科技(北京)有限公司 Mobile robot control method for movement and device
CN107885209B (en) * 2017-11-13 2020-08-21 浙江工业大学 Obstacle avoidance method based on dynamic window and virtual target point
CN107894773A (en) * 2017-12-15 2018-04-10 广东工业大学 A kind of air navigation aid of mobile robot, system and relevant apparatus
CN107908191B (en) * 2017-12-20 2024-03-29 芜湖哈特机器人产业技术研究院有限公司 Motion control system and method for serial-parallel robot
CN107992052B (en) * 2017-12-27 2020-10-16 纳恩博(北京)科技有限公司 Target tracking method and device, mobile device and storage medium
CN108544490B (en) * 2018-01-05 2021-02-23 广东雷洋智能科技股份有限公司 Obstacle avoidance method for unmanned intelligent robot road
CN108549410A (en) * 2018-01-05 2018-09-18 灵动科技(北京)有限公司 Active follower method, device, electronic equipment and computer readable storage medium
CN108490938A (en) * 2018-03-21 2018-09-04 沈阳上博智像科技有限公司 Unmanned equipment vision obstacle avoidance system and method

Also Published As

Publication number Publication date
CN111103875B (en) 2021-12-03
CN111103875A (en) 2020-05-05
WO2020083069A1 (en) 2020-04-30

Similar Documents

Publication Publication Date Title
CN111103875B (en) Method, apparatus and storage medium for avoiding
CN110968083B (en) Method for constructing grid map, method, device and medium for avoiding obstacles
KR101948728B1 (en) Method and system for collecting data
US8793069B2 (en) Object recognition system for autonomous mobile body
US8787614B2 (en) System and method building a map
EP1978432B1 (en) Routing apparatus for autonomous mobile unit
CN112650235A (en) Robot obstacle avoidance control method and system and robot
KR20180080498A (en) Robot for airport and method thereof
Wang et al. Acoustic robot navigation using distributed microphone arrays
KR20190134554A (en) Method of identifying dynamic obstacle and robot implementing thereof
KR100962593B1 (en) Method and apparatus for area based control of vacuum cleaner, and recording medium thereof
CN110713087B (en) Elevator door state detection method and device
CN111197985B (en) Area identification method, path planning method, device and storage medium
CN110946511B (en) Method, apparatus and storage medium for determining slippage
KR20220055167A (en) Autonomous robot, world map management server of autonomous robot and collision avoidance method using the same
KR102570164B1 (en) Airport robot, and method for operating server connected thereto
JP2019012504A (en) Autonomous moving device, autonomous moving method and program
US11400593B2 (en) Method of avoiding collision, robot and server implementing thereof
Nguyen et al. Confidence-aware pedestrian tracking using a stereo camera
EP3842885A1 (en) Autonomous movement device, control method and storage medium
JP4774401B2 (en) Autonomous mobile route setting device
WO2021246169A1 (en) Information processing device, information processing system, method, and program
JP7411185B2 (en) Guidance of a mobile robot that follows passersby
Glas et al. Simultaneous people tracking and robot localization in dynamic social spaces
WO2021246170A1 (en) Information processing device, information processing system and method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination