CN111638719A - Robot and its moving method, equipment, circuit and medium - Google Patents

Robot and its moving method, equipment, circuit and medium

Info

Publication number
CN111638719A
Authority
CN
China
Prior art keywords
robot
obstacle
determining
collision
sensing module
Prior art date
Legal status: Pending
Application number
CN202010528602.XA
Other languages
Chinese (zh)
Inventor
周骥
冯歆鹏
Current Assignee
Shaoxing Zhaoguan Electronic Technology Co ltd
Original Assignee
Shaoxing Zhaoguan Electronic Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shaoxing Zhaoguan Electronic Technology Co., Ltd.
Priority to CN202010528602.XA
Publication of CN111638719A

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0223 … with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0246 … using optical position detecting means, using a video camera in combination with image processing means
    • G05D1/0255 … using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257 … using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure relates to a robot and a traveling method, apparatus, circuit, and medium thereof. The traveling method includes: the robot traveling toward an obstacle in a first direction at a speed lower than a preset speed; determining whether the robot collides with the obstacle; in response to determining that the robot collides with the obstacle, the robot moving away from the obstacle in a second direction; and the robot traveling toward the obstacle in a third direction at a speed lower than the preset speed, with an included angle between the first direction and the third direction being smaller than a preset angle.

Description

Robot and its moving method, equipment, circuit and medium
Technical Field
The present disclosure relates to the field of robots and, more particularly, to a robot and a traveling method, apparatus, circuit, and medium thereof.
Background
Obstacle avoidance techniques for robots are known in the related art. In such techniques, a robot detects obstacles using a camera and a laser radar (LDS). The camera can be configured to acquire images of the robot's external environment and, in cooperation with a processor, locate an obstacle so that the robot can be steered around it, avoiding the obstacle and preventing damage from a collision. The LDS can be configured to determine the distance between the robot and an obstacle from the time it takes an emitted laser pulse to be reflected back by the obstacle, likewise locating the obstacle so that the robot can be steered around it and protected from collision damage.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, unless otherwise indicated, the problems mentioned in this section should not be considered as having been acknowledged in any prior art.
Disclosure of Invention
According to an aspect of the present disclosure, there is provided a traveling method of a robot, including: the robot traveling toward an obstacle in a first direction at a speed lower than a preset speed; determining whether the robot collides with the obstacle; in response to determining that the robot collides with the obstacle, the robot moving away from the obstacle in a second direction; and the robot traveling toward the obstacle in a third direction at a speed lower than the preset speed, with an included angle between the first direction and the third direction being smaller than a preset angle.
According to another aspect of the present disclosure, there is also provided an electronic circuit comprising: circuitry configured to perform the steps of the travel method described above.
According to another aspect of the present disclosure, there is also provided a robot including: an electronic circuit as described above.
According to another aspect of the present disclosure, there is also provided an electronic device including: a processor; and a memory storing a program comprising instructions that, when executed by the processor, cause the electronic device to perform the travel method described above.
According to another aspect of the present disclosure, there is also provided a non-transitory computer readable storage medium storing a program, the program comprising instructions which, when executed by a processor of an electronic device, cause the electronic device to perform the travel method described above.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments and, together with the description, serve to explain their implementation. The illustrated embodiments are for purposes of illustration only and do not limit the scope of the claims. Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
Fig. 1 is a flowchart illustrating a traveling method of a robot according to an exemplary embodiment;
Figs. 2-4 are schematic diagrams illustrating a traveling method of a robot according to an exemplary embodiment;
Fig. 5 is a flowchart illustrating a traveling method of a robot according to an exemplary embodiment;
Fig. 6 is a front view illustrating a home sweeping robot according to an exemplary embodiment;
Fig. 7 is a bottom view illustrating a home sweeping robot according to an exemplary embodiment;
Figs. 8 and 9 are flowcharts illustrating a traveling method of a robot according to an exemplary embodiment;
Fig. 10 is a block diagram showing the structure of an exemplary computing device to which the exemplary embodiments can be applied.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, the elements may be one or more. Furthermore, the term "and/or" as used in this disclosure is intended to encompass any and all possible combinations of the listed items.
With the development of artificial intelligence technology, mobile robots have come into increasingly wide use in many fields (e.g., home life, logistics, catering, and medical care). Alongside its mobility, a mobile robot must also address safe obstacle avoidance, so as to reduce the probability of damage caused by colliding with obstacles while it moves.
According to the related art, a sensing module configured to detect whether there is an obstacle in front of a robot may be provided on the robot. The obstacle can be located from the detection signal of the sensing module, allowing the robot to execute obstacle avoidance logic, steer clear of the obstacle, and avoid collision damage. Various types of sensing modules may be provided to detect and locate obstacles more accurately; for example, a combination of at least two of a camera module, a laser radar module, an ultrasonic sensor module, and an infrared sensor module may be used.
Unlike the related art described above, the present disclosure provides a traveling method that implements edgewise running logic, in which the robot travels along an obstacle. The robot decelerates as it travels toward the obstacle so that it only taps the obstacle lightly. After each tap, the robot moves away from the obstacle and then adjusts its direction of travel toward the obstacle by a small angle, producing a further light collision. The robot of the present disclosure therefore differs from the related art, which avoids obstacles altogether: here the robot actively taps the obstacle and makes a small-angle adjustment of its heading after each tap, so that it can travel along the obstacle while staying in close proximity to it. In addition, because the robot decelerates and only taps the obstacle lightly, the force acting on the robot during each collision is reduced, lowering the possibility of collision damage.
The robot may be, for example, a home sweeping robot. When the obstacle is a wall, the sweeping robot can actively tap the wall by adopting the edgewise running logic of the present disclosure, adjusting its heading toward the wall by a small angle after each tap. It can thereby advance along the wall in a zigzag pattern, staying in close proximity to the wall, achieving edgewise cleaning and improving the user experience.
The edgewise running logic of the present disclosure is also applicable to other types of robots, such as other household robots (e.g., a weeding robot or a window-wiping robot), commercial robots (e.g., a mall cleaning robot or a welcome robot), or industrial robots (e.g., a freight robot). Taking the weeding robot as an example, the edgewise running logic allows it to advance along an obstacle (such as a fence) in close proximity, weeding along the edge and better removing grass close to the obstacle. Taking the freight robot as an example, when the site it operates in is small, it can move along obstacles using the edgewise running logic of the present disclosure and can steer by adjusting its direction through many small-angle corrections. The edgewise running logic of the present disclosure can accordingly be applied to scenarios in a wide range of fields.
The traveling method of the robot of the present disclosure is further described below with reference to the drawings.
Fig. 1 is a flowchart illustrating a traveling method of a robot according to an exemplary embodiment of the present disclosure. As shown in fig. 1, the traveling method may include: step S101, the robot travels toward an obstacle in a first direction at a speed lower than a preset speed; step S102, determining whether the robot collides with the obstacle; step S103, in response to determining that the robot collides with the obstacle, the robot moves away from the obstacle in a second direction; and step S104, the robot travels toward the obstacle in a third direction at a speed lower than the preset speed, the included angle between the first direction and the third direction being smaller than a preset angle.
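For concreteness, the loop below sketches steps S101-S104 in Python. It is a minimal illustration only: the robot interface (drive, wait_for_collision, back_off, turn_sign) and all numeric constants are hypothetical stand-ins, not part of the disclosure.

```python
import math

# Assumed tuning constants (illustrative values only).
PRESET_SPEED = 0.3               # m/s, normal operating speed
APPROACH_SPEED = 0.1             # m/s, reduced speed for tapping the obstacle
PRESET_ANGLE = math.radians(15)  # upper bound on each heading adjustment
STEP_ANGLE = math.radians(2)     # actual small-angle adjustment per tap

def edgewise_travel(robot, heading):
    """Repeat steps S101-S104 until the robot no longer hits the obstacle."""
    while True:
        # S101 / S104: travel toward the obstacle below the preset speed.
        robot.drive(heading, speed=APPROACH_SPEED)
        # S102: wait for a collision (e.g. reported by the hose assembly).
        if not robot.wait_for_collision(timeout=5.0):
            break  # no further collision: the obstacle has been passed
        # S103: move away from the obstacle in a second direction.
        robot.back_off(distance=0.05)
        # Third direction: swing the heading by less than the preset angle,
        # toward the side opposite the collision (turn_sign gives +1 or -1).
        assert STEP_ANGLE < PRESET_ANGLE
        heading += robot.turn_sign() * STEP_ANGLE
    # Exit the edgewise logic and resume normal operating speed.
    robot.drive(heading, speed=PRESET_SPEED)
```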
The preset speed can be the robot's normal operating speed or lower. That is, the robot of the present disclosure first decelerates and then travels toward the obstacle, so that it merely taps the obstacle, reducing the possibility of collision damage.
In the present disclosure, the robot may travel toward the obstacle at a reduced speed or at a uniform speed.
The obstacle may differ between application scenarios. Taking a home sweeping robot as an example, the obstacle may be a wall, a sofa, a wardrobe, a cabinet, a door, and so on. By adopting the edgewise running logic of the present disclosure, the sweeping robot can stay in close proximity to the obstacle, achieving edgewise cleaning and improving the user experience.
According to some embodiments, steps S101-S104 may be repeated iteratively until the robot no longer collides with the obstacle and has successfully passed it. In this way, the robot's heading can be adjusted multiple times so that it travels along the obstacle. In the present disclosure, designating the headings of any two successive approaches toward the obstacle as the first direction and the third direction is only for convenience of description; it does not mean that every pair of successive approaches uses the same fixed first and third directions. For example, for three successive approaches toward the obstacle, the first and second approach directions may be regarded as the first direction and the third direction of one adjacent pair, while the second and third approach directions may be regarded as the first direction and the third direction of the next adjacent pair.
A schematic of the traveling method according to an exemplary embodiment of the present disclosure is shown in figs. 2 to 4. As shown there, the robot 100 may travel toward the obstacle in the first direction D1 so that it taps the obstacle for the first time; in response to detecting that the robot 100 has collided with the obstacle, the robot 100 may turn away from the obstacle in the second direction D2; the robot 100 then travels toward the obstacle in the third direction D3. The included angle between the first direction D1 and the third direction D3 is smaller than a preset angle, so the heading toward the obstacle is adjusted only by a small angle and the robot 100 taps the obstacle a second time. In response to detecting this second tap, the robot 100 again turns away from the obstacle and then once more adjusts its heading toward the obstacle by a small angle. This can be iterated many times until the robot no longer collides with the obstacle and has successfully passed it. By adjusting the heading toward the obstacle many times, the robot travels along the obstacle in close proximity to it. After n adjustments (n being a positive integer), the robot's heading can be approximately parallel to the collision surface of the obstacle, so that the robot no longer collides with it; the robot then no longer needs to adjust its heading, can exit the edgewise running logic, switch back to its normal operating speed, and continue running. For convenience of description, while the robot travels along the obstacle, any two successive collisions with the obstacle can be described as follows: the robot travels toward the obstacle in the first direction D1 and collides with it for the (k-1)-th time (1 ≤ k ≤ n, k being a positive integer), then travels toward the obstacle in the third direction D3 and collides with it for the k-th time, the included angle between D1 and D3 being smaller than the preset angle. This realizes the small-angle adjustment after each tap, so that the robot can collide with the obstacle again and thereby travel along it. According to some embodiments, for ease of control, the robot may adjust its heading toward the obstacle by the same angle each time, i.e., the included angle between D1 and D3 is the same for any two successive collisions. The preset angle may be, for example, 0° to 15° (e.g., 2°).
According to some embodiments, the edgewise running logic may be initiated, performing steps S101 to S104, in response to detecting that the distance between the robot and the obstacle is less than a first preset distance. In this way, the edgewise running logic can be started once an obstacle is detected, so that the robot travels along it.
Based on this, according to some embodiments, the robot may comprise a first sensing module for detecting the distance between the robot and the obstacle. As shown in fig. 5, before step S101 the method may further include: step S201, acquiring a detection signal of the first sensing module; and step S202, determining the distance between the robot and the obstacle according to the detection signal of the first sensing module. Accordingly, step S101 may be performed in response to determining that the distance between the robot and the obstacle is less than the first preset distance. The first sensing module can thus detect an obstacle before the robot collides with it, and the robot can decelerate and travel toward the obstacle in response to determining that the distance is below the first preset distance. The robot can therefore approach the obstacle quickly at its normal operating speed and decelerate to tap the obstacle only once the distance falls below the first preset distance, improving operating efficiency. In addition, decelerating in advance to tap the obstacle lightly avoids colliding with it at normal operating speed, reducing the possibility of collision damage.
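A sketch of this deceleration trigger (steps S201-S202) follows, under the same caveats as before; the sensor API (read, to_distance) and the threshold value are illustrative assumptions, not from the disclosure.

```python
# Assumed constants and sensor API (illustrative only).
PRESET_SPEED = 0.3            # m/s, normal operating speed
APPROACH_SPEED = 0.1          # m/s, reduced tapping speed
FIRST_PRESET_DISTANCE = 0.20  # m, first preset distance

def approach_obstacle(robot, first_sensor, heading):
    """Approach fast, then decelerate once the obstacle is close (S201-S202)."""
    robot.drive(heading, speed=PRESET_SPEED)           # fast approach
    while True:
        signal = first_sensor.read()                   # S201: detection signal
        distance = first_sensor.to_distance(signal)    # S202: distance estimate
        if distance < FIRST_PRESET_DISTANCE:
            robot.drive(heading, speed=APPROACH_SPEED)  # decelerate to tap
            return
```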
The first sensing module may, for example, comprise at least one of the following sensing modules: a camera module, a laser radar module, and an ultrasonic sensing module. The camera module may be configured to acquire an image of the robot's environment; the distance between the robot and an obstacle in the environment may then be detected based on binocular vision, for example, to locate the robot. Locating may also be performed based on structured light or time-of-flight (TOF) measurements of the distance between the robot and obstacles. The laser radar module can be configured to detect the distance to an obstacle from the time it takes an emitted laser pulse to be reflected back, and the ultrasonic sensing module can do the same using the round-trip time of an ultrasonic wave. According to some embodiments, the first sensing module may include at least two sensing modules, for example a camera module and a laser radar module, so that obstacles around the robot can be detected more accurately.
The camera module may be a stand-alone device (e.g., a camera, a video camera, a webcam, etc.) or may be included in various types of electronic devices (e.g., a mobile phone, a computer, a personal digital assistant, a reading aid, a tablet computer, etc.).
According to some embodiments, the first preset distance may be the distance between a set reference point on the robot and a set reference point on the obstacle. It may be, for example but without limitation, the distance between the center of the robot's orthographic projection on a horizontal plane and the center of the obstacle's orthographic projection on that plane. The choice is not limited here, as long as the robot can reach the obstacle within a short time once the distance between them is smaller than the first preset distance.
After the obstacle has been located, the first direction may be determined from the relative positional relationship between the robot and the obstacle. The robot decelerates and travels in the first direction toward the obstacle, taps it for the first time, and then adjusts its heading toward the obstacle by a small angle several times so that it travels along the obstacle. According to some embodiments, when the obstacle is directly in front of the robot, the first direction may be substantially perpendicular to the collision surface of the obstacle (e.g., a wall surface). When the obstacle is not directly in front of the robot (e.g., to the front left or front right), the first direction may form an angle of less than 90 degrees with the surface of the obstacle to be struck, so that the robot can still travel toward the obstacle in the first direction.
According to some embodiments, as shown in figs. 6 and 7, the robot may further comprise a chassis 10 and at least one collision detection assembly. Each collision detection assembly may include an air pressure sensing module 12 and a hose 11 disposed at least partially around the outer side of the chassis 10. The hose 11 is filled with a gas (e.g., air or another compressible gas), and the air pressure sensing module 12 detects the air pressure in the hose 11. In this case, step S102 may include: acquiring a detection signal of the air pressure sensing module; determining from the detection signal whether the air pressure in the hose has changed; and determining that the robot has collided with an obstacle in response to determining that the air pressure in the hose has changed. Whether the robot collides with an obstacle can thus be detected by monitoring the air pressure in the hose. In addition, the hose cushions the collision, protecting the robot from collision damage.
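The collision test itself can be as simple as comparing the hose pressure against a baseline. The sketch below shows both criteria mentioned in the surrounding text (absolute change and rate of change); the thresholds and sensor API are assumptions for illustration.

```python
# Assumed thresholds (illustrative values only).
PRESSURE_DELTA = 50.0  # Pa, absolute change treated as a collision
RATE_LIMIT = 500.0     # Pa/s, alternative rate-of-change criterion

def collided(pressure_sensor, baseline, dt):
    """Step S102: detect a tap from the air pressure in the hose."""
    p = pressure_sensor.read()   # current pressure in the hose
    change = abs(p - baseline)
    # Either a large enough change, or a fast enough change, counts as a hit.
    return change > PRESSURE_DELTA or change / dt > RATE_LIMIT
```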
Whether the robot has collided with an obstacle may be determined, for example but without limitation, according to whether the air pressure signal in the hose exceeds a set value. Alternatively, it may be determined according to the rate of change of the air pressure signal in the hose.
The home sweeping robot shown in fig. 7 comprises six collision detection assemblies, whose hoses are evenly distributed around the entire outer side of the chassis 10. This should not be understood as limiting the number or the arrangement of the collision detection assemblies; a different number and other arrangements are also contemplated. For example, the robot may include 1, 2, 4, 8, or any other number of collision detection assemblies. Wherever collision detection is needed at a point or section of the robot's periphery, a collision detection assembly may be arranged there accordingly. Preferably, for low cost and easy mounting and dismounting, the collision detection assemblies are arranged in pairs on the robot's periphery, preferably distributed symmetrically in the circumferential direction.
According to some embodiments, one or more collision detection assemblies may be arranged circumferentially around only a portion of the circumference of the robot (e.g., in the range of 60 °, 100 °, 120 °, 180 °, or 300 °), or in any other distributed manner (e.g., non-uniformly distributed), depending on the actual requirements of collision detection. In a preferred embodiment, a plurality of collision detection assemblies may be arranged more densely in a section where the robot is more susceptible to collision (e.g., the front side in the robot traveling direction), while a small number (e.g., 2 or only 1) of collision detection assemblies or no collision detection assemblies are arranged in a section where the robot is less susceptible to collision (e.g., the rear side in the robot traveling direction).
According to some embodiments, the at least one collision detection assembly may be disposed entirely on one horizontal plane, or on two different horizontal planes, according to the actual requirements of collision detection. In a preferred embodiment, the collision detection assemblies on one level may each be offset in the circumferential direction relative to the collision detection assemblies on the other level, so that the air pressure sensing modules of the assemblies on different levels are correspondingly offset from each other circumferentially. This arrangement advantageously compensates for possible gaps between the collision detection assemblies on the same horizontal plane: when the robot is struck at such a gap, an assembly on the other horizontal plane that covers the gap in horizontal projection can still detect the collision effectively.
The chassis 10 may be configured to carry the various components of the robot. Taking a home sweeping robot as an example, the robot may further include a controller (not shown), a cleaning mechanism 121, a moving mechanism 122, a driving mechanism, a housing 20, and the like. The camera module 111, the laser radar module 112, the cleaning mechanism 121, the moving mechanism 122, the driving mechanism, and the housing 20 may be fixed to the chassis 10, and the at least one collision detection assembly may be arranged outside the periphery of the chassis 10. The cleaning mechanism 121 is configured, for example, as a plurality of pivotable cleaning brushes arranged to clean the surface in contact with them. The moving mechanism 122 is, for example, a set of wheels that allows the robot to move freely. The driving mechanism may be any type of mechanism, such as a motor, capable of driving the robot in translation, rotation, and so on; it can drive the various parts of the robot to perform their operations, for example driving the camera module 111 to extend, retract, or rotate. The housing 20 may be an enclosure that protects the other components of the robot against intrusion by water, dust, and the like; it may have any shape, such as a flat cylinder or a humanoid shape. For other types of robots, corresponding parts can be arranged on the chassis as required to realize the robot's functions.
According to some embodiments, the hose 11 may be received in a form-fitting manner in a recess in the side wall 110 of the chassis 10. Thereby, the hose can be reliably and easily positioned on the side wall of the chassis, and the connection of the hose and the air pressure sensing module is firmer and more stable. The recess is designed as a partial circular section, but other shapes of the recess are also conceivable, as long as the recess can accommodate, preferably form-fittingly, the hose. The end of the hose 11 may be bent toward the radial inner side of the chassis 10 so as to pass through a hole provided in the sidewall 110 of the chassis 10 and be connected to the air pressure sensing module 12 on the other side of the sidewall 110.
In one embodiment, one air pressure sensing module 12 may be connected to each end of the hose 11, so that a collision can be sensed more rapidly and precisely. For example, when a certain point on the hose is struck, the air pressure sensing module closer to that point senses the collision sooner, allowing the robot to perform the corresponding operations more quickly.
In another embodiment, only one air pressure sensing module may be connected to only one end of the hose, while the other end of the hose is closed, thereby saving costs and reducing the possibility of air leakage.
According to some embodiments, a hose protector 13 may further be disposed on the outer side of the hose 11 to cushion impacts on the hose 11 and reduce the repeated compression and rebound caused by frequent impacts, thereby prolonging the service life of the hose and thus of the robot. The hose protector 13 may partially, preferably completely, surround the outside of the hose. In one embodiment, a single hose protector may be provided around the robot's circumference, surrounding all hoses in turn. In a further embodiment, several hose protectors can be provided in sections, i.e., one hose protector per hose. In another embodiment, hoses that are more susceptible to and/or more frequently impacted may be provided with hose protectors, while hoses that are less susceptible to and/or less frequently impacted may not. Fig. 7 shows one hose protector 13 arranged on the outside of each hose 11.
According to some embodiments, the hose protector 13 may be a flexible boot, although other forms are also contemplated. So that the hose protector effectively protects the hose without interfering too much with the collision-induced deformation of the hose, which could affect the air pressure sensing module's detection of pressure changes, the hose protector may be connected to the hose in a transition fit. Depending on the particular application requirements, a clearance or interference fit between the hose and the hose protector may also be used.
In the above, whether the robot collides with an obstacle is sensed by using the air pressure sensing module to detect changes of the air pressure in the hose at the robot's periphery, which keeps the robot in close proximity to the obstacle. Alternatively, the distance between the robot and the obstacle may be measured repeatedly while the robot travels toward the obstacle, and a collision may be taken to have occurred when that distance becomes sufficiently small. Whether the robot collides with the obstacle may also be determined in other ways, which are not limited here.
After it is determined that the robot has tapped the obstacle, step S103 may be performed: the robot moves away from the obstacle in the second direction.
According to some exemplary embodiments, in step S103 the robot may back away from the obstacle in the second direction. In this case, referring to fig. 2, step S103 may include: determining a first ray r1 whose endpoint is the collision point with the obstacle and whose direction of extension is opposite to the first direction D1; determining a perpendicular p that is perpendicular to the collision surface of the obstacle and passes through the collision point; determining a second ray r2 whose endpoint is the collision point and which lies on the same side of the perpendicular p as the first ray r1; determining the direction of extension of the second ray r2 as the second direction D2; and the robot backing away in the second direction to move away from the obstacle. This allows the robot to travel more closely along the obstacle; when applied to a sweeping robot, it improves the effect of edgewise sweeping. According to some embodiments, as shown in fig. 3, the second direction D2 may be parallel to the first direction D1; that is, in response to detecting the collision, the robot may stop after tapping the obstacle in the first direction and then back straight out to move away from the obstacle. The second ray may also form an angle greater than zero with the first ray, i.e., the lines along the second and first directions may form a nonzero included angle, as shown in fig. 2.
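The back-off geometry can be expressed with plain 2-D vector math. The sketch below is illustrative only: d1 and wall_normal are assumed to be unit vectors, offset_angle = 0 reproduces the parallel case of fig. 3, and a small nonzero offset_angle gives the case of fig. 2 where r2 stays on the same side of the perpendicular as r1.

```python
import math

def backoff_direction(d1, wall_normal, offset_angle=0.0):
    """Compute the second direction D2 for backing away (step S103).

    d1: unit vector of the first direction (pointing at the obstacle).
    wall_normal: outward unit normal of the collision surface.
    """
    r1 = (-d1[0], -d1[1])  # first ray r1: opposite to the first direction
    # Rotate r1 by a small offset; for small angles D2 remains on the same
    # side of the perpendicular p as r1, as in fig. 2.
    c, s = math.cos(offset_angle), math.sin(offset_angle)
    d2 = (c * r1[0] - s * r1[1], s * r1[0] + c * r1[1])
    # Sanity check: D2 must point away from the wall (positive component
    # along the outward normal), otherwise the robot would push into it.
    assert d2[0] * wall_normal[0] + d2[1] * wall_normal[1] > 0
    return d2
```

For example, backoff_direction((0.0, 1.0), (0.0, -1.0)) returns (0.0, -1.0), i.e., the robot backs straight out, as in fig. 3.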
According to some embodiments, the robot may comprise an acceleration sensing module for detecting the direction of the robot's acceleration. In response to the collision between the robot and the obstacle, the acceleration direction detected by the acceleration sensing module can be acquired, and the collision position (i.e., the position of the collision point) where the robot struck the obstacle while traveling in the first direction can be determined from that acceleration direction.
The extension direction of a ray refers to the direction in which the ray extends outward from its endpoint; that is, a ray is determined by an endpoint and an extension direction. The second ray defined by the second direction and the collision point is the ray whose endpoint is the collision point and whose extension direction is the second direction. The collision surface of the obstacle may refer to the tangent plane of the obstacle passing through the collision point.
The robot backing away can be understood as follows: referring to fig. 7, during forward travel the camera module 111 may always be located, relative to the center of the chassis 10, on the side closest to the forward direction. While the robot backs away, the camera module 111 is then located on the side facing away from the backing direction. The camera module can therefore capture the environment in front of the robot while it travels forward, and the robot can travel according to that environment image. In the present disclosure, unless otherwise stated, the robot traveling means the robot traveling forward, and the side on which the camera module 111 is located is defined as the front of the robot.
According to other exemplary embodiments, in step S103 the robot may instead adjust its heading and travel in the second direction to move away from the obstacle. In this case, as shown in fig. 4, step S103 may include: determining a first ray r1 whose endpoint is the collision point with the obstacle and whose direction of extension is opposite to the first direction D1; determining a perpendicular p that is perpendicular to the collision surface of the obstacle and passes through the collision point; determining a third ray r3 whose endpoint is the collision point and which lies on the opposite side of the perpendicular from the first ray; determining the direction of extension of the third ray r3 as the second direction D2; and the robot traveling in the second direction to move away from the obstacle. This can improve the efficiency of edgewise running; when applied to a sweeping robot, it can improve the efficiency of edgewise cleaning. In this case, according to some embodiments, the third direction D3 may be parallel to the first direction D1. The third direction may also form a nonzero angle with the first direction; for example, the heading may be adjusted clockwise or counterclockwise relative to the first direction (the specific direction may be chosen according to the position at which the robot collided with the obstacle), and the robot travels in the adjusted third direction. The collision point and the rays may be determined in the same manner as in the exemplary embodiment described above.
The surfaces of the obstacles illustrated in figs. 2 to 4 are planes, but the present disclosure is not limited to planar obstacle surfaces and may also be applied to obstacles with curved surfaces. For an obstacle with a curved surface, the collision surface is the tangent plane at the collision point.
According to some embodiments, step S103 may further comprise: determining the distance between the robot and the obstacle in the second direction; and the robot moving away from the obstacle in the second direction while keeping that distance no larger than a second preset distance. The robot's heading can thus be adjusted after it has moved only a certain distance away from the obstacle, improving cleaning efficiency. The second preset distance can be as small as possible, as long as it still leaves the robot room to adjust its heading toward the obstacle.
After the robot has tapped the obstacle and, in response to detecting the collision, moved away from it, step S104 may be performed: the robot travels in the third direction at a speed lower than the preset speed, the included angle between the first direction and the third direction being smaller than the preset angle.
According to some embodiments, the robot may further comprise an acceleration sensing module for detecting the direction of the robot's acceleration. Step S104 may include: in response to determining that the robot has collided with the obstacle, acquiring the acceleration direction detected by the acceleration sensing module; determining the position of the collision between the robot and the obstacle from the acceleration direction; and determining the third direction from that collision position. For example, when the collision position is at the front right of the robot, the heading may be adjusted counterclockwise relative to the first direction, i.e., the third direction may be obtained by swinging the first direction counterclockwise, so that the robot can continue forward along the obstacle. When the collision position is at the front left of the robot, the heading may be adjusted clockwise relative to the first direction, so that the robot can likewise continue forward along the obstacle. The included angle between the first direction and the third direction may be 0° to 15°, for example 2°.
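A sketch of this heading adjustment follows; the body-frame sign convention (positive lateral acceleration means the robot was pushed to the left, i.e., struck on the right front) and the 2° step are assumptions for illustration.

```python
import math

STEP_ANGLE = math.radians(2)  # within the 0-15 degree preset range (assumed)

def third_direction(first_heading, accel_sensor):
    """Step S104: derive the third direction from the collision side."""
    ax, ay = accel_sensor.read_direction()  # acceleration direction at impact
    # Assumed convention: an impact on the right front pushes the robot to
    # the left, giving a positive lateral (leftward) component ay.
    collision_on_right = ay > 0
    sign = 1 if collision_on_right else -1  # CCW for right-front impacts
    return first_heading + sign * STEP_ANGLE
```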
According to some embodiments, the acceleration sensing module may be an angular acceleration sensing module, such as but not limited to a gyroscope.
It should be noted that the present disclosure is not limited to using the acceleration sensing module to determine the collision position. According to further embodiments, the robot may comprise a plurality of the collision detection assemblies, their hoses distributed along the circumference of the chassis so as to surround at least part of the robot's outer side. In this case, determining whether the robot collides with the obstacle may include: determining whether the air pressure in any of the plurality of hoses has changed; and determining that the robot has collided with an obstacle in response to determining that the air pressure in at least one of the hoses has changed. Accordingly, the robot traveling toward the obstacle in the third direction may include: determining the collision position from the position of the at least one hose whose air pressure changed; and determining the third direction from that collision position.
According to some embodiments, the position of each hose may be marked in advance. For example, the center of the chassis may be taken as the center of a circle, and a zero-degree reference line may be defined as the ray from that center along the robot's forward direction. The position of each hose can then be represented by its central angle and by the angle between the reference line and the line connecting one end of the hose to the chassis center, so that the collision position can be determined from the position of the at least one hose whose air pressure changed.
According to some embodiments, one air pressure sensing module may be connected to each of the two ends of each hose, making it possible to determine which section of a hose was struck. Because the collision point on the hose is at different distances from the two air pressure sensing modules, the times at which the two modules receive the pressure-change signal, and the shapes of their pressure curves, may differ; from these differences the robot can calculate which section of the hose was struck. Accordingly, determining the collision position from the position of the at least one hose may include: determining where on each such hose the collision occurred from the detection signals of its two air pressure sensing modules; and determining the collision position of the robot with the obstacle from the collision position on the at least one hose. The collision position can thus be determined accurately. In addition, when some point on the hose is struck, the air pressure sensing module closer to that point senses the impact sooner and can send corresponding signals to the other processing components, allowing the robot to react more quickly.
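A sketch of this two-sensor localisation follows. The simple time-difference model below (a pressure wave travelling at constant speed along the hose) and all constants are assumptions; a real implementation would calibrate against the measured pressure curves.

```python
# Assumed geometry and wave speed (illustrative values only).
CHASSIS_RADIUS = 0.17  # m, radius of the chassis circumference
WAVE_SPEED = 340.0     # m/s, pressure-wave speed along the hose

def collision_bearing(hose_arc, t_start_end, t_far_end):
    """Locate the impact on one hose from the two pressure sensors' timing.

    hose_arc: (start, end) bearings of the hose in radians, measured from
    the zero-degree reference ray along the robot's forward direction.
    t_start_end / t_far_end: arrival times of the pressure wave at the
    sensors on the start end and the far end of the hose.
    """
    start, end = hose_arc
    arc_m = CHASSIS_RADIUS * (end - start)  # hose length in metres
    # If the impact is at fraction f along the hose, the wave reaches the
    # start-end sensor after f*arc_m/v and the far-end sensor after
    # (1-f)*arc_m/v, so f follows from the time difference:
    f = 0.5 + WAVE_SPEED * (t_start_end - t_far_end) / (2.0 * arc_m)
    f = min(max(f, 0.0), 1.0)         # clamp to the hose
    return start + f * (end - start)  # bearing of the impact point
```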
According to some embodiments, instead of starting the edgewise running logic in response to detecting that the distance between the robot and the obstacle is smaller than the first preset distance, the edgewise running logic may be started in response to the air pressure sensing module sensing a collision, after which steps S101 to S104 are performed. In that case the collision sensed by the air pressure sensing module starts the edgewise running logic, and the acceleration sensor is used to determine the collision position, from which the first direction is determined so that the robot can travel along the obstacle.
Based on this, as shown in fig. 8, the method may include, before step S101: step S301, acquiring the air pressure signal in the hose detected by the air pressure sensing module; step S302, determining from the air pressure signal whether the robot has collided with an obstacle; step S303, in response to determining that the robot has collided with an obstacle, acquiring the acceleration direction detected by the acceleration sensing module; step S304, determining the collision position from the acceleration direction; and step S305, determining the first direction from the collision position. For example, when the collision position is at the front right of the robot, the first direction may be obtained by adjusting the heading counterclockwise relative to the robot's initial heading before the collision, so that the robot can travel along the obstacle; when the collision position is at the front left, the first direction may be obtained by adjusting the heading clockwise. In addition, the hose cushions the collision and protects the robot from collision damage. Of course, other ways of sensing the impact and determining the impact location may be used, and these are not limited here.
In the present disclosure, in response to detecting that the robot has collided with an obstacle, the robot may immediately stop and then move away from the obstacle. This prevents the robot from being bounced off by the collision, which would make its heading difficult to control accurately.
According to some embodiments, the edgewise running logic may be started either in response to determining that the distance between the robot and the obstacle is less than the first preset distance, or in response to the air pressure sensing module sensing a collision. In this way, when the first sensing module fails to detect an obstacle in time, or an obstacle lies in the first sensing module's blind zone, the air pressure sensing module can still sense the collision and start the edgewise running logic, giving better flexibility and adaptability. Moreover, when the first sensing module does detect the obstacle, the robot can decelerate in advance and only tap it lightly, reducing the number of collisions at normal operating speed and preventing damage from excessive hard impacts.
As described above, the edgewise running logic may be initiated in response to detecting that there is an obstacle in front of the robot at a distance smaller than the first preset distance, or in response to sensing a collision between the robot and an obstacle. Both of these start the edgewise running logic passively. It will be appreciated that the edgewise logic may also be started actively in response to a start command entered by the user (e.g., via a key). An exemplary embodiment describing how a user actively initiates the edgewise running logic is given below.
According to an exemplary embodiment, the robot may be actively controlled to look for obstacles and start the edgewise running logic. In this case, as shown in fig. 9, the traveling method may further include: step S401, establishing a map of the scene in which the robot is located, the map containing obstacles with marked positions. Before step S101, the method may further include: step S402, determining the robot's real-time position in the scene; step S403, determining an obstacle to be collided with; and step S404, determining the first direction from the relative positional relationship between that obstacle and the robot, so that the robot travels toward it in the first direction. A user can thus command the robot to actively seek an obstacle and start the edgewise running logic, improving the flexibility and generality of robot control. According to some embodiments, the robot may first travel in the first direction toward the obstacle at its normal operating speed, and then decelerate and continue toward it in response to detecting that the distance between them is less than the first preset distance (specific implementations have been described above). The robot can thereby approach the target obstacle quickly.
According to some embodiments, the robot may include a first sensing module (e.g., a camera module and a laser radar module). Establishing the map of the scene may include: while the robot travels, determining from the first sensing module's detection signal whether an obstacle to be marked has been detected; and, in response to determining that one has, marking it in the map according to the relative positional relationship between the robot and the obstacle. For example, a positioning module (e.g., a GPS module) may provide the robot's real-time position, while the first sensing module's detection signal provides the obstacle's position relative to the robot; the obstacle's position can then be computed from the two and marked in the map. It will be understood that this is not the only way to build a map of the robot's scene: for example, multiple partial images of the scene may be acquired based on binocular vision and stitched into an overall image, which can serve as a map; a coordinate system may be established on the overall image, and the obstacles and their positions determined from the image and the coordinate system.
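A sketch of this marking step follows; the pose and map interfaces are hypothetical, and the detection is assumed to arrive as a (range, bearing) pair in the robot's body frame.

```python
import math

def mark_obstacle(scene_map, robot_pose, detection):
    """Mark a detected obstacle in the scene map (map-building step).

    robot_pose: (x, y, heading) of the robot in the scene frame.
    detection: (range, bearing) of the obstacle relative to the robot,
    as reported by the first sensing module.
    """
    x, y, heading = robot_pose
    rng, bearing = detection
    # Convert the relative detection into an absolute scene position.
    ox = x + rng * math.cos(heading + bearing)
    oy = y + rng * math.sin(heading + bearing)
    scene_map.add_obstacle((ox, oy))  # record the obstacle in the map
```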
After the map of the robot's scene has been built, the robot may operate according to the map. For example, in response to detecting a start command entered by the user, the robot may actively seek an obstacle and initiate the edgewise running logic. Of course, the robot may also operate according to other logic based on the map.
Due to human activity, new obstacles may appear in the robot's scene, or obstacles may exist that were never marked in the map because they lie in the first sensing module's blind zone; such unmarked obstacles interfere with controlling the robot's operation.
To solve the above technical problem, according to some embodiments, establishing the map of the robot's scene may further include: in response to determining that the robot has collided with an obstacle, determining whether the obstacle currently colliding with the robot is included in the map; and, in response to determining that it is not, marking it in the map. In this way, when the robot collides with an obstacle that turns out not to be marked in the map, the map can be updated and the obstacle marked. Whether the robot collides with an obstacle can be sensed using the collision detection assembly comprising the air pressure sensing module and the hose. Alternatively, after marking the obstacle in the map, the edgewise running logic may be initiated directly in response to determining that the robot has collided with the obstacle (the specific principles have been described above).
According to some embodiments, determining whether the obstacle currently colliding with the robot is included in the map may include: determining the robot's real-time position in the scene at the moment of the collision; determining the obstacle's position to be marked in the scene from the collision position on the robot and the robot's real-time position at the moment of collision; comparing the position to be marked with the positions of the obstacles already marked in the map; and, in response to the comparison showing that the position to be marked coincides with the position of a marked obstacle, determining that the obstacle currently colliding with the robot is included in the map. The robot's real-time position and the position of the obstacle currently colliding with it can be determined as described above, and the collision position can likewise be determined as described above, for example using the acceleration sensing module.
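A sketch of this map check follows; the matching tolerance, the rim-point model of the collision position, and the map API are assumptions for illustration.

```python
import math

MATCH_TOLERANCE = 0.10  # m, positions closer than this count as the same

def obstacle_known(scene_map, robot_pose, collision_bearing, rim_radius):
    """Check whether the obstacle just struck is already marked in the map."""
    x, y, heading = robot_pose
    # Position to be marked: the point on the robot's rim that was struck.
    px = x + rim_radius * math.cos(heading + collision_bearing)
    py = y + rim_radius * math.sin(heading + collision_bearing)
    # Compare against every obstacle position already marked in the map.
    for ox, oy in scene_map.obstacles():
        if math.hypot(px - ox, py - oy) < MATCH_TOLERANCE:
            return True                # same position: already marked
    scene_map.add_obstacle((px, py))   # unknown obstacle: update the map
    return False
```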
According to some embodiments, after the robot has tapped the obstacle several times at the reduced speed, if it no longer collides with the obstacle within a period of time, it can be made to exit the edgewise logic and run at its normal traveling speed, improving operating efficiency.
After the edgewise run logic is started, the robot may keep colliding with the obstacle despite adjusting its traveling direction many times; if the obstacle cannot be avoided, operating efficiency suffers severely.
Based on this, according to some embodiments, before step S104 (the robot traveling in the third direction), the method may further include: determining whether the number of collisions of the robot after it began traveling at a speed less than the preset speed has reached a set number; and in response to determining that it has, having the robot avoid the obstacle according to a first preset obstacle avoidance logic. Thus, when the edgewise run logic cannot avoid the obstacle, the robot switches to another obstacle avoidance logic in order to get past the obstacle and improve operating efficiency. The set number may be chosen according to the actual scene in which the robot is located; its specific value is not limited here.
According to some embodiments, the robot avoiding the obstacle according to the first preset obstacle avoidance logic may include: the robot continuing to move away from the obstacle; and, in response to determining that the distance between the robot and the obstacle in the second direction is greater than a second preset distance, the robot traveling in a fourth direction, where the angle between the fourth direction and the first direction may be greater than or equal to 90°. That is, after step S103 is executed, if the number of collisions has reached the set number, the robot keeps moving away from the obstacle (for example, beyond the first preset distance described above) and adjusts its traveling direction by a large angle to avoid the obstacle. The fourth direction is derived from the first direction in the same way as the third direction: for example, the traveling direction may be rotated counterclockwise or clockwise relative to the first direction according to the collision position of the robot with the obstacle, and the robot then travels in the adjusted fourth direction to avoid the obstacle. The difference is that the angle between the fourth direction and the first direction is greater than or equal to 90°, whereas the angle between the third direction and the first direction is less than the preset angle.
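The following sketch illustrates the switch between the third direction (small adjustment, edgewise run logic) and the fourth direction (turn of at least 90°, first preset obstacle avoidance logic) as a function of the collision count; the function name and the numeric defaults are assumptions for the example:

    import math

    def choose_heading(first_direction, collisions, set_number=3,
                       preset_angle=math.radians(15), clockwise=True):
        """Pick the next heading after the robot backs away.

        Hypothetical sketch: below set_number collisions the heading is
        adjusted by less than preset_angle (the third direction); once
        set_number is reached, the turn is at least 90 degrees (the
        fourth direction). The defaults are assumed tuning values, not
        values fixed by the disclosure.
        """
        sign = -1.0 if clockwise else 1.0
        if collisions < set_number:
            return first_direction + sign * preset_angle * 0.5  # third direction
        return first_direction + sign * math.pi / 2.0           # fourth direction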
It should be noted that the first preset obstacle avoidance logic is not limited to the above. For example, the height of an obstacle may be detected, and in response to that height being less than a set height, the robot may execute obstacle crossing logic and simply drive over the obstacle. In practice a robot may carry several obstacle avoidance logics; a priority may be assigned to each, and the logics executed in priority order until one of them succeeds in avoiding the obstacle.
According to some embodiments, the edgewise run logic can be given the highest priority, so that travel along an obstacle is achieved first, enabling, for example, the edgewise cleaning function of a household sweeping robot. When the edgewise run logic cannot avoid the obstacle, the remaining obstacle avoidance logics are executed in order of decreasing priority until the obstacle is avoided, after which the robot can switch back from obstacle avoidance to its required operating logic. According to some embodiments, the switch to the required operating logic may also be performed, for example but not limited to, in response to detecting a corresponding instruction input by the user.
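A minimal dispatcher expressing this priority scheme; the callable-based interface and the stand-in logic names are assumptions, not the disclosure's API:

    def avoid_obstacle(robot, logics):
        """Run obstacle avoidance logics in priority order.

        Hypothetical dispatcher: logics is a list of callables sorted
        from highest to lowest priority (edgewise run logic first);
        each returns True once it has avoided the obstacle.
        """
        for logic in logics:
            if logic(robot):
                return True  # avoided; resume the required operating logic
        return False

    # Usage sketch (the logic names are stand-ins):
    # avoid_obstacle(robot, [edgewise_run, large_angle_turn, obstacle_crossing])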
According to another aspect of the present disclosure, there is also provided an electronic circuit, comprising: circuitry configured to perform the steps according to the travel method described above.
According to another aspect of the present disclosure, there is also provided a robot including: an electronic circuit as described above.
According to another aspect of the present disclosure, there is also provided an electronic apparatus, including: a processor; and a memory storing a program comprising instructions that, when executed by the processor, cause the electronic device to perform the travel method described above.
According to another aspect of the present disclosure, there is also provided a non-transitory computer-readable storage medium storing a program, the program comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to perform the travel method described above.
Fig. 10 is a block diagram illustrating an example of an electronic device according to an exemplary embodiment of the present disclosure. It is noted that the structure shown in fig. 10 is only one example, and the electronic device of the present disclosure may include only one or more of the constituent parts shown in fig. 10 according to a specific implementation.
The electronic device 2000 may be, for example, a general purpose computer (e.g., a laptop computer, a tablet computer, or the like).
The electronic device 2000 may be configured to capture an image, process the captured image, and provide an audible prompt in response to data obtained by the processing.
The electronic device 2000 may include a camera 2004 for acquiring images. The camera 2004 may include, but is not limited to, a webcam or a camera, etc. The electronic device 2000 may further include a sound output circuit 2005, the sound output circuit 2005 being configured to output a sound prompt. The sound output circuit 2005 may include, but is not limited to, an earphone, a speaker, a vibrator, or the like, and its corresponding driving circuit. The electronic device 2000 may further comprise electronic circuitry 2100, the electronic circuitry 2100 comprising circuitry configured to perform the steps of the travel method described above (e.g., the method steps shown in the flowcharts of figs. 1, 5, 8, and 9).
According to some embodiments, the electronic device 2000 may further include image processing circuitry 2006, and the image processing circuitry 2006 may include circuitry configured to perform various image processing on images. The image processing circuitry 2006, for example, may include, but is not limited to, one or more of the following: circuitry configured to reduce noise in the image, circuitry configured to deblur the image, circuitry configured to geometrically correct the image, and so forth.
One or more of the various circuits described above (e.g., sound output circuit 2005, image processing circuit 2006, electronic circuit 2100) may be implemented using custom hardware and/or may be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. For example, one or more of the various circuits described above can be implemented by programming hardware (e.g., programmable logic circuits including Field Programmable Gate Arrays (FPGAs) and/or Programmable Logic Arrays (PLAs)) in an assembly language or hardware programming language (such as VERILOG, VHDL, C++) using logic and algorithms according to the present disclosure.
According to some embodiments, the electronic device 2000 may also include communications circuitry 2010, which may be any type of device or system enabling communication with an external device and/or a network, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset, such as a Bluetooth device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication device, and/or the like.
According to some embodiments, the electronic device 2000 may also include an input device 2011, which may be any type of device capable of inputting information to the electronic device 2000, and may include, but is not limited to, various sensors, a mouse, a keyboard, a touch screen, buttons, levers, a microphone, and/or a remote control.
According to some embodiments, the electronic device 2000 may also include an output device 2012, which may be any type of device capable of presenting information, and may include, but is not limited to, a display, a visual output terminal, a vibrator, and/or a printer. The visual output terminal may facilitate a user or a maintenance worker obtaining output information from the electronic device 2000.
According to some embodiments, the electronic device 2000 may further comprise a processor 2001. The processor 2001 may be any type of processor and may include, but is not limited to, one or more general purpose processors and/or one or more special purpose processors (e.g., special purpose processing chips). The processor 2001 may be, for example but not limited to, a central processing unit (CPU) or a microprocessor (MPU). The electronic device 2000 may also include a working memory 2002, which may store programs (including instructions) and/or data (e.g., images, text, sound, and other intermediate data) useful for the operation of the processor 2001, and may include, but is not limited to, a random access memory and/or a read only memory device. The electronic device 2000 may also include a storage device 2003, which may be any non-transitory storage device capable of storing data and may include, but is not limited to, a disk drive, an optical storage device, solid-state memory, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, an optical disc or any other optical medium, a ROM (read only memory), a RAM (random access memory), a cache memory, and/or any other memory chip or cartridge, and/or any other medium from which a computer can read data, instructions, and/or code. The working memory 2002 and the storage device 2003 may be collectively referred to as "memory" and may in some cases be used concurrently with each other.
According to some embodiments, the processor 2001 may control and schedule at least one of the camera 2004, the sound output circuit 2005, the image processing circuit 2006, the communication circuit 2010, the electronic circuit 2100, and other various devices and circuits included in the electronic device 2000. According to some embodiments, at least some of the various components described in FIG. 10 may be interconnected and/or in communication by a bus 2013.
Software elements (programs) may reside in the working memory 2002 including, but not limited to, an operating system 2002a, one or more application programs 2002b, drivers, and/or other data and code.
According to some embodiments, instructions for performing the aforementioned control and scheduling may be included in the operating system 2002a or one or more application programs 2002 b.
According to some embodiments, instructions to perform method steps described in the present disclosure (e.g., the method steps shown in the flowcharts of fig. 1, 5, 8, and 9) may be included in the one or more application programs 2002b, and the various modules of the electronic device 2000 described above may be implemented by the processor 2001 reading and executing the instructions of the one or more application programs 2002 b. In other words, the electronic device 2000 may comprise a processor 2001 as well as a memory (e.g. working memory 2002 and/or storage device 2003) storing a program comprising instructions which, when executed by the processor 2001, cause the processor 2001 to perform a method according to various embodiments of the present disclosure.
According to some embodiments, some or all of the operations performed by at least one of the sound output circuit 2005, the image processing circuit 2006, and the electronic circuit 2100 may be implemented by the processor 2001 reading and executing instructions of the one or more application programs 2002b.
Executable code or source code of the instructions of the software elements (programs) may be stored in a non-transitory computer-readable storage medium such as the storage device 2003, and may be loaded into the working memory 2002 (possibly after compilation and/or installation) for execution. Accordingly, the present disclosure provides a computer-readable storage medium storing a program comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to perform a method as described in the various embodiments of the present disclosure. According to another embodiment, the executable code or source code of the instructions of the software elements (programs) may also be downloaded from a remote location.
It will also be appreciated that various modifications may be made in accordance with specific requirements. For example, customized hardware might also be used and/or individual circuits, units, modules, or elements might be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. For example, some or all of the circuits, units, modules, or elements encompassed by the disclosed methods and apparatus may be implemented by programming hardware (e.g., programmable logic circuitry including Field Programmable Gate Arrays (FPGAs) and/or Programmable Logic Arrays (PLAs)) in an assembly language or hardware programming language such as VERILOG, VHDL, C++, using logic and algorithms in accordance with the present disclosure.
The processor 2001 in the electronic device 2000 may be distributed over a network according to some embodiments. For example, some processes may be performed using one processor while other processes may be performed by another processor that is remote from the one processor. Other modules of the electronic device 2000 may also be similarly distributed. As such, the electronic device 2000 may be interpreted as a distributed computing system performing processing at multiple locations.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the above-described methods, systems, and apparatus are merely exemplary embodiments or examples, and that the scope of the present invention is not limited by these embodiments or examples but only by the claims as issued and their equivalents. Various elements in the embodiments or examples may be omitted or replaced with equivalents thereof. Further, the steps may be performed in an order different from that described in the present disclosure. Further, various elements in the embodiments or examples may be combined in various ways. It should be understood that, as technology evolves, many of the elements described herein may be replaced with equivalent elements appearing after the present disclosure.
Some exemplary aspects of the disclosure are described below.
Aspect 1. A method of traveling of a robot, comprising:
the robot travels toward an obstacle in a first direction at a speed less than a preset speed;
determining whether the robot collides with an obstacle;
in response to determining that the robot collides with an obstacle, the robot moves away from the obstacle in a second direction; and
the robot travels toward the obstacle in a third direction at a speed less than the preset speed, wherein an included angle between the first direction and the third direction is less than a preset angle.
Aspect 2 the method of traveling of aspect 1, wherein the robot includes a first sensing module,
wherein before the robot travels toward the obstacle in the first direction, the traveling method further comprises:
acquiring a detection signal of the first sensing module; and
determining the distance between the robot and an obstacle according to the detection signal of the first sensing module,
wherein the robot traveling toward the obstacle in the first direction is performed in response to determining that the distance between the robot and the obstacle is less than a first preset distance.
Aspect 3. the method of traveling of aspect 2, wherein the first sensing module includes at least one of the following sensing modules: camera module, laser radar module and ultrasonic sensing module.
Aspect 4. The method of traveling according to aspect 1, wherein the robot includes a chassis and at least one collision detection assembly, each collision detection assembly including an air pressure sensing module and a hose disposed at least partially around the outside of the chassis, the inside of the hose being filled with gas, the air pressure sensing module being configured to detect the air pressure in the hose, wherein determining whether the robot collides with an obstacle includes:
acquiring a detection signal of the air pressure sensing module;
determining whether the air pressure in the hose changes or not according to the detection signal of the air pressure sensing module; and
determining that the robot collides with an obstacle in response to determining that the air pressure in the hose changes.
Aspect 5. the travel method of aspect 4, wherein the robot further comprises an acceleration sensing module for detecting an acceleration direction of the robot,
wherein the robot traveling in a third direction toward the obstacle comprises:
in response to determining that the robot collides with the obstacle, acquiring the acceleration direction detected by the acceleration sensing module;
determining the collision position of the robot with the obstacle according to the acceleration direction; and
determining the third direction according to the collision position of the robot with the obstacle.
Aspect 6 the method of traveling of aspect 5, wherein the acceleration sensing module is an angular acceleration sensing module.
Aspect 7. The method of traveling of aspect 4, wherein the robot includes a plurality of the collision detection assemblies, and wherein the robot traveling toward the obstacle in the third direction includes:
determining the collision position of the robot with the obstacle according to the position of the at least one hose whose air pressure has changed; and
determining the third direction according to the collision position of the robot with the obstacle.
Aspect 8. The method of traveling of aspect 7, wherein an air pressure sensing module is connected to each of the two ends of the hose,
wherein determining the collision position of the robot with the obstacle from the position of the at least one hose comprises:
determining the collision position of each of the at least one hose with the obstacle according to the detection signals of that hose's two air pressure sensing modules; and
determining the collision position of the robot with the obstacle according to the collision position of the at least one hose with the obstacle.
Aspect 9. The method of traveling according to aspect 1, wherein the robot includes a chassis, at least one collision detection assembly, and an acceleration sensing module, each collision detection assembly including an air pressure sensing module and a hose disposed at least partially around the outside of the chassis, the inside of the hose being filled with gas, the air pressure sensing module being configured to detect the air pressure in the hose, and the acceleration sensing module being configured to detect an acceleration direction of the robot,
wherein before the robot travels toward the obstacle in the first direction, the traveling method further comprises:
acquiring an air pressure signal in the hose detected by the air pressure sensing module;
determining whether the robot collides with an obstacle according to the air pressure signal;
in response to determining that the robot collides with the obstacle, acquiring the acceleration direction detected by the acceleration sensing module;
determining the collision position of the robot with the obstacle according to the acceleration direction; and
determining the first direction according to the collision position of the robot with the obstacle.
Aspect 10 the method of traveling of aspect 1, further comprising:
establishing a map of the scene in which the robot is located, wherein the map comprises obstacles whose positions are marked.
Aspect 11 the method of traveling according to aspect 10, before the robot travels toward the obstacle in the first direction, further comprising:
determining a real-time position of the robot in a scene;
determining an obstacle to be collided with; and
determining the first direction according to the relative positional relationship between the obstacle to be collided with and the robot.
Aspect 12 the method of traveling of aspect 10, wherein the establishing a map of the scene in which the robot is located includes:
in response to determining that a robot collides with an obstacle, determining whether the obstacle currently colliding with the robot is included in the map; and
in response to determining that the obstacle is not included in the map, marking the obstacle in the map.
Aspect 13. The travel method of aspect 12, wherein determining whether an obstacle currently colliding with the robot is included in the map includes:
determining the real-time position of the robot in the scene when the robot collides with the obstacle;
determining the position of the obstacle to be marked in the scene according to the collision position of the robot and the obstacle and the real-time position of the robot in the scene when the robot collides with the obstacle;
comparing the position to be marked with the position of the marked obstacle in the map; and
in response to the comparison result showing that the position to be marked is the same as the position of a marked obstacle in the map, determining that the obstacle currently colliding with the robot is included in the map.
Aspect 14 the method of traveling of aspect 10, wherein the robot includes a first sensing module,
wherein, establishing the map of the scene where the robot is located further comprises:
determining whether an obstacle to be marked is detected or not according to the detection signal of the first sensing module in the traveling process of the robot; and
in response to determining that an obstacle to be marked is detected, marking the obstacle in the map according to a relative positional relationship between the robot and the obstacle to be marked.
Aspect 15. The method of traveling of any of aspects 1-14, wherein, in response to determining that the robot collides with the obstacle, the robot moving away from the obstacle in the second direction includes:
determining a first ray by taking a collision point with an obstacle as an end point, wherein the extending direction of the first ray is opposite to the first direction;
determining a perpendicular line perpendicular to a collision surface of the obstacle and passing through the collision point;
determining a second ray which is positioned on the same side of the vertical line as the first ray by taking a collision point with an obstacle as an end point;
determining an extension direction of the second ray as the second direction; and
the robot backs up in a second direction to get away from the obstacle.
Aspect 16. The method of traveling of aspect 15, wherein the second direction is parallel to the first direction.
Aspect 17. The method of traveling of any of aspects 1-14, wherein, in response to determining that the robot collides with the obstacle, the robot moving away from the obstacle in the second direction includes:
determining a first ray by taking a collision point with an obstacle as an end point, wherein the extending direction of the first ray is opposite to the first direction;
determining a perpendicular line perpendicular to a collision surface of the obstacle and passing through the collision point;
determining a third ray which is positioned on the different side of the vertical line from the first ray by taking a collision point with an obstacle as an end point;
determining an extension direction of the third ray as the second direction; and
the robot travels in a second direction to move away from the obstacle.
Aspect 18. The method of traveling of aspect 17, wherein the first direction is parallel to the third direction.
Aspect 19. The method of traveling of any of aspects 1-14, wherein, in response to determining that the robot collides with the obstacle, the robot moving away from the obstacle in the second direction includes:
determining a distance between the robot and an obstacle in a second direction; and
the robot moving away from the obstacle along the second direction such that the distance between the robot and the obstacle in the second direction is not greater than a second preset distance.
Aspect 20. The method of traveling of any one of aspects 1 to 14, wherein the preset angle is 0° to 15°.
Aspect 21 the method of traveling of any of aspects 1-14, wherein the robot is a sweeping robot.
Aspect 22. The method of traveling of aspect 21, wherein the obstacle comprises a wall.
Aspect 23 the method of traveling of any of aspects 1-14, before the robot travels toward the obstacle in the third direction, further comprising:
determining whether the number of collisions of the robot after traveling at a speed less than the preset speed reaches a set number; and
in response to determining that the number of collisions of the robot after traveling at a speed less than the preset speed reaches a set number, the robot avoids an obstacle according to a first preset obstacle avoidance logic.
Aspect 24 the method of traveling of aspect 23, wherein the robot avoiding the obstacle according to a first preset obstacle avoidance logic comprises:
the robot continues to move away from the obstacle; and
in response to determining that the distance between the robot and the obstacle in the second direction is greater than a second preset distance, the robot travels in a fourth direction, wherein an angle between the fourth direction and the first direction is greater than or equal to 90 °.
Aspect 25 is an electronic circuit comprising:
circuitry configured to perform the steps of the travel method of any of aspects 1-24.
Aspect 26 a robot, comprising:
the electronic circuit of aspect 25.
Aspect 27. An electronic device, comprising:
a processor; and
a memory storing a program comprising instructions that, when executed by the processor, cause the electronic device to perform the travel method of any of aspects 1-24.
Aspect 28. A non-transitory computer readable storage medium storing a program, the program comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to perform the method of traveling according to any of aspects 1-24.

Claims (10)

1. A method of travel of a robot, comprising:
the robot travels toward an obstacle in a first direction at a speed less than a preset speed;
determining whether the robot collides with an obstacle;
in response to determining that the robot collides with an obstacle, the robot moves away from the obstacle in a second direction; and
the robot travels toward the obstacle in a third direction at a speed less than the preset speed, wherein an included angle between the first direction and the third direction is less than a preset angle.
2. The method of travel of claim 1, wherein the robot includes a first sensing module,
wherein before the robot travels toward the obstacle in the first direction, the traveling method further comprises:
acquiring a detection signal of the first sensing module; and
determining the distance between the robot and an obstacle according to the detection signal of the first sensing module,
wherein the robot traveling toward the obstacle in the first direction is performed in response to determining that the distance between the robot and the obstacle is less than a first preset distance.
3. The travel method of claim 1, wherein the robot includes a chassis and at least one collision detection assembly, each of the collision detection assemblies including an air pressure sensing module and a hose disposed at least partially around the outside of the chassis, the inside of the hose being filled with gas, the air pressure sensing module being configured to detect the air pressure in the hose,
wherein determining whether the robot collides with an obstacle comprises:
acquiring a detection signal of the air pressure sensing module;
determining whether the air pressure in the hose changes or not according to the detection signal of the air pressure sensing module; and
determining that the robot collides with an obstacle in response to determining that the air pressure in the hose changes.
4. The travel method of claim 3, wherein the robot further comprises an acceleration sensing module for detecting an acceleration direction of the robot,
wherein the robot traveling in a third direction toward the obstacle comprises:
in response to determining that the robot collides with the obstacle, acquiring the acceleration direction detected by the acceleration sensing module;
determining the collision position of the robot with the obstacle according to the acceleration direction; and
determining the third direction according to the collision position of the robot with the obstacle.
5. The travel method of claim 1, further comprising:
establishing a map of the scene in which the robot is located, wherein the map comprises obstacles whose positions are marked.
6. The method of traveling of claim 5, before the robot travels toward the obstacle in the first direction, further comprising:
determining a real-time position of the robot in a scene;
determining an obstacle to be collided with; and
determining the first direction according to the relative positional relationship between the obstacle to be collided with and the robot.
7. An electronic circuit, comprising:
circuitry configured to perform the steps of the method of traveling of any of claims 1-6.
8. A robot, comprising:
the electronic circuit of claim 7.
9. An electronic device, comprising:
a processor; and
a memory storing a program comprising instructions that, when executed by the processor, cause the electronic device to perform the travel method of any of claims 1-6.
10. A non-transitory computer readable storage medium storing a program, the program comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to perform the method of traveling of any of claims 1-6.
CN202010528602.XA 2020-06-11 2020-06-11 Robot and its moving method, equipment, circuit and medium Pending CN111638719A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010528602.XA CN111638719A (en) 2020-06-11 2020-06-11 Robot and its moving method, equipment, circuit and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010528602.XA CN111638719A (en) 2020-06-11 2020-06-11 Robot and its moving method, equipment, circuit and medium

Publications (1)

Publication Number Publication Date
CN111638719A true CN111638719A (en) 2020-09-08

Family

ID=72328420

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010528602.XA Pending CN111638719A (en) 2020-06-11 2020-06-11 Robot and its moving method, equipment, circuit and medium

Country Status (1)

Country Link
CN (1) CN111638719A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI262777B (en) * 2004-04-21 2006-10-01 Jason Yan Robotic vacuum cleaner
CN101201280A (en) * 2006-12-11 2008-06-18 财团法人工业技术研究院 Collision detecting device, collision detecting method as well as robot and suction cleaner using the same
CN101259000A (en) * 2007-03-07 2008-09-10 得利诚健康生活科技股份有限公司 Floor cleaning device
TW201545699A (en) * 2014-06-12 2015-12-16 Uni Ring Tech Co Ltd Traveling method of autonomous cleaning device
JP2016002453A (en) * 2014-06-12 2016-01-12 聯潤科技股▲ふん▼有限公司 Travel method of self-propelled cleaning apparatus
CN104757910A (en) * 2014-11-26 2015-07-08 深圳市银星智能科技股份有限公司 Smart floor sweeping robot and control method thereof
CN206443655U (en) * 2016-09-13 2017-08-29 深圳市银星智能科技股份有限公司 The touching sensing device and robot of a kind of robot
CN109531585A (en) * 2017-09-22 2019-03-29 松下知识产权经营株式会社 Robot
CN107885213A (en) * 2017-11-22 2018-04-06 广东艾可里宁机器人智能装备有限公司 A kind of sweeping robot indoor navigation system and method
CN108283466A (en) * 2017-12-27 2018-07-17 信利光电股份有限公司 The obstacle height detection device and method and crossover device and method of a kind of sweeping robot
CN108553041A (en) * 2018-03-19 2018-09-21 珠海市微半导体有限公司 A kind of control method robot trapped judgment method and its got rid of poverty
CN208034718U (en) * 2018-03-28 2018-11-02 湖南万为智能机器人技术有限公司 A kind of collision detecting device of wheeled mobile robot

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112987725A (en) * 2021-02-07 2021-06-18 珠海市一微半导体有限公司 Obstacle-based avoidance method, chip and cleaning robot
CN113703437A (en) * 2021-04-15 2021-11-26 北京石头世纪科技股份有限公司 Robot obstacle avoidance method and device, robot, storage medium and electronic equipment
WO2022218177A1 (en) * 2021-04-15 2022-10-20 北京石头创新科技有限公司 Obstacle avoidance method and apparatus for robot, robot, storage medium, and electronic device

Similar Documents

Publication Publication Date Title
US11654574B2 (en) Cleaning robot
US8521329B2 (en) Obstruction-determining apparatus for preventing mobile robot from becoming obstructed and boundary-estimation method and medium using the obstruction-determining apparatus
US20130218342A1 (en) Control method for cleaning robots
CN111638719A (en) Robot and its moving method, equipment, circuit and medium
US20180120852A1 (en) Mobile robot and navigating method for mobile robot
US9254870B2 (en) Method of generating optimum parking path of unmanned driving vehicle, and unmanned driving vehicle adopting the method
JP6971223B2 (en) A system having an autonomous mobile robot and a base station of an autonomous mobile robot, a base station of an autonomous mobile robot, a method for an autonomous mobile robot, and an automatic docking method for an autonomous mobile robot to a base station.
JP6946459B2 (en) Robot motion control method based on map prediction
US20180141213A1 (en) Anti-collision system and anti-collision method
JP2007323402A (en) Self-propelled equipment and its program
CN102890508A (en) Self-propelled electronic device and method for controlling behavior of self-propelled electronic device
US10042366B2 (en) Control method and system for adjusting relative position of mobile household device with respect to human
US20130218343A1 (en) Control method for cleaning robots
JP2021103593A (en) Autonomous mobile device, map information processing method, and program
KR20140087486A (en) Method for generating work path of mobile robot using virtual wall layer
JP3206661U (en) Self-propelled collision prevention, collision mitigation and wall running system
US20190206211A1 (en) Moving devices and controlling methods, remote controlling systems and computer products thereof
JP4962255B2 (en) Self-propelled device
JP2008023142A (en) Self-propelled vacuum cleaner and program
CN113974507B (en) Carpet detection method and device for cleaning robot, cleaning robot and medium
KR20070087759A (en) Moving control device and method of roving robot
CN116457159A (en) Safety system and method for use in robotic operation
CN114489076A (en) Rectangular sweeping robot control method and device and rectangular sweeping robot
KR102441050B1 (en) Apparatus and method for controlling collision of vehicle
CN212932959U (en) Mobile robot and ToF detection assembly

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination