CN114052561A - Self-moving robot - Google Patents


Info

Publication number
CN114052561A
CN114052561A (application CN202010763845.1A; granted as CN114052561B)
Authority
CN
China
Prior art keywords
self
obstacle
moving robot
sensor
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010763845.1A
Other languages
Chinese (zh)
Other versions
CN114052561B (en)
Inventor
王旭宁
田宏图
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharkninja China Technology Co Ltd
Original Assignee
Sharkninja China Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharkninja China Technology Co Ltd filed Critical Sharkninja China Technology Co Ltd
Priority to CN202010763845.1A priority Critical patent/CN114052561B/en
Publication of CN114052561A publication Critical patent/CN114052561A/en
Application granted granted Critical
Publication of CN114052561B publication Critical patent/CN114052561B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02-A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • A47L11/4061 Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated

Abstract

The present disclosure provides a self-moving robot comprising a laser ranging device, an edge sensor, and a controller. The laser ranging device is configured to detect a first profile of an obstacle; the edge sensor is configured to detect a second profile of the obstacle; and the controller is configured to cause the self-moving robot to travel along the obstacle, selectively using the first profile and/or the second profile. By combining the laser ranging device with the edge sensor, the self-moving robot can detect obstacles effectively, avoiding situations in which an undetected obstacle prevents the robot from traveling along it.

Description

Self-moving robot
Technical Field
The present disclosure belongs to the technical field of artificial intelligence, and specifically provides a self-moving robot.
Background
With the improvement of living standards, intelligent sweeping robots are entering more and more households.
To clean a room automatically, existing sweeping robots are equipped with a navigation function, by virtue of which the robot can clean the room on its own. Among the navigation modes commonly used by sweeping robots, laser navigation accounts for a very large proportion.
Since wall-following cleaning is also an important part of a sweeping robot's navigation planning, most sweeping robots are additionally provided with wall-following sensors (also called edge sensors). The signals these sensors detect, from which the distance between the sensor and the wall can be determined, guide the robot in its wall-following cleaning operation.
However, during wall-following operation, the wall-following sensors cannot detect all obstacles, such as the suspended portion of a cabinet whose height above the floor is less than the maximum height of the self-moving robot. In other words, the self-moving robot cannot pass under such a suspended obstacle.
Disclosure of Invention
The present disclosure is directed to providing a self-moving robot able to detect an obstacle to be followed and to perform edgewise work along that obstacle.
The self-moving robot provided by the present disclosure includes:
a laser ranging device configured to detect a first profile of an obstacle;
an edge sensor configured to detect a second profile of the aforementioned obstacle;
a controller configured to cause the self-moving robot to travel along the obstacle, selectively using the first profile and/or the second profile.
Optionally, the controller is further configured to cause the self-moving robot to travel along the obstacle according to the second signal when the laser ranging device detects the first profile and thus generates a first signal and the edge sensor detects the second profile and thus generates a second signal.
Optionally, the controller is further configured to cause the self-moving robot to travel along, or approach, the obstacle according to the first signal when the laser ranging device detects the first profile and thus generates a first signal but the edge sensor does not detect the second profile.
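As an illustrative sketch only (the names and data types below are assumptions, not part of the disclosure), the contour-selection rule of the two preceding clauses can be expressed as: prefer the edge sensor's second profile whenever it exists, otherwise fall back to the laser ranging device's first profile.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Profile:
    """Hypothetical container for a detected obstacle contour."""
    points: list  # (x, y) samples along the obstacle surface

def select_profile(first: Optional[Profile], second: Optional[Profile]) -> Optional[Profile]:
    """Pick the contour that guides edgewise travel.

    When both the laser ranging device (first profile) and the edge sensor
    (second profile) report the obstacle, the disclosure prefers the edge
    sensor's contour; when only the laser ranging device sees the obstacle,
    its contour is used instead.
    """
    return second if second is not None else first
```

When neither device reports a contour, the function returns None and no edgewise travel is possible.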
Optionally, the controller is further configured to obtain, after determining from the first signal that the obstacle is located behind the self-moving robot, an included angle between the forward direction of the self-moving robot and a target surface on the obstacle, and to rotate the self-moving robot through the included angle toward the obstacle so that the forward direction of the self-moving robot is parallel or tangent to the target surface on the obstacle.
Optionally, the controller is further configured to cause the self-moving robot to travel along the obstacle after determining that the distance between the self-moving robot and the obstacle satisfies an edge condition according to the first signal.
Optionally, the self-moving robot further comprises a collision sensor disposed at its front side. The controller is further configured to receive a signal from the collision sensor and determine that the self-moving robot has collided with a surface of an obstacle; to cause the laser ranging device to acquire a first contour of the obstacle and derive at least part of the obstacle's geometric shape from that contour; to obtain, from the geometric shape, an included angle between the forward direction of the self-moving robot and the surface of the obstacle; and to rotate the self-moving robot through the included angle so that its forward direction is parallel or tangent to that surface of the obstacle.
Optionally, an edge sensor is provided on each of the left and right sides of the self-moving robot, and the controller is further configured to cause the self-moving robot, after it has turned through the included angle, to travel along the obstacle using the strength of the signal currently received by the edge sensor as a reference; the reference characterizes the edgewise distance between the self-moving robot and the one surface of the obstacle.
Optionally, the controller is further configured to obtain the size of the obstacle from the contour, and, when the size is greater than a size threshold and the robot has rotated through the included angle, to cause the self-moving robot to travel along the obstacle using the strength of the signal currently received by the edge sensor as a reference.
Optionally, the aforementioned edge sensor is an infrared pair edge sensor, an infrared TOF edge sensor or an infrared triangulation edge sensor.
Optionally, the laser ranging device is a laser radar arranged on the top of the self-moving robot; and/or the first profile and the second profile are profiles of the same obstacle.
As can be understood by those skilled in the art, the self-moving robot disclosed in the present disclosure has at least the following beneficial effects:
1. the self-moving robot of the present disclosure can find a first contour of an obstacle through a laser ranging device, find a second contour of the obstacle through an edge sensor, and then cause a controller to selectively use the first contour and/or the second contour to cause the self-moving robot to travel along the obstacle. In short, the self-moving robot of the present disclosure can effectively detect an obstacle through a combination of the laser ranging device and the edge sensor, and avoid a situation that the obstacle cannot be detected, thereby causing the self-moving robot to be unable to travel along a target obstacle.
2. In the self-moving robot of the present disclosure, when the profile of an obstacle is detected by both the laser ranging device and the edgewise sensor, the controller causes the robot to travel along the obstacle according to the profile detected by the edgewise sensor. In short, when both devices detect the obstacle, the robot is guided by the edge sensor, which acquires edgewise data more quickly; this ensures an adequate response frequency for the edgewise signals, allows the robot to identify the obstacle's outline accurately, and preserves working efficiency during edgewise travel.
3. In the self-moving robot of the present disclosure, when the obstacle is detected by the laser ranging device but not by the edge sensor, the controller causes the robot to travel along the obstacle according to the profile detected by the laser ranging device. Because the laser ranging device has a longer detection distance, and therefore a larger detection range, than the edgewise sensor, the self-moving robot can discover obstacles in time through the laser ranging device, avoiding the situation in which the edgewise sensor fails to detect a target obstacle and the robot consequently cannot perform edgewise operation on it.
4. When the obstacle is detected by the laser ranging device but not by the edgewise sensor, the laser ranging device can also be used to obtain the included angle between the robot's forward direction and the target surface on the obstacle, after which the robot rotates through that angle toward the obstacle. Its forward direction then becomes parallel or tangent to the target surface, enabling the robot to perform edgewise operation on that surface.
5. When the obstacle is detected by the laser ranging device but not by the edge sensor, the present disclosure can also determine whether the distance between the self-moving robot and the obstacle satisfies an edge condition, and cause the robot to travel along the obstacle once it does. In this way, the self-moving robot can detect a suspended obstacle through the laser ranging device and then perform edgewise operation on it.
6. By arranging a collision sensor at the front side of the self-moving robot, the triggering of that sensor indicates an obstacle directly ahead, and the distance between the robot (specifically, the collision point on the collision sensor) and the obstacle is known to be 0 at that moment, giving an accurate robot-to-obstacle distance. From this, the positional and distance relationships between the edge sensor and the collision point on the self-moving robot can be determined.
Further, the included angle between the robot's forward direction and the target surface on the obstacle is obtained through the laser ranging device, and the robot is rotated toward the obstacle so that its forward direction becomes parallel or tangent to the target surface, allowing edgewise operation on that surface. The edgewise sensor is thereby brought close to the target surface and can guide the self-moving robot through the edgewise operation.
Further, the accurate distance between the edge sensor and the target surface can be acquired from the angle through which the self-moving robot rotates, together with the positional and distance relationships between the edge sensor and the collision point before the turn. The edge sensor can then be calibrated against this accurate distance; specifically, the relationship between the signal acquired by the edge sensor and the edgewise distance is calibrated, where the edgewise distance is the distance between the edge sensor and the target surface.
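The calibration geometry just described might be sketched as follows, modeling the target surface as a line through the collision point (at which the robot-obstacle distance is known to be 0); the function name, coordinate frame, and the assumption that the sensor's world position after the turn is known from the robot's pose are all illustrative, not taken from the patent.

```python
import math

def edgewise_distance(sensor_xy: tuple, collision_xy: tuple, wall_deg: float) -> float:
    """Perpendicular distance from the edge sensor to the target surface,
    modeled as the line through the collision point with direction wall_deg.
    This accurate distance can then serve to calibrate the mapping between
    the edge sensor's signal strength and the edgewise distance."""
    dx = sensor_xy[0] - collision_xy[0]
    dy = sensor_xy[1] - collision_xy[1]
    # Unit normal of the wall line; project the sensor's offset onto it.
    nx, ny = -math.sin(math.radians(wall_deg)), math.cos(math.radians(wall_deg))
    return abs(dx * nx + dy * ny)
```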
Further, when the obstacle is small, the distance the self-moving robot travels along its edge is correspondingly short. Therefore, for a small obstacle, the edge sensor is not recalibrated after the robot rotates through the included angle, which reduces the computational load on the controller.
Drawings
Preferred embodiments of the present disclosure are described below, by way of example, with reference to the accompanying drawings, in which:
fig. 1 is a rear view of a sweeping robot in a preferred embodiment of the present disclosure;
fig. 2 is a schematic view of an electric control structure of the sweeping robot in the preferred embodiment of the present disclosure;
fig. 3 is a schematic view illustrating the effect of the sweeping robot when encountering a suspended obstacle in the preferred embodiment of the present disclosure;
fig. 4 is a schematic view of a sweeping robot in a first state at an outer wall corner in a preferred embodiment of the present disclosure;
fig. 5 is a schematic view of a second state of the sweeping robot at an outer wall corner in the preferred embodiment of the present disclosure;
fig. 6 is a schematic view of a third state of the sweeping robot at an outer wall corner in the preferred embodiment of the present disclosure;
fig. 7 is a schematic view of the sweeping robot in the preferred embodiment of the present disclosure in a state on the table legs;
fig. 8 is a schematic view of a sweeping robot in a first state at an inner wall corner in a preferred embodiment of the present disclosure;
fig. 9 is a schematic view of a second state of the sweeping robot at an inner wall corner in the preferred embodiment of the present disclosure.
List of reference numerals:
1. a sweeping robot; 11. a laser radar; 12. an edge sensor; 13. a controller; 14. a collision sensor; 15. a traveling wheel;
2. a wall body;
3. a table leg.
Detailed Description
It should be understood by those skilled in the art that the embodiments described below are only preferred embodiments of the present disclosure; they merely explain its technical principles and do not imply that the disclosure can be implemented only in these forms, nor do they limit its scope. All other embodiments that a person of ordinary skill in the art can derive from the preferred embodiments without undue experimentation still fall within the scope of the disclosure. For example, the self-moving robot of the present disclosure may also be any feasible device such as a mopping robot, a sweeping-and-mopping robot, or a navigation robot. Likewise, the laser ranging device is not limited to a laser radar and may be any other laser device capable of detecting an obstacle.
It should be noted that in the description of the present disclosure, the terms "center", "upper", "lower", "top", "bottom", "left", "right", "vertical", "horizontal", "inner", "outer", and the like, which indicate directions or positional relationships, are based on the directions or positional relationships shown in the drawings, which are merely for convenience of description, and do not indicate or imply that the device or element must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present disclosure. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Furthermore, it should be noted that, in the description of the present disclosure, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as being fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; either directly or indirectly through intervening media, or through the communication between two elements. The specific meaning of the above terms in the present disclosure can be understood by those skilled in the art as appropriate.
Furthermore, in the description of the present disclosure, each functional module (for example, a controller) may be a physical module composed of a plurality of structures, members, or electronic components, or may be a virtual module composed of a plurality of programs; each functional module may be a module that exists independently of each other, or may be a module that is functionally divided from an overall module. It should be understood by those skilled in the art that the technical solutions described in the present disclosure can be implemented without any change in the configuration, implementation, and positional relationship of the functional modules, which does not depart from the technical principles of the present disclosure, and therefore, the functional modules should fall within the protection scope of the present disclosure.
As shown in fig. 1 and 2, in a preferred embodiment of the present disclosure, the sweeping robot 1 includes a laser radar 11, an edgewise sensor 12, a controller 13, a collision sensor 14, and traveling wheels 15.
The laser radar 11 is arranged at the top of the sweeping robot 1 and used for scanning obstacles around the sweeping robot 1 and feeding back a detection result to the sweeping robot 1.
The edgewise sensors 12 are provided at the lateral front of the sweeping robot 1, at a height lower than that of the laser radar 11; specifically, one edgewise sensor 12 is provided at the front of the left side and one at the front of the right side of the sweeping robot 1. The edge sensors 12 detect the distance between an obstacle and the sweeping robot 1 while it travels along the edge and feed this distance back to the robot, so that it approaches or moves away from the obstacle accordingly. Preferably, the edge sensor 12 is an infrared pair edge sensor; alternatively, those skilled in the art may configure it as an infrared TOF edge sensor or an infrared triangulation edge sensor as needed.
The controller 13 is in communication connection with the laser radar 11, the edge sensor 12 and the collision sensor 14 respectively, and the controller 13 can receive signals sent by the laser radar 11, the edge sensor 12 and the collision sensor 14 and can control the working state of the sweeping robot 1 according to the received corresponding signals.
The collision sensor 14 is disposed at the front of the sweeping robot 1, and is configured to sense that the sweeping robot 1 collides with an obstacle in front of the sweeping robot, and send a signal generated by the collision to the controller 13.
The traveling wheels 15 are arranged at the bottom of the sweeping robot 1 to support it as it travels.
Since other structures of the sweeping robot 1 are well known to those skilled in the art and are not directly related to the technical solution to be protected by the present disclosure, detailed description thereof is omitted here.
The operating principle of the sweeping robot 1 in the preferred embodiment of the present disclosure will now be described in detail with reference to fig. 3 to 9. Since, in the technical solution of the present disclosure, most of the information processing behind the sweeping robot 1's work and changes of working posture is realized or controlled by the controller 13, the working principle is described below mainly in terms of the laser radar 11, the edgewise sensor 12, the controller 13, and the collision sensor 14.
In a preferred embodiment of the present disclosure, lidar 11 is configured to be able to find a first profile of an obstacle; the edge sensor 12 is configured to be able to find a second profile of the obstacle; the controller 13 is configured to enable the sweeping robot 1 to travel along the obstacle using the first profile and/or the second profile selectively. Wherein the first and second profiles can exist on the same obstacle at the same time, and there may be a portion of overlap between the first and second profiles.
For suspended obstacles, the operation is specifically as follows:
the controller 13 is configured to cause the sweeping robot 1 to travel along or close to an obstacle in accordance with the first signal when the laser radar 11 has found the obstacle and acquired the first profile of the obstacle and thus generated the first signal, and when the edge sensor 12 has not found the obstacle.
As an example, as shown in fig. 3, when the sweeping robot 1 moves to the wall 2 (or cabinet, table, etc.) with a space at the bottom, the edge sensor 12 cannot detect the wall 2 above it, and therefore cannot acquire the second contour, but the laser radar 11 can detect the part of the wall. At this time, the controller 13 can receive only the signal of the wall 2 from the laser radar 11, but cannot receive the signal of the wall 2 from the edge sensor 12. The controller 13 controls the sweeping robot 1 to walk along the wall 2 according to the first signal received from the laser radar 11.
Wherein, the first signal comprises the distance between the sweeping robot 1 and the wall body 2.
To prevent the distance between the sweeping robot 1 and the wall 2 from becoming so large that the edgewise effect is lost, the controller 13 is further configured to read the distance from the first signal and compare it with the edge condition: once the distance satisfies the edge condition, the sweeping robot 1 travels along the wall 2; while it does not, the sweeping robot 1 moves toward the wall 2 until the condition is satisfied.
The edge condition means that the distance between the sweeping robot 1 and the wall 2 is less than or equal to a preset distance. The preset distance is the distance maintained between the robot and the wall during normal edgewise operation and can be obtained by actual measurement; it may be, for example, 0.5 cm, 1 cm, or 1.2 cm.
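A minimal sketch of this decision (the function names, units, and default threshold below are illustrative assumptions, not values fixed by the patent):

```python
PRESET_EDGE_DISTANCE_CM = 1.0  # e.g. 0.5 cm, 1 cm, or 1.2 cm per the text

def edge_condition_met(distance_cm: float, preset_cm: float = PRESET_EDGE_DISTANCE_CM) -> bool:
    # The edge condition holds when the robot-to-wall distance read from the
    # lidar's first signal is at or below the preset edgewise distance.
    return distance_cm <= preset_cm

def next_action(distance_cm: float) -> str:
    # Follow the wall once the condition holds; otherwise close the gap first.
    return "follow_wall" if edge_condition_met(distance_cm) else "approach_wall"
```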
Based on the foregoing description, it can be understood by those skilled in the art that the present disclosure can provide the controller 13 with an edge signal through the laser radar 11, so that the controller 13 controls the sweeping robot 1 to perform an edge sweeping operation on the suspended obstacle.
For an obstacle with an outer corner, the following applies in particular:
the controller 13 is configured to cause the sweeping robot 1 to approach an obstacle according to the first signal when the laser radar 11 finds the obstacle and acquires the first profile of the obstacle and thus generates the first signal, and when the edge sensor 12 does not find the obstacle.
As an example, as shown in fig. 4, when the sweeping robot 1 travels along one wall surface of the wall body 2 to an outer corner of the wall body, the edge sensor 12 cannot detect the signal of the wall body 2, and therefore, the sweeping robot 1 cannot perform subsequent edge operation by only relying on the edge sensor 12. The next surface of the wall 2 can now be detected by the lidar 11. That is, the controller 13 can control the sweeping robot 1 to continue walking along the next plane of the wall body 2 according to the first signal received from the laser radar 11.
Wherein the first signal comprises information of the orientation of the wall 2.
Specifically, the controller 13 can read the direction information of the wall 2 from the first signal, and then rotate the sweeping robot 1 through a corresponding included angle according to the direction information, so as to be parallel to the next plane of the wall 2, and continue the edgewise operation.
More specifically, as the sweeping robot 1 continues to walk in fig. 4, the outer right angle of the wall 2 gradually falls behind the right side of the sweeping robot 1. At this time, the laser radar 11 detects that the outer right angle of the wall 2 is located at the right rear side, and detects that the next plane of the wall 2 is also located at the right rear side of the sweeping robot 1. At this time, the controller 13 can obtain the included angle between the advancing direction of the sweeping robot 1 and the next plane of the wall body 2, and then rotate the sweeping robot 1 toward the direction close to the next plane of the wall body 2 (as shown in fig. 4 to 6), so that the advancing direction of the sweeping robot 1 is parallel to the next plane of the wall body 2, and further, the sweeping robot 1 continues to advance along the plane (as shown in fig. 6).
Based on the foregoing description, those skilled in the art will understand that the present disclosure can obtain, through the laser radar 11, the included angle between the advancing direction of the sweeping robot 1 and the target surface on the wall 2, and then have the controller 13 rotate the robot through the corresponding angle so that it runs parallel to that surface. The sweeping robot 1 of the present disclosure can therefore perform edgewise operation more reliably on a wall 2 with an outer right angle, avoiding incomplete edgewise cleaning of the wall 2 and the missed spots that would result.
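One way such a turn could be computed, assuming the next wall plane is reduced to two points sampled from the lidar's first contour (a simplification; the patent does not specify the math):

```python
import math

def turn_angle_to_wall(heading_deg: float, wall_p1: tuple, wall_p2: tuple) -> float:
    """Signed turn (degrees, positive = counterclockwise) that makes the
    robot's forward direction parallel to the wall line through wall_p1
    and wall_p2."""
    wall_deg = math.degrees(math.atan2(wall_p2[1] - wall_p1[1],
                                       wall_p2[0] - wall_p1[0]))
    # Wrap the difference into [-180, 180) so the robot takes the shorter turn.
    return (wall_deg - heading_deg + 180.0) % 360.0 - 180.0
```

For example, a robot heading north (90 degrees) beside a wall running along the x-axis would turn minus 90 degrees to run parallel to it.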
For a cylindrical table leg, the operation is specifically as follows:
the controller 13 is likewise configured to cause the sweeping robot 1 to approach an obstacle in accordance with the first signal when the laser radar 11 has found the obstacle and acquired the first profile of the obstacle and thus generated the first signal, and when the edgewise sensor 12 has not found the obstacle.
As an example, as shown in fig. 7, when the sweeping robot 1 walks to a table leg 3, the diameter of the table leg 3 is so small that the edge sensor 12 cannot detect the profile of its entire circumference; the sweeping robot 1 therefore cannot perform the subsequent edgewise operation by relying on the edge sensor 12 alone. The laser radar 11, however, can detect the contour of the table leg 3 at all times. That is, the controller 13 can control the sweeping robot 1 to walk around the table leg 3 according to the first signal received from the laser radar 11.
Wherein the first signal comprises information of the orientation of the table leg 3.
Specifically, the controller 13 can read out the orientation information of the table legs 3 from the first signal, and then rotate the sweeping robot 1 through a corresponding included angle according to the orientation information, so as to be tangent to the circumferential surfaces of the table legs 3, and continue the edgewise sweeping operation.
More specifically, as the sweeping robot 1 continues to walk in fig. 7, the table legs 3 gradually fall behind the right side of the sweeping robot 1. At this time, the laser radar 11 detects that the table leg 3 is located at the rear of the right side of the sweeping robot 1, the controller 13 can accordingly obtain an included angle between the advancing direction of the sweeping robot 1 and a connecting line between the table leg 3 and the laser radar 11, and then the sweeping robot 1 rotates the included angle towards a direction close to the table leg 3, so that the advancing direction of the sweeping robot 1 is tangent to the circumferential surface of the table leg 3, and the sweeping robot 1 can perform edge operation around the table leg 3.
Based on the foregoing description, those skilled in the art will understand that the present disclosure can obtain, through the laser radar 11, the included angle between the advancing direction of the sweeping robot 1 and the line connecting the table leg 3 and the laser radar 11, and then have the controller 13 rotate the robot through the corresponding angle so that its advancing direction is tangent to the circumferential surface of the table leg 3. The sweeping robot 1 of the present disclosure can therefore walk around a cylindrical table leg 3, making edgewise operation on it more reliable and ensuring the cleaning effect on the floor around the table legs 3.
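Under the same caveat, a hedged sketch of the tangent condition: the forward direction is tangent to the leg's circumference exactly when it is perpendicular to the robot-to-leg line reported by the lidar (the function name, frame conventions, and side parameter are assumptions).

```python
import math

def tangent_turn_angle(heading_deg: float, robot_xy: tuple, leg_xy: tuple,
                       keep_leg_on_right: bool = True) -> float:
    """Signed turn (degrees) after which the robot's heading is tangent to
    the table leg's circumference, i.e. perpendicular to the line from the
    robot to the leg's axis."""
    bearing = math.degrees(math.atan2(leg_xy[1] - robot_xy[1],
                                      leg_xy[0] - robot_xy[0]))
    # Offsetting the heading 90 degrees from the bearing makes it tangent;
    # the sign decides which side of the robot the leg stays on.
    target = bearing + (90.0 if keep_leg_on_right else -90.0)
    return (target - heading_deg + 180.0) % 360.0 - 180.0
```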
For an obstacle with an inner corner, the arrangement is as follows:
the controller 13 is configured to receive the signal triggered by the collision sensor 14 and determine from it that the sweeping robot has collided with one surface of an obstacle; to enable the laser radar 11 to acquire a first profile of the obstacle and obtain at least part of the geometric shape of the obstacle from that profile; to obtain, from the geometric shape, the included angle between the advancing direction of the sweeping robot 1 and the surface of the obstacle; and then to rotate the sweeping robot 1 through the corresponding included angle, so that the advancing direction of the sweeping robot 1 is parallel or tangent to the surface of the obstacle.
As an example, as shown in fig. 8, when the sweeping robot 1 travels along one wall surface of the wall body 2 to an inner corner of the wall body, the collision sensor 14 strikes the wall in front of the sweeping robot 1 and is thereby triggered. The laser radar 11 is then enabled to obtain a first contour of the wall body 2 and send the corresponding first signal to the controller 13; the controller 13 obtains the geometric shape of the inner corner from the first contour (whether the corner is a right angle, an obtuse angle or an acute angle) and then obtains, from that shape, the included angle between the advancing direction of the sweeping robot 1 and the front wall surface (i.e. the included angle between the two adjacent walls at the inner corner). The sweeping robot 1 is then rotated through this angle from the attitude shown in fig. 8 to the attitude shown in fig. 9, so that the edgewise sensor 12 moves from a position where it can detect the wall running in the up-down direction of fig. 9 to a position where it can detect the wall running in the left-right direction of fig. 9, and the edgewise sensor 12 continues to guide the sweeping robot 1 in the edgewise work.
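The turning step at the inner corner can be sketched as follows. This is an illustrative assumption rather than the disclosed implementation: it presumes the two wall directions have already been fitted from the first contour (e.g. by a least-squares line fit, which the patent does not prescribe), and returns pi minus the interior angle, i.e. 90 degrees for a square corner, less for an obtuse one, more for an acute one.

```python
import math

def corner_turn_angle(followed_wall_dir, front_wall_dir):
    """Turn (radians) needed to continue edgewise travel along the front
    wall after bumping it at an inner corner.

    Both arguments are (x, y) direction vectors of the two walls, assumed
    to have been fitted beforehand from the lidar contour and oriented
    along the direction of travel.  The result is pi minus the interior
    angle between the walls.
    """
    ax, ay = followed_wall_dir
    bx, by = front_wall_dir
    cos_interior = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    # Clamp against rounding error before taking the arc cosine.
    interior = math.acos(max(-1.0, min(1.0, cos_interior)))
    return math.pi - interior
```

For perpendicular walls the function returns a 90-degree turn; for a 120-degree obtuse corner it returns 60 degrees, matching the behaviour described above.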
As can be understood by those skilled in the art, since the distance between the sweeping robot 1 and the front wall is zero when the collision sensor 14 is triggered, the exact distance between the front end of the sweeping robot 1 and the front wall is known. Since the relative position of the edgewise sensor 12 with respect to the front end of the sweeping robot 1 is fixed, the exact distance between the edgewise sensor 12 and the wall running in the left-right direction of fig. 9 can be obtained after the sweeping robot 1 rotates through the aforementioned included angle. The edgewise signal detected by the edgewise sensor 12 can then be calibrated with this exact distance as a reference; specifically, the relationship between the signal acquired by the edgewise sensor 12 and the edgewise distance is calibrated, where the edgewise distance is the distance between the edgewise sensor 12 and the wall running in the left-right direction of fig. 9. The accuracy of the edgewise sensor 12 is thereby improved.
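The one-point recalibration described above can be sketched as follows. The patent does not specify the sensor's signal model, so a linear signal-to-distance model is assumed here purely for illustration: when the bumper triggers at an inner corner, the side-sensor-to-wall distance is known exactly from the robot's geometry, and the model offset is corrected on the spot.

```python
class EdgeSensorCalibration:
    """One-point recalibration of the edgewise sensor (illustrative
    sketch; a linear signal-to-distance model is an assumption)."""

    def __init__(self, gain: float, offset: float = 0.0):
        self.gain = gain        # assumed metres per raw signal unit
        self.offset = offset    # metres

    def distance(self, signal: float) -> float:
        """Edgewise distance estimated from the raw sensor signal."""
        return self.gain * signal + self.offset

    def recalibrate(self, signal: float, true_distance: float) -> None:
        """Shift the offset so the model reproduces the exactly known
        distance obtained after the collision and the turn."""
        self.offset = true_distance - self.gain * signal
```

After `recalibrate` is called with the raw signal read at the known 0.05 m wall distance, subsequent `distance` readings are reported relative to that reference.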
Based on the foregoing description, it can be understood by those skilled in the art that the present disclosure enables the sweeping robot 1 to continue the edgewise sweeping operation by turning at the inner corner of the wall body 2 through the laser radar 11 and the collision sensor 14, and also enables the edgewise sensor 12 to be calibrated in the process.
It will be appreciated by those skilled in the art that if the sweeping robot 1 hits a post of small diameter (e.g. the table leg 3 shown in fig. 7), the edge sensor 12 detects the post only briefly, the distance traveled by the sweeping robot 1 in the process is short, and the variation in the distance between the sweeping robot 1 and the post is small (even negligible), so calibrating the edge sensor 12 is unnecessary.
On this basis, the controller 13 is further configured to obtain, after the collision sensor 14 is triggered, the size of the obstacle from the profile acquired by the laser radar 11, and, when that size exceeds the size threshold and after the sweeping robot 1 has rotated through the corresponding included angle, to make the sweeping robot 1 travel along the obstacle using the intensity of the signal currently received by the edge sensor 12 as a reference.
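This size gate can be sketched as follows; both the size measure (largest pairwise distance between contour points) and the 0.15 m threshold are illustrative assumptions, not values from the disclosure.

```python
import math

def obstacle_size(contour):
    """Largest pairwise distance between contour points -- a simple proxy
    for obstacle size (hypothetical measure; the patent does not specify
    how the size is computed from the profile)."""
    return max(math.dist(p, q)
               for i, p in enumerate(contour)
               for q in contour[i + 1:])

def should_calibrate_and_follow(contour, size_threshold: float = 0.15) -> bool:
    """True when the obstacle is large enough that edge-sensor-based
    following (and calibration) is worthwhile; the threshold value is an
    illustrative assumption."""
    return obstacle_size(contour) > size_threshold
```

A wall-sized contour passes the gate, while a thin table-leg contour a few centimetres across does not, matching the distinction drawn in the two preceding paragraphs.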
So far, the technical solutions of the present disclosure have been described in connection with the foregoing embodiments, but it is easily understood by those skilled in the art that the scope of the present disclosure is not limited to only these specific embodiments. The technical solutions in the above embodiments can be split and combined, and equivalent changes or substitutions can be made on related technical features by those skilled in the art without departing from the technical principles of the present disclosure, and any changes, equivalents, improvements, and the like made within the technical concept and/or technical principles of the present disclosure will fall within the protection scope of the present disclosure.

Claims (10)

1. A self-moving robot, characterized by comprising:
a laser ranging device configured to enable finding a first profile of an obstacle;
an edge sensor configured to be able to find a second contour of the obstacle;
a controller configured to enable the self-moving robot to travel along the obstacle selectively using the first profile and/or the second profile.
2. The self-moving robot of claim 1, wherein the controller is further configured to cause the self-moving robot to travel along the obstacle according to the first signal when the laser ranging device finds the first profile and thereby generates the first signal and when the edgewise sensor finds the second profile and thereby generates a second signal.
3. The self-moving robot of claim 1, wherein the controller is further configured to cause the self-moving robot to travel along the obstacle or approach it according to the first signal when the laser ranging device finds the first profile and thereby generates the first signal and when the edgewise sensor does not find the second profile.
4. The self-moving robot according to claim 3, wherein the controller is further configured to obtain an included angle between a forward direction of the self-moving robot and a target surface on the obstacle after determining from the first signal that the obstacle is located laterally behind the self-moving robot; and to further cause the self-moving robot to rotate through the included angle toward the obstacle, so that the forward direction of the self-moving robot is parallel or tangent to the target surface on the obstacle.
5. The self-moving robot according to claim 3, wherein the controller is further configured to cause the self-moving robot to travel along the obstacle after determining from the first signal that the distance between the self-moving robot and the obstacle satisfies an edge condition.
6. The self-moving robot according to claim 1, further comprising a collision sensor provided at a front side of the self-moving robot;
the controller is further configured to receive a signal from the collision sensor that the collision sensor is triggered, and to determine that the self-moving robot has collided with a surface of an obstacle; enabling the laser ranging device to acquire a first contour of the obstacle, and further acquiring at least part of the geometric shape of the obstacle according to the first contour; further acquiring an included angle between the advancing direction of the self-moving robot and the surface of the obstacle according to the geometric shape; and then the self-moving robot is rotated by the included angle, so that the advancing direction of the self-moving robot is parallel to or tangent to the surface of the obstacle.
7. The self-moving robot according to claim 6, wherein the left and right sides of the self-moving robot are provided with the edge sensors, respectively,
the controller is further configured to cause the self-moving robot to travel along the obstacle with reference to the intensity of the signal currently received by the edge sensor after the self-moving robot has rotated through the included angle;
wherein the reference is used to characterize an edgewise distance between the self-moving robot and the one surface of the obstacle.
8. The self-moving robot of claim 7, wherein the controller is further configured to be able to obtain a size of the obstacle from the contour; and when the size is larger than a size threshold value and after the self-moving robot rotates by the included angle, enabling the self-moving robot to travel along the obstacle by taking the intensity of the signal currently received by the edge sensor as a reference.
9. A self-moving robot as claimed in any of claims 1 to 8, wherein the edgewise sensor is an infrared pair tube edgewise sensor, an infrared TOF edgewise sensor or an infrared triangulation edgewise sensor.
10. The self-moving robot according to any one of claims 1 to 8, wherein the laser ranging device is a laser radar provided on a top of the self-moving robot; and/or
the first and second profiles are profiles of the same obstacle.
CN202010763845.1A 2020-08-01 2020-08-01 Self-moving robot Active CN114052561B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010763845.1A CN114052561B (en) 2020-08-01 2020-08-01 Self-moving robot


Publications (2)

Publication Number Publication Date
CN114052561A true CN114052561A (en) 2022-02-18
CN114052561B CN114052561B (en) 2023-08-04

Family

ID=80231367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010763845.1A Active CN114052561B (en) 2020-08-01 2020-08-01 Self-moving robot

Country Status (1)

Country Link
CN (1) CN114052561B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170156560A1 (en) * 2014-07-01 2017-06-08 Samsung Electronics Co., Ltd. Cleaning robot and controlling method thereof
CN207082015U (en) * 2017-08-01 2018-03-09 深圳市银星智能科技股份有限公司 Mobile robot
CN108181904A (en) * 2017-12-29 2018-06-19 深圳市艾特智能科技有限公司 Obstacle Avoidance, system, readable storage medium storing program for executing and robot
CN108803588A (en) * 2017-04-28 2018-11-13 深圳乐动机器人有限公司 The control system of robot
CN109528101A (en) * 2019-01-04 2019-03-29 云鲸智能科技(东莞)有限公司 Turning method, mobile robot and the storage medium of mobile robot
CN110141160A (en) * 2019-05-29 2019-08-20 深圳市银星智能科技股份有限公司 Method for cleaning along wall surface by cleaning robot and cleaning robot
CN110955246A (en) * 2019-12-12 2020-04-03 深圳乐动机器人有限公司 Cleaning robot


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114983293A (en) * 2022-06-30 2022-09-02 深圳银星智能集团股份有限公司 Self-moving robot
WO2024001636A1 (en) * 2022-06-30 2024-01-04 深圳银星智能集团股份有限公司 Self-moving robot
CN115251765A (en) * 2022-07-04 2022-11-01 麦岩智能科技(北京)有限公司 Cleaning robot edge sweeping control method based on multiple sensors
CN116026315A (en) * 2023-03-22 2023-04-28 南京信息工程大学 Ventilating duct scene modeling and robot positioning method based on multi-sensor fusion



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant