CN114052561B - Self-moving robot - Google Patents

Self-moving robot

Info

Publication number: CN114052561B
Application number: CN202010763845.1A
Authority: CN (China)
Inventors: 王旭宁, 田宏图
Assignee (original and current): Sharkninja China Technology Co Ltd
Other versions: CN114052561A (Chinese, zh)
Application filed by Sharkninja China Technology Co Ltd; priority to CN202010763845.1A
Published as CN114052561A; application granted and published as CN114052561B
Legal status: Active (the legal status is an assumption and is not a legal conclusion)


Classifications

    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/4002: Installations of electric equipment
    • A47L11/4011: Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • A47L11/4061: Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated

Abstract

The present disclosure provides a self-moving robot that includes a laser ranging device, an edge sensor, and a controller. The laser ranging device is configured to find a first profile of an obstacle; the edge sensor is configured to find a second profile of the obstacle; and the controller is configured to selectively use the first profile and/or the second profile to cause the self-moving robot to travel along the obstacle. By combining the laser ranging device with the edge sensor, the self-moving robot can detect obstacles effectively, avoiding the situation in which an obstacle goes undetected and the robot consequently cannot travel along the target obstacle.

Description

Self-moving robot
Technical Field
The disclosure belongs to the technical field of artificial intelligence, and particularly provides a self-moving robot.
Background
With the improvement of living standards, intelligent sweeping robots are entering more and more households.
To enable a sweeping robot to clean rooms automatically, existing sweeping robots have navigation functions, by means of which the robot can clean rooms on its own. Among the navigation modes commonly used by sweeping robots, laser navigation accounts for a very large proportion.
Wall-following cleaning is also an important component of a sweeping robot's navigation planning. To realize it, most sweeping robots are further provided with wall-following sensors (also called edge sensors), so that the robot can be guided in its wall-following cleaning operation by the signals these sensors detect (the distance between the sensor and the wall can be determined from the signals).
However, while the robot works along a wall, the wall-following sensor cannot detect every obstacle, such as the body of a suspended cabinet. Here the suspended obstacle hangs at a height lower than the maximum height of the self-moving robot; in other words, the self-moving robot cannot pass under the overhanging obstacle.
Disclosure of Invention
The present disclosure aims to provide a self-moving robot that can detect an obstacle to be followed and perform an edge-following operation along it.
The present disclosure provides a self-moving robot including:
a laser ranging device configured to be able to find a first profile of an obstacle;
an edge sensor configured to be able to find a second profile of the aforementioned obstacle;
a controller configured to selectively use the first profile and/or the second profile to cause the self-moving robot to travel along the obstacle.
Optionally, the controller is further configured to cause the self-moving robot to travel along the obstacle according to the second signal when the laser ranging device finds the first profile and thus generates a first signal and the edge sensor finds the second profile and thus generates a second signal.
Optionally, the controller is further configured to cause the self-moving robot to travel along or approach the obstacle according to the first signal when the laser ranging device finds the first profile and thus generates the first signal and the edge sensor does not find the second profile.
Optionally, the controller is further configured to obtain an angle between a forward direction of the self-moving robot and a target surface on the obstacle after determining that the obstacle is located laterally rearward of the self-moving robot based on the first signal; and rotating the self-moving robot in a direction approaching the obstacle by the included angle so that the advancing direction of the self-moving robot is parallel or tangential to the target surface on the obstacle.
Optionally, the controller is further configured to cause the self-moving robot to travel along the obstacle after determining, based on the first signal, that a distance between the self-moving robot and the obstacle satisfies an edge condition.
Optionally, the self-moving robot further includes a collision sensor disposed at a front side of the self-moving robot; the controller is further configured to receive a signal that the collision sensor is triggered and thereby determine that the self-moving robot has collided with a surface of an obstacle; further, the laser ranging device acquires a first contour of the obstacle, and further acquires the geometric shape of at least part of the obstacle according to the first contour; acquiring an included angle between the advancing direction of the self-moving robot and the one surface of the obstacle according to the geometric shape; and further, the self-moving robot is rotated by the included angle so that the advancing direction of the self-moving robot is parallel or tangential to the one surface of the obstacle.
Optionally, the left side and the right side of the self-moving robot are each provided with one of the edge sensors, and the controller is further configured to cause the self-moving robot, after it rotates by the included angle, to travel along the obstacle with the intensity of the signal currently received by the edge sensor as a reference; wherein the reference characterizes an edge distance between the self-moving robot and the one surface of the obstacle.
Optionally, the controller is further configured to obtain a size of the obstacle according to the profile; and when the size is larger than a size threshold, and after the robot rotates by the included angle, making the self-moving robot travel along the obstacle with the intensity of the signal currently received by the edge sensor as a reference.
Optionally, the aforementioned edge sensor is an infrared pair-tube edge sensor, an infrared TOF edge sensor, or an infrared triangulation edge sensor.
Optionally, the laser ranging device is a laser radar arranged on the top of the self-moving robot; and/or the first contour and the second contour are the contours of the same obstacle.
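The controller's selection logic described above (prefer the edge sensor's profile when it is available, fall back to the laser profile otherwise) can be sketched as follows. This is a minimal illustration only; the type names and the point-list representation of a profile are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical representation: a contour as a list of (x, y) points
# in the robot's frame. The disclosure does not specify a format.
Profile = List[Tuple[float, float]]

@dataclass
class EdgeFollowInput:
    first_profile: Optional[Profile]   # from the laser ranging device
    second_profile: Optional[Profile]  # from the edge sensor

def select_profile(inp: EdgeFollowInput) -> Optional[Profile]:
    """Prefer the edge sensor's second profile when both sensors see
    the obstacle (its edge data refreshes faster); fall back to the
    laser's first profile when only the lidar detects the obstacle."""
    if inp.second_profile is not None:
        return inp.second_profile
    return inp.first_profile
```

When neither sensor detects anything, no profile is returned and no edge-following is attempted.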
As can be appreciated by those skilled in the art, the self-moving robot disclosed in the present disclosure has at least the following advantages:
1. The self-moving robot of the present disclosure can find a first profile of an obstacle through the laser ranging device and a second profile of the obstacle through the edge sensor, and the controller then selectively uses the first profile and/or the second profile to cause the robot to travel along the obstacle. In short, by combining the laser ranging device with the edge sensor, the self-moving robot can detect obstacles effectively, avoiding the situation in which an obstacle goes undetected and the robot consequently cannot travel along the target obstacle.
2. In the self-moving robot of the present disclosure, when both the laser ranging device and the edge sensor detect the obstacle's profile, the controller controls the robot to travel along the obstacle according to the profile detected by the edge sensor. In short, when both devices detect the obstacle, the edge sensor, which acquires edge data faster, guides the robot's edge-following operation. This ensures the response frequency of the edge signal acquired while following the edge, allows the robot to accurately identify the obstacle's contour, and preserves working efficiency during edge-following.
3. The self-moving robot of the present disclosure travels along the obstacle according to the profile detected by the laser ranging device when the laser ranging device detects the obstacle and the edge sensor does not. Because the laser ranging device has a longer detection distance, and thus a larger detection range, than the edge sensor, the robot can find the obstacle in time through the laser ranging device, avoiding the situation in which the edge sensor's failure to detect the target obstacle prevents the robot from performing the edge-following operation on it.
4. When the laser ranging device detects the obstacle and the edge sensor does not, the present disclosure can obtain, through the laser ranging device, the included angle between the advancing direction of the self-moving robot and the target surface on the obstacle. The robot then rotates by that angle toward the obstacle so that its advancing direction is parallel or tangential to the target surface, enabling it to perform the edge-following operation on that surface.
5. When the laser ranging device detects the obstacle and the edge sensor does not, the present disclosure can also check whether the distance between the self-moving robot and the obstacle satisfies an edge condition, and once it is satisfied, cause the robot to travel along the obstacle. The self-moving robot can thus detect a suspended obstacle through the laser ranging device and then follow its edge.
6. A collision sensor is provided on the front side of the self-moving robot. When it is triggered, an obstacle is determined to exist in front of the robot, and the distance between the robot and the obstacle (specifically, the point that collided with the collision sensor) is known to be 0 at that moment, yielding an accurate robot-to-obstacle distance. From the robot's geometry, the positional and distance relationships between the edge sensor and the collision point on the robot can then be determined.
Further, the included angle between the advancing direction of the self-moving robot and the target surface on the obstacle is obtained through the laser ranging device, and the robot rotates toward the obstacle so that its advancing direction is parallel or tangential to the target surface and it can perform the edge-following operation on that surface. This also brings the edge sensor close to the target surface of the obstacle, allowing it to guide the self-moving robot in the edge-following work.
Further, according to the angle through which the self-moving robot turns, the positional relationship and the distance relationship between the edge sensor and the collision point before the self-moving robot turns, the accurate distance between the edge sensor and the target surface can be obtained. Therefore, the edge sensor can be calibrated according to the accurate distance, and specifically, the relation between the signals acquired by the edge sensor and the edge distance is calibrated. Wherein the edge distance refers to the distance between the edge sensor and the target surface.
Further, when the obstacle is small, the edge distance between it and the self-moving robot is correspondingly small. Therefore, if the obstacle's size is below the threshold, the edge sensor is not calibrated after the self-moving robot rotates by the included angle, which reduces the controller's computational load.
Drawings
Preferred embodiments of the present disclosure will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is a rear view of a sweeping robot in a preferred embodiment of the present disclosure;
FIG. 2 is a schematic view of the electric control structure of the sweeping robot in a preferred embodiment of the present disclosure;
FIG. 3 is a schematic view of the effect of the sweeping robot encountering a suspended obstacle in a preferred embodiment of the present disclosure;
FIG. 4 is a schematic view of a first state of the sweeping robot at an exterior corner in a preferred embodiment of the present disclosure;
FIG. 5 is a schematic view of a second state of the sweeping robot at an exterior corner in a preferred embodiment of the present disclosure;
FIG. 6 is a schematic view of a third state of the sweeping robot at an exterior corner in a preferred embodiment of the present disclosure;
FIG. 7 is a schematic view of the state of the sweeping robot at a table leg in a preferred embodiment of the present disclosure;
FIG. 8 is a schematic view of a first state of the sweeping robot at an interior corner in a preferred embodiment of the present disclosure;
FIG. 9 is a schematic view of a second state of the sweeping robot at an interior corner in a preferred embodiment of the present disclosure.
List of reference numerals:
1. a sweeping robot; 11. a laser radar; 12. edge sensors; 13. a controller; 14. a collision sensor; 15. a walking wheel;
2. a wall body;
3. a table leg.
Detailed Description
It should be understood by those skilled in the art that the embodiments described below are only preferred embodiments of the present disclosure and do not imply that the disclosure can be realized only through them; they merely explain the technical principles of the disclosure and do not limit its scope. Based on the preferred embodiments provided here, all other embodiments obtainable by one of ordinary skill in the art without inventive effort shall still fall within the scope of the present disclosure. For example, the self-moving robot of the present disclosure may also be any viable device such as a mopping robot, a sweeping-and-mopping robot, or a navigation robot. Likewise, the laser ranging device is not limited to the laser radar and may be any other laser device capable of detecting an obstacle.
It should be noted that, in the description of the present disclosure, terms such as "center," "upper," "lower," "top," "bottom," "left," "right," "vertical," "horizontal," "inner," "outer," and the like indicate directional or positional relationships, which are based on the directional or positional relationships shown in the drawings, are merely for convenience of description, and do not indicate or imply that the devices or elements must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present disclosure. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Furthermore, it should be noted that, in the description of the present disclosure, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly: the connection may be fixed, detachable, or integral; mechanical or electrical; direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meaning of these terms in the present disclosure may be understood by those skilled in the art according to the specific circumstances.
Further, in the description of the present disclosure, each functional module (for example, a controller) may be a physical module formed by a plurality of structures, components, or electronic devices, or may be a virtual module formed by a plurality of programs; the functional modules may be independent modules or may be functionally divided by a single integral module. It should be understood by those skilled in the art that, on the premise of being able to implement the technical solution described in the present disclosure, the structural manner, implementation manner and positional relationship of each functional module do not deviate from the technical principles of the present disclosure in any way, and therefore, all functional modules shall fall within the protection scope of the present disclosure.
As shown in fig. 1 and 2, in a preferred embodiment of the present disclosure, the sweeping robot 1 includes a laser radar 11, an edge sensor 12, a controller 13, a collision sensor 14, and a traveling wheel 15.
The laser radar 11 is provided on the top of the sweeping robot 1, scans for obstacles around the sweeping robot 1, and feeds the detection result back to the sweeping robot 1.
The edge sensor 12 is disposed at the front of each side of the sweeping robot 1, and the height of the edge sensor 12 is lower than that of the laser radar 11. Specifically, one edge sensor 12 is provided at the front of the left side and one at the front of the right side of the sweeping robot 1. The edge sensor 12 detects the distance between an obstacle and the sweeping robot 1 while the robot travels along the edge, and feeds it back so that the sweeping robot 1 can approach or move away from the obstacle according to the distance. Preferably, the edge sensor 12 is an infrared pair-tube edge sensor. Alternatively, the edge sensor 12 may be configured as an infrared TOF edge sensor or an infrared triangulation edge sensor as desired by those skilled in the art.
The controller 13 is in communication connection with the laser radar 11, the edge sensor 12 and the collision sensor 14 respectively, and the controller 13 can receive signals sent by the laser radar 11, the edge sensor 12 and the collision sensor 14 and can control the working state of the sweeping robot 1 according to the received corresponding signals.
The collision sensor 14 is provided at the front of the sweeping robot 1 to sense a collision between the sweeping robot 1 and an obstacle ahead, and transmits the signal generated by the collision to the controller 13.
The walking wheel 15 is arranged at the bottom of the sweeping robot 1 and is used for supporting the sweeping robot 1 to walk.
Since other structures of the robot 1 are well known to those skilled in the art and are not directly related to the technical solution to be protected by the present disclosure, a detailed description thereof will be omitted herein.
The working principle of the sweeping robot 1 in the preferred embodiment of the present disclosure will be described in detail with reference to fig. 3 to 9. Since most of the work of the information processing of the sweeping robot 1 and the change of the working posture are realized or controlled by the controller 13 in the technical scheme of the present disclosure, the working principle of the sweeping robot 1 will be described in detail below mainly in conjunction with the laser radar 11, the edge sensor 12, the controller 13 and the collision sensor 14.
In a preferred embodiment of the present disclosure, lidar 11 is configured to be able to find a first profile of an obstacle; the edge sensor 12 is configured to be able to find a second profile of the obstacle; the controller 13 is configured to enable the sweeping robot 1 to travel along the obstacle, selectively using the first profile and/or the second profile. Wherein the first profile and the second profile can be present on the same obstacle at the same time, and there may be a portion of overlap between the first profile and the second profile.
For a suspended obstacle, the method specifically comprises the following steps:
the controller 13 is configured to cause the sweeping robot 1 to travel along or approach an obstacle according to the first signal when the lidar 11 finds the obstacle and acquires the first profile of the obstacle and thus generates the first signal, and when the edge sensor 12 does not find the obstacle.
As an example, as shown in fig. 3, when the sweeping robot 1 moves to a wall 2 (or a cabinet, table, or the like) with a space at the bottom, the edge sensor 12 cannot detect the wall 2 above it and thus cannot acquire the second contour, while the lidar 11 can detect that portion of the wall. At this time, the controller 13 can only receive the signal of the wall 2 from the lidar 11, not from the edge sensor 12, and it controls the sweeping robot 1 to walk along the wall 2 according to the first signal received from the lidar 11.
Wherein the first signal comprises the distance between the robot 1 and the wall 2.
To avoid losing the edge-following effect because the distance between the sweeping robot 1 and the wall 2 is too large, the controller 13 is further configured to read the distance from the first signal, compare it with the edge condition, and cause the robot 1 to travel along the wall 2 once the distance satisfies the edge condition; when the distance does not satisfy the edge condition, the robot 1 is moved toward the wall 2 until it does.
The edge condition means that the distance between the sweeping robot 1 and the wall 2 is smaller than or equal to a preset distance, namely the distance between the sweeping robot 1 and the wall 2 during normal edge-following operation. This value can be obtained through actual measurement and may be, for example, 0.5 cm, 1 cm, or 1.2 cm.
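The edge condition and the approach behavior can be sketched as a simple threshold check. The function names, the 1 cm default, and the fixed approach step below are illustrative assumptions, not values fixed by the disclosure:

```python
def satisfies_edge_condition(distance_m: float, preset_m: float = 0.01) -> bool:
    """Edge condition: the robot-to-wall distance is less than or
    equal to a preset edge distance (1 cm here, an assumed default)."""
    return distance_m <= preset_m

def approach_until_edge(distance_m: float, preset_m: float = 0.01,
                        step_m: float = 0.005) -> float:
    """Move toward the wall in fixed steps until the edge condition
    is satisfied; returns the final distance."""
    while not satisfies_edge_condition(distance_m, preset_m):
        distance_m = max(preset_m, distance_m - step_m)
    return distance_m
```

In a real robot the distance would come from the first signal of the lidar on each control cycle rather than from a simulated step.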
Based on the foregoing description, those skilled in the art will appreciate that the present disclosure is capable of providing edgewise signals to controller 13 via lidar 11, thereby enabling controller 13 to control sweeping robot 1 to perform an edgewise sweeping operation on a suspended obstacle.
For an obstacle with an outer corner, the following is specific:
the controller 13 is configured to cause the sweeping robot 1 to approach an obstacle according to the first signal when the laser radar 11 finds the obstacle and acquires the first profile of the obstacle and thus generates the first signal, and when the edge sensor 12 does not find the obstacle.
As an example, as shown in fig. 4, when the robot 1 walks to the outer corner of the wall 2 along one wall surface of the wall 2, the edge sensor 12 cannot detect the signal of the wall 2, and thus the robot 1 cannot perform the subsequent edge work by means of the edge sensor 12 alone. At this time, the lidar 11 can detect the next surface of the wall 2. That is, the controller 13 can control the sweeping robot 1 to continue to walk along the next plane of the wall 2 according to the first signal received from the lidar 11.
Wherein the first signal comprises azimuth information of the wall 2.
Specifically, the controller 13 can read the azimuth information of the wall body 2 from the first signal, and then rotate the sweeping robot 1 by a corresponding included angle according to the azimuth information, so that the sweeping robot is parallel to the next plane of the wall body 2, and the edge-following operation is continued.
Further specifically, as the robot 1 continues to walk in fig. 4, the outer right angle of the wall 2 gradually falls to the rear right side of the robot 1. At this time, the lidar 11 detects that the outer right angle of the wall 2 is located at the right rear side, and simultaneously detects that the next plane of the wall 2 is also located at the right rear side of the sweeping robot 1. At this time, the controller 13 can obtain the angle between the advancing direction of the robot 1 and the next plane of the wall 2 accordingly, and then rotate the robot 1 by the angle toward the next plane of the wall 2 (as shown in fig. 4 to 6), so that the advancing direction of the robot 1 is parallel to the next plane of the wall 2, and further the robot 1 continues to travel along the plane (as shown in fig. 6).
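The rotation at the outer corner reduces to computing the signed angle between the robot's advancing direction and the direction of the next wall plane. A minimal sketch, assuming headings are expressed in degrees and the wall's direction is taken as the intended direction of travel along it (all names are illustrative):

```python
def heading_to_surface_angle(heading_deg: float, surface_dir_deg: float) -> float:
    """Signed rotation (degrees, in (-180, 180]) that turns the
    robot's heading onto the direction of travel along the surface."""
    # Normalize the difference into the (-180, 180] range.
    return (surface_dir_deg - heading_deg + 180.0) % 360.0 - 180.0
```

For example, a robot heading due north (90 degrees) that must follow a wall face running due east (0 degrees) turns by -90 degrees, i.e. clockwise toward the wall.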
Based on the foregoing description, it will be understood by those skilled in the art that the present disclosure can obtain the angle between the advancing direction of the robot 1 and the target surface on the wall 2 through the laser radar 11, and then cause the controller 13 to rotate the robot 1 by the corresponding angle so that it is parallel to the target plane. The sweeping robot 1 can therefore follow the edge of a wall 2 with an outer right angle more reliably, avoiding incomplete edge-following and missed cleaning.
For a cylindrical table leg, the following is specific:
the controller 13 is likewise configured to bring the sweeping robot 1 close to an obstacle according to the first signal when the lidar 11 finds the obstacle and acquires the first profile of the obstacle and thus generates the first signal, and when the edge sensor 12 does not find the obstacle.
As an example, as shown in fig. 7, when the sweeping robot 1 walks to the table leg 3, the small diameter of the table leg 3 means that the edge sensor 12 likewise cannot capture the contour of the leg's entire circumferential surface, so the robot cannot perform the subsequent edge work by means of the edge sensor 12 alone. In this case, the lidar 11 can always detect the contour of the table leg 3. That is, the controller 13 can control the sweeping robot 1 to walk around the table leg 3 according to the first signal received from the lidar 11.
Wherein the first signal comprises the orientation information of the table leg 3.
Specifically, the controller 13 can read the azimuth information of the table leg 3 from the first signal and then rotate the sweeping robot 1 by the corresponding angle according to that information, so that its advancing direction is tangent to the circumferential surface of the table leg 3 and the edge-following operation continues.
Further specifically, as the sweeping robot 1 continues to walk in fig. 7, the table leg 3 gradually falls to the rear right of the sweeping robot 1. At this time, the lidar 11 detects that the table leg 3 is located at the right rear of the robot, and the controller 13 can accordingly acquire the angle between the robot's advancing direction and the line connecting the table leg 3 and the lidar 11, then rotate the robot toward the table leg 3 by that angle, making its advancing direction tangential to the leg's circumferential surface so that the robot performs the edge-following operation around the table leg 3.
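Making the advancing direction tangential to the table leg's circumferential surface amounts to heading perpendicular to the robot-to-leg line. A minimal sketch under that geometric assumption, for a counter-clockwise orbit (names and coordinate conventions are illustrative):

```python
import math

def tangent_heading(robot_xy: tuple, leg_xy: tuple) -> float:
    """Heading (radians) tangent to the leg's circumference for a
    counter-clockwise orbit: perpendicular to the robot-to-leg line."""
    dx, dy = leg_xy[0] - robot_xy[0], leg_xy[1] - robot_xy[1]
    bearing = math.atan2(dy, dx)      # direction of the robot-to-leg line
    return bearing + math.pi / 2.0    # rotate 90 degrees for the tangent
```

A clockwise orbit would subtract pi/2 instead; the controller would recompute this tangent continuously as the robot circles the leg.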
Based on the foregoing description, it can be understood by those skilled in the art that the present disclosure can obtain the angle between the advancing direction of the sweeping robot 1 and the line connecting the table leg 3 and the laser radar 11 through the laser radar 11, and then cause the controller 13 to rotate the robot by the corresponding angle so that its advancing direction is tangent to the leg's circumferential surface. The sweeping robot 1 of the present disclosure can thus walk around the cylindrical table leg 3, making the edge-following of the leg more reliable and ensuring the cleaning of the floor around it.
For an obstacle with an inside corner, the following is specific:
the controller 13 is configured to be able to receive the signal triggered by the collision sensor 14 and thus determine that the sweeping robot has collided with one surface of the obstacle; further causing the lidar 11 to acquire a first contour of the obstacle, and further acquiring a geometry of at least part of the obstacle from the first contour; obtaining an included angle between the advancing direction of the robot 1 and one surface of the obstacle according to the geometric shape; and further the sweeping robot 1 is rotated by a corresponding angle so that the advancing direction of the sweeping robot 1 is parallel or tangential to one surface of the obstacle.
As an example, as shown in fig. 8, when the sweeping robot 1 travels along one wall surface of the wall body 2 to an inside corner of the wall body, the collision sensor 14 collides with the wall in front of the sweeping robot 1 and is thereby triggered. The lidar 11 then acquires a first contour of the wall body 2 and sends a first signal corresponding to the first contour to the controller 13. The controller 13 determines from the first contour the geometry of the inside corner (whether it is a right, obtuse or acute angle), and from that geometry obtains the angle between the advancing direction of the sweeping robot 1 and the surface ahead (i.e., the angle between the two adjacent walls at the inside corner). It then rotates the sweeping robot 1 through that angle, from the posture shown in fig. 8 to the posture shown in fig. 9, so that the edge sensor 12 changes from a position detecting the wall running in the up-down direction of fig. 9 to a position detecting the wall running in the left-right direction of fig. 9, and the edge sensor 12 can continue to guide the sweeping robot 1 in the edge-following work.
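The corner-angle determination described above can be sketched as follows. The two wall directions are assumed to have already been extracted from the first contour (for example by line fitting, a step the patent does not detail), and the function name is illustrative:

```python
import math

def corner_turn_angle(wall_a_dir, wall_b_dir):
    """Turn needed at an inside corner, from the two wall directions
    found in the lidar contour.

    wall_a_dir: direction vector of the wall being followed, pointing
        the way the robot was travelling.
    wall_b_dir: direction vector of the wall ahead, pointing the way
        the robot must travel next (away from the corner).

    Returns the rotation in degrees that aligns the advancing direction
    with the wall ahead; for a right-angle corner this is 90.
    """
    ax, ay = wall_a_dir
    bx, by = wall_b_dir
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    return math.degrees(math.acos(dot / norm))
```

An obtuse inside corner (walls 135 degrees apart) yields a 45-degree turn, consistent with the right/obtuse/acute distinction drawn above.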
As will be appreciated by those skilled in the art, since the distance between the sweeping robot 1 and the wall ahead is zero when the collision sensor 14 is triggered, an accurate distance between the front end of the sweeping robot 1 and that wall is known. Because the relative position of the edge sensor 12 with respect to the front end of the sweeping robot 1 is fixed, an accurate distance between the edge sensor 12 and the wall running in the left-right direction of fig. 9 can be obtained after the sweeping robot 1 rotates through the aforementioned angle. The edge signal detected by the edge sensor 12 can then be calibrated with this accurate distance as a reference; specifically, the relationship between the signal acquired by the edge sensor 12 and the edge distance is calibrated, where the edge distance refers to the distance between the edge sensor 12 and the wall running in the left-right direction of fig. 9. The accuracy of the edge sensor 12 is thereby improved.
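One way to realize the calibration step above is a one-point offset correction: the raw reading taken immediately after the turn is pinned to the exactly known wall distance. The class name, the nominal sensitivity value, and the linear response model are all assumptions for illustration; the patent does not specify the sensor's response curve:

```python
class EdgeSensorCalibration:
    """One-point calibration of the edge sensor, assuming a linear
    response.  After the corner turn the wall distance is known exactly
    (the bumper was touching and the sensor-to-bumper geometry is
    fixed), so the current raw reading can be anchored to it.
    """

    def __init__(self, mm_per_count=0.5):  # sensitivity is a stand-in value
        self.mm_per_count = mm_per_count
        self.offset_mm = 0.0

    def calibrate(self, raw, true_distance_mm):
        # Pin the current raw reading to the known edge distance.
        self.offset_mm = true_distance_mm - raw * self.mm_per_count

    def distance_mm(self, raw):
        # Convert subsequent raw readings using the learned offset.
        return raw * self.mm_per_count + self.offset_mm
```

After `calibrate()` is called at the corner, every later reading is reported relative to the accurate reference distance, which is the improvement in accuracy the paragraph describes.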
Based on the foregoing description, those skilled in the art can appreciate that, by means of the lidar 11 and the collision sensor 14, the present disclosure enables the sweeping robot 1 to turn at the inside corners of the wall body 2 and continue the edge-following operation, while also calibrating the edge sensor 12 in the process.
Those skilled in the art will appreciate that if the sweeping robot 1 collides with a column of small size (e.g., the table leg 3 shown in fig. 7), the edge sensor 12 detects the column only briefly, the distance traveled by the sweeping robot 1 during this time is short, and the distance between the sweeping robot 1 and the column varies little (even negligibly), so calibrating the edge sensor 12 is unnecessary.
On this basis, the controller 13 is further configured to acquire the size of the obstacle from the contour acquired by the lidar 11 after the collision sensor 14 is triggered; and, when the size exceeds a size threshold and after the sweeping robot 1 has rotated through the corresponding included angle, to cause the sweeping robot 1 to travel along the obstacle with the intensity of the signal currently received by the edge sensor 12 as a reference.
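The size-threshold gating in this paragraph might be sketched as below; the threshold value and the return labels are illustrative assumptions, not values from the patent:

```python
def follow_mode_after_turn(obstacle_size_mm, size_threshold_mm=100.0):
    """Decide how to follow the obstacle after the turn.

    Large obstacles (walls, cabinets) are followed using the freshly
    referenced edge-sensor signal.  Small columns pass by too quickly
    for the edge distance to change meaningfully, so calibration is
    skipped for them.
    """
    if obstacle_size_mm > size_threshold_mm:
        return "follow_with_edge_sensor_reference"
    return "skip_calibration"
```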
The technical solution of the present disclosure has thus been described with reference to the foregoing embodiments, but those skilled in the art will readily understand that the scope of protection of the present disclosure is not limited to these specific embodiments. Without departing from the technical principles of the present disclosure, those skilled in the art may split and combine the technical solutions of the above embodiments and make equivalent modifications or substitutions to the related technical features, all of which fall within the scope of the present disclosure.

Claims (9)

1. A self-moving robot, characterized in that it comprises:
a laser ranging device configured to be able to find a first profile of an obstacle;
an edge sensor configured to be able to find a second profile of the obstacle;
a controller configured to enable the self-moving robot to travel along the obstacle by selectively using the first profile and/or the second profile;
the obstacle comprises a suspended obstacle; when the laser ranging device finds the first profile and thereby generates a first signal, and the edge sensor does not find the second profile, the self-moving robot is caused to travel along or approach the obstacle according to the first signal;
the laser ranging device provides an edge-following signal to the controller, and the controller controls the self-moving robot to perform the edge-following cleaning operation on the suspended obstacle.
2. The self-moving robot of claim 1, wherein the controller is further configured to cause the self-moving robot to travel along the obstacle according to the second signal when the laser ranging device finds the first profile and thereby generates a first signal and the edge sensor finds the second profile and thereby generates a second signal.
3. The self-moving robot of claim 1, wherein the controller is further configured to obtain an included angle between the advancing direction of the self-moving robot and a target surface on the obstacle after determining from the first signal that the obstacle is located laterally behind the self-moving robot; and to rotate the self-moving robot toward the obstacle through the included angle so that the advancing direction of the self-moving robot is parallel or tangent to the target surface on the obstacle.
4. The self-moving robot of claim 1, wherein the controller is further configured to cause the self-moving robot to travel along the obstacle after determining from the first signal that the distance between the self-moving robot and the obstacle satisfies a preset condition.
5. The self-moving robot according to claim 1, further comprising a collision sensor provided at a front side of the self-moving robot;
the controller is further configured to receive a signal indicating that the collision sensor has been triggered and thereby determine that the self-moving robot has collided with one surface of an obstacle; to cause the laser ranging device to acquire a first contour of the obstacle and then acquire the geometry of at least part of the obstacle from the first contour; to obtain, from the geometry, the included angle between the advancing direction of the self-moving robot and the one surface of the obstacle; and to rotate the self-moving robot through the included angle so that the advancing direction of the self-moving robot is parallel or tangent to the one surface of the obstacle.
6. The self-moving robot according to claim 5, wherein the edge sensors are provided on left and right sides of the self-moving robot, respectively,
the controller is further configured to cause the self-moving robot to travel along the obstacle with reference to the strength of the signal currently received by the edge sensor after the self-moving robot turns through the included angle;
wherein the reference is used to characterize an edge distance between the self-moving robot and the one surface of the obstacle.
7. The self-moving robot of claim 6, wherein the controller is further configured to obtain a size of the obstacle from the profile; and further causing the self-moving robot to travel along the obstacle with the strength of the signal currently received by the edge sensor as a reference when the size is greater than a size threshold and after the self-moving robot has rotated through the included angle.
8. The self-moving robot of any one of claims 1 to 7, wherein the edge sensor is an infrared transceiver-pair edge sensor, an infrared TOF edge sensor, or an infrared triangulation-ranging edge sensor.
9. The self-moving robot according to any one of claims 1 to 7, wherein the laser ranging device is a laser radar provided on top of the self-moving robot; and/or
the first profile and the second profile are profiles of the same obstacle.
CN202010763845.1A 2020-08-01 2020-08-01 Self-moving robot Active CN114052561B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010763845.1A CN114052561B (en) 2020-08-01 2020-08-01 Self-moving robot


Publications (2)

Publication Number Publication Date
CN114052561A CN114052561A (en) 2022-02-18
CN114052561B true CN114052561B (en) 2023-08-04

Family

ID=80231367



Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN207082015U (en) * 2017-08-01 2018-03-09 深圳市银星智能科技股份有限公司 Mobile robot
CN108181904A (en) * 2017-12-29 2018-06-19 深圳市艾特智能科技有限公司 Obstacle Avoidance, system, readable storage medium storing program for executing and robot
CN108803588A (en) * 2017-04-28 2018-11-13 深圳乐动机器人有限公司 The control system of robot
CN109528101A (en) * 2019-01-04 2019-03-29 云鲸智能科技(东莞)有限公司 Turning method, mobile robot and the storage medium of mobile robot
CN110141160A (en) * 2019-05-29 2019-08-20 深圳市银星智能科技股份有限公司 Method for cleaning along wall surface by cleaning robot and cleaning robot
CN110955246A (en) * 2019-12-12 2020-04-03 深圳乐动机器人有限公司 Cleaning robot

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106572776B (en) * 2014-07-01 2020-04-24 三星电子株式会社 Cleaning robot and control method thereof




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant