CN109984678B - Cleaning robot and cleaning method thereof - Google Patents

Cleaning robot and cleaning method thereof

Info

Publication number
CN109984678B
Authority
CN
China
Prior art keywords
cleaning
cleaning robot
area
virtual area
virtual
Prior art date
Legal status
Active
Application number
CN201711481866.9A
Other languages
Chinese (zh)
Other versions
CN109984678A (en)
Inventor
许思晨
张一茗
陈震
Current Assignee
Qfeeltech Beijing Co Ltd
Original Assignee
Qfeeltech Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Qfeeltech Beijing Co Ltd filed Critical Qfeeltech Beijing Co Ltd
Priority to CN201711481866.9A
Publication of CN109984678A
Application granted
Publication of CN109984678B


Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a cleaning robot and a cleaning method of the cleaning robot, wherein the method comprises the following steps: setting a virtual area; and controlling the cleaning robot to clean the interior of the virtual area within the actual working area. By dividing the actual working area into virtual areas and cleaning the interior of each virtual area, the method achieves efficient cleaning of an unknown area without an initial map.

Description

Cleaning robot and cleaning method thereof
Technical Field
The invention belongs to the field of cleaning robots, and particularly relates to a cleaning robot and a cleaning method of the cleaning robot.
Background
In recent years, cleaning robots have developed rapidly, and the key challenge for such robots is achieving comprehensive and efficient cleaning of unknown environments. Most cleaning robots can achieve a high coverage rate, but two main problems remain. First, the random-walk mode of conventional cleaning robots often leaves large areas uncleaned while repeatedly re-cleaning areas already covered, resulting in low cleaning efficiency. Second, some newer sweepers must first build an initial map of the actual working area using ranging equipment such as a lidar rangefinder before they can clean on that map. On the one hand, this increases the cost of the sweeper (a typical lidar rangefinder costs from several hundred to several thousand yuan); on the other hand, it increases the volume, weight, and complexity of the device.
Disclosure of Invention
The present invention is directed to the above problem, and in an embodiment of the present invention, a cleaning method of a cleaning robot includes:
setting a virtual area, wherein the cleaning robot is positioned inside or on the boundary of the virtual area;
and controlling the cleaning robot to clean a virtual area inside the actual working area.
Preferably, the virtual area is set by a preset instruction for controlling the movement distance and the rotation angle of the cleaning robot, and the instruction enables a closed space surrounded by the movement track of the cleaning robot in the actual working area to form the virtual area.
Preferably, the process of controlling the cleaning robot to clean the virtual area inside the actual working area includes:
executing the command for controlling the movement distance and the rotation angle of the cleaning robot to obtain a closed space;
cleaning the interior of the enclosed space.
Preferably, while executing the commands for controlling the movement distance and rotation angle of the cleaning robot to obtain the closed space, if an obstacle is encountered, the cleaning robot is controlled to turn in a preset turn direction and travel along the edge of the obstacle. The sum of the length already travelled in the current direction before the obstacle was encountered and the projection, onto the current direction, of the track travelled along the obstacle edge is calculated in real time. When this sum exceeds the preset distance for the current direction, execution of the commands for controlling the movement distance and rotation angle of the cleaning robot is resumed.
Preferably, the virtual area is determined by a preset area and a preset proportion; wherein the preset area is set on the display by a user.
Preferably, the user sets the preset area by drawing a frame on the display; and/or
The user sets the preset area by inputting the length and the width of the virtual area or inputting the length or the width of the virtual area and the ratio of the length to the width; and/or
And setting the preset area by the user in a form of inputting a coordinate range.
Preferably, the process of controlling the cleaning robot to clean the virtual area inside the actual working area includes:
controlling the cleaning robot to move along the current direction to a boundary point on the boundary of the virtual area;
controlling the cleaning robot to travel around the virtual area along the boundary, starting from the boundary point and continuing until it returns to the boundary point, wherein the closed space enclosed by the movement track of the cleaning robot forms the virtual area; if the cleaning robot encounters an obstacle while travelling around the virtual area along the boundary, controlling the cleaning robot to travel along the edge of the obstacle within the virtual area;
cleaning the interior of the enclosed space.
Preferably, the process of cleaning the inside of the closed space includes:
s510: controlling the cleaning robot to move in a first cleaning direction inside the virtual area to meet the boundary of the virtual area or the boundary of the cleaned area;
s520: controlling the cleaning robot to turn, in a first turning direction, to a first offset direction along the boundary it has met, travel a first offset length, and then turn, in the first turning direction, to a second cleaning direction parallel and opposite to the first cleaning direction;
s530: controlling the cleaning robot to move in a second cleaning direction inside the virtual area to meet the boundary of the virtual area or the boundary of the cleaned area;
s540: controlling the cleaning robot to turn, in a second turning direction, to a second offset direction along the boundary it has met, travel a second offset length, and then turn, in the second turning direction, back to the first cleaning direction;
s550: repeating the steps S510 to S540 until the track of the cleaning robot traverses the operable area in the virtual area.
Preferably, the first offset length is the projection length of the travel track along the first offset direction onto the perpendicular of the first cleaning direction; and/or the second offset length is the projection length of the travel track along the second offset direction onto the perpendicular of the second cleaning direction.
Preferably, if the cleaning robot encounters an obstacle while travelling in the virtual area along the current cleaning direction, it is controlled to travel along the edge of the obstacle, and the projection length, onto the perpendicular of the current cleaning direction, of the track travelled along the obstacle edge is calculated in real time. When the projection length equals a third offset length, the cleaning robot is controlled to turn to the cleaning direction opposite to the current cleaning direction and continue; when the projection length equals 0, the cleaning robot is controlled to turn back to the current cleaning direction and continue.
Preferably, the cleaning method of the cleaning robot further includes: the cleaned area is recorded.
Preferably, the cleaning method of the cleaning robot further includes: dividing regions outside the virtual region in the actual working region according to the virtual region to obtain a plurality of expanded virtual regions;
and after cleaning the current virtual area, sequentially cleaning other extended virtual areas.
Preferably, the other extended virtual areas are cleaned one after another in sequence, and the next extended virtual area to be cleaned is determined according to the distance between the cleaning robot's position after finishing the current virtual area and each of the other extended virtual areas.
In another embodiment of the present invention, there is also provided a cleaning robot, including: an obstacle sensing module, a motion module, a positioning module, and a control module;
the obstacle sensing module is connected with the control module and used for sensing obstacle information;
the motion module is connected with the control module and is used for driving the cleaning robot to move under the control of the control module;
the positioning module is connected with the control module and used for positioning the cleaning robot in a virtual area;
the control module is used for executing the cleaning method of the cleaning robot in any embodiment.
Preferably, the obstacle sensing module is one or more of a collision sensor, an infrared sensor, a cliff sensor, and a TOF ranging sensor; the positioning module is one or more of a camera vision positioning module, a laser ranging positioning module, and an odometer.
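The module structure above can be sketched as follows. This is a hypothetical illustration only: the class names, the `blocked()`/`turn()`/`forward()`/`pose()` interfaces, and the one-tick control loop are all assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the module wiring: the control module reads the
# obstacle-sensing and positioning modules and drives the motion module.

class CleaningRobot:
    """Control module holding references to the other three modules."""
    def __init__(self, obstacle_sensor, motion, positioning):
        self.obstacle_sensor = obstacle_sensor   # e.g. collision/IR/TOF
        self.motion = motion                     # drives the wheels
        self.positioning = positioning           # e.g. odometer, vision

    def step(self):
        """One control-loop tick: turn away from obstacles, else advance."""
        if self.obstacle_sensor.blocked():
            self.motion.turn(90)
        else:
            self.motion.forward(0.1)
        return self.positioning.pose()

# Toy stand-ins so the sketch runs without hardware.
class StubSensor:
    def __init__(self, blocked=False):
        self._blocked = blocked
    def blocked(self):
        return self._blocked

class StubMotion:
    def __init__(self):
        self.commands = []
    def turn(self, deg):
        self.commands.append(("turn", deg))
    def forward(self, meters):
        self.commands.append(("forward", meters))

class StubPositioning:
    def pose(self):
        return (0.0, 0.0, 0.0)   # x, y, heading

robot = CleaningRobot(StubSensor(), StubMotion(), StubPositioning())
pose = robot.step()
```

With no obstacle sensed, the single tick issues one `forward` command and returns the pose reported by the positioning module.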
Without an initial map, the method achieves efficient cleaning of an unknown area by dividing virtual areas within the actual working area and controlling the cleaning robot to clean the interior of each virtual area.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a flowchart of a cleaning method of a cleaning robot according to an embodiment of the present invention;
FIGS. 2A and 2B are schematic views of an enclosed space according to an embodiment of the present invention;
fig. 3A and 3B are flowcharts illustrating a cleaning robot according to an embodiment of the present invention cleaning a virtual area inside an actual working area;
fig. 4A to 4E are cleaning circuit diagrams generated in the process of dividing the closed space by the cleaning robot;
FIG. 5 is a flow chart of the cleaning of the interior of an enclosure according to an embodiment of the present invention;
fig. 6 is a flowchart of a cleaning method of a cleaning robot according to another embodiment of the present invention;
FIG. 7 is a diagram illustrating an extended virtual area according to an embodiment of the present invention;
fig. 8 is a configuration diagram of a cleaning robot according to an embodiment of the present invention;
FIG. 9A is a schematic diagram of an actual working area according to an embodiment of the present invention;
fig. 9B to 9O are cleaning line diagrams generated during the cleaning of the actual working area shown in fig. 9A by the cleaning robot.
Detailed Description
In order to make the technical features and effects of the invention more apparent, the technical solution of the invention is further described below with reference to the accompanying drawings. The invention can also be described or implemented by other different specific examples, and any equivalent changes made by those skilled in the art within the scope of the claims fall within the scope of the invention.
In the description herein, references to the description of "an embodiment," "a particular embodiment," "such as" or "some embodiments," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The sequence of steps involved in the various embodiments is provided to schematically illustrate the practice of the invention, and the sequence of steps is not limited and can be suitably adjusted as desired.
As shown in fig. 1, fig. 1 is a flowchart of a cleaning method of a cleaning robot according to an embodiment of the present invention. The embodiment is suitable for the condition without an initial map, and realizes efficient cleaning of an unknown area by dividing the virtual area in the actual working area and controlling the cleaning robot to clean the inside of the virtual area. Specifically, the method comprises the following steps:
step S100, setting a virtual area, wherein the cleaning robot is positioned in the virtual area or on the boundary;
and step S200, controlling the cleaning robot to clean a virtual area in the actual working area.
In an embodiment of the present invention, the virtual area in step S100 is set by preset commands for controlling the movement distance and rotation angle of the cleaning robot, and the commands cause the closed space enclosed by the movement track of the cleaning robot in the actual working area to form the virtual area. The commands for controlling the movement distance and rotation angle of the cleaning robot can be, for example: 1) taking the current position of the cleaning robot as the starting position, controlling it to run a first preset length (which can be measured by an odometer on the cleaning robot) along the current direction, as from point a to point b in fig. 2A; 2) making a preset turn (the preset turn includes both a turning direction and a turning angle; in this example, a 90° counterclockwise rotation relative to the current direction of the cleaning robot); 3) running a second preset length in the current direction, as from point b to point c in fig. 2A; 4) making the preset turn (still 90° counterclockwise in this example); 5) running the first preset length in the current direction, as from point c to point d in fig. 2A; 6) making the preset turn (same as above); 7) running the second preset length in the current direction to return to the starting position, as from point d to point a in fig. 2A. As can be seen from fig. 2A, the commands cause the closed space (i.e., the rectangle abcd) enclosed by the movement track of the cleaning robot in the actual working area to constitute the virtual area. The first and second preset lengths are not limited and can be set as required. The preset turn includes both a turning direction and a turning angle; it may be a left or right turn, and the angle of each turn may be set as required, which the present invention does not limit.
When the first preset length is 2 m, the second preset length is 1 m, the preset turning direction is a left turn, and the turning angle is 90°, the closed space obtained by executing the commands for controlling the movement distance and rotation angle of the cleaning robot is as shown in fig. 2A.
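The command sequence above can be sketched in code. The `LogRobot` class and its `move`/`turn` primitives are hypothetical illustrations, not part of the patent; the sketch only shows the alternation of straight runs and preset turns that traces the rectangle abcd of fig. 2A.

```python
# Hypothetical sketch of the virtual-area command sequence: trace a closed
# rectangle a -> b -> c -> d -> a by alternating straight runs with a
# preset 90° (counterclockwise) turn after each run.

def trace_rectangle(robot, first_len=2.0, second_len=1.0, turn_deg=90):
    """Drive a closed rectangular track (lengths in meters)."""
    for length in (first_len, second_len, first_len, second_len):
        robot.move(length)    # run along the current direction
        robot.turn(turn_deg)  # preset turn: direction and angle

class LogRobot:
    """Toy robot that records commands instead of moving hardware."""
    def __init__(self):
        self.log = []
    def move(self, length):
        self.log.append(("move", length))
    def turn(self, angle):
        self.log.append(("turn", angle))

robot = LogRobot()
trace_rectangle(robot)
# Four move/turn pairs enclose the 2 m x 1 m rectangle, i.e. the virtual area.
```

After the fourth turn the robot is back at the starting position with its original heading, so the travelled track is closed.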
In practice, to account for obstacles, obstacle-handling logic is added to the commands for controlling the movement distance and rotation angle of the cleaning robot. If an obstacle is encountered, as shown in fig. 2B, the cleaning robot is controlled to turn in the preset turn direction (in this example, 90° counterclockwise) and travel along the edge of the obstacle (either hugging the edge, or keeping a constant distance, or a distance within a certain range, from it; in this example, the travel track from point c1 through point c2 to point c3). The sum of the length already travelled in the current direction (i.e., the direction from point b to point c1) before the obstacle was encountered (here, the length of the segment from b to c1) and the projection onto the current direction of the track travelled along the obstacle edge (here, the length of the segment between c2 and c3) is calculated in real time. When this sum exceeds the preset distance in the current direction (compare the length of segment bc in fig. 2A), the cleaning robot stops following the obstacle and instead resumes executing the commands for controlling its movement distance and rotation angle (in this example, commands analogous to those at point c in fig. 2A, i.e., commands controlling the cleaning robot to make the preset turn at c3 and travel from c3 to d).
Further, as shown in fig. 3A, the controlling the cleaning robot to clean the virtual area inside the actual working area in step S200 includes:
and step S210, executing the command for controlling the movement distance and the rotation angle of the cleaning robot to obtain a closed space, wherein the boundary of the closed space can be regarded as the boundary of a virtual area. The process of executing the instruction refers to the above embodiment, which is not described herein again, and the shape of the closed space set by the instruction is not limited in the present invention.
Step S220, cleaning the inside of the closed space.
In an embodiment of the invention, the virtual area in the step S100 is determined by a preset area and a preset ratio.
The preset area is set by a user on a display (which may be on the cleaning robot or on a mobile terminal). The preset area on the display corresponds to the virtual area in the actual working area, and the preset ratio can be computed from this correspondence (the preset ratio can be understood as the scale factor from the preset area on the display to the virtual area in the actual working area). A mark representing the cleaning robot may or may not be shown within the preset area on the display; in the latter case, the cleaning robot defaults to the center of the preset area, a fixed point, or a specific position (e.g., the midpoint of one border). In some embodiments, the preset area is set as follows:
the user sets the preset area in a picture frame form on the display (including selecting a preset shape on the display and performing operations such as dragging, zooming and the like on the preset shape); and/or
The user sets the preset area by inputting the length and width of the virtual area, or by inputting the length or width of the virtual area together with the length-to-width ratio; the corresponding preset ratio is then 1:1; and/or
The user sets the preset area by inputting a coordinate range, specifically relative coordinates: either displayed coordinates relative to the cleaning robot's mark on the display, or coordinates relative to the cleaning robot in the virtual area. In the former case, the relative coordinates in the virtual area with respect to the cleaning robot are obtained by multiplying the displayed relative coordinates on the display by the preset ratio.
The preset ratio can be preset by a program according to experience; alternatively, the user can input the longest and widest dimensions of the room (the actual working area) during initialization, and an algorithm computes an appropriate ratio from the dimensions of the phone display and the maximum possible size of the room. For example, with the preset ratio of the display to the actual working area set to 1:100 and a phone display measuring 12 cm × 7 cm, 1 cm on the display represents 1 m of the actual working area, so the display can represent an actual working area of up to 12 m × 7 m, and a 6 cm × 6 cm virtual area drawn on the phone represents a 6 m × 6 m range of the actual working area.
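The display-to-floor correspondence above can be sketched as a unit conversion. The function name is illustrative; the only fact taken from the text is the 1:100 example ratio, under which 1 cm drawn on the display maps to 1 m on the floor.

```python
# Sketch of the display-to-floor scaling described above.

def display_to_floor(rect_cm, ratio=100):
    """Convert a (width, height) drawn on the display, in cm, to meters
    of actual working area, given a 1:`ratio` preset scale."""
    w_cm, h_cm = rect_cm
    # 1 cm on screen covers `ratio` cm of floor; divide by 100 for meters.
    return (w_cm * ratio / 100.0, h_cm * ratio / 100.0)

# A 6 cm x 6 cm box drawn on the phone designates a 6 m x 6 m virtual area.
virtual_m = display_to_floor((6, 6))
```

The full 12 cm × 7 cm display therefore spans at most 12 m × 7 m of floor at this ratio.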
Further, as shown in fig. 3B, when the virtual area is determined by the preset area and the preset ratio, the relationship between the virtual area and the current position of the cleaning robot is preset (for example, the cleaning robot is at the center of the preset area, at a fixed point, or at a specific position such as the midpoint of one boundary). In this case, controlling the cleaning robot to clean the virtual area inside the actual working area in step S200 includes:
and step S210', controlling the cleaning robot to move to a boundary point with the boundary of the virtual area along the current direction. The virtual area boundary is a boundary around the virtual area, as shown by a dotted line frame in fig. 4A, the current direction is a current direction of the cleaning robot, as shown by an initial position arrow in fig. 4A, and the boundary point is shown by a point a in fig. 4A.
In implementation, as shown in fig. 4B, if an obstacle is encountered while moving toward the boundary of the virtual area, the cleaning robot is controlled to travel along the edge of the obstacle within the virtual area. If a new obstacle is encountered while travelling along the obstacle edge, the cleaning robot is controlled to travel along the edge of the new obstacle within the virtual area. If, while travelling along an obstacle edge, the cleaning robot is detected to reach a boundary point where the obstacle edge meets the boundary of the virtual area, step S220' is executed. If no obstacle is encountered for a certain continuous time after one was met, the obstacle is considered to have been moved, and step S210' is executed.
Step S220', controlling the cleaning robot to move around the virtual area from the boundary point (point a in fig. 4B) along the boundary until the boundary point (as shown in fig. 4C) is reached, wherein a closed space defined by the movement trajectory of the cleaning robot forms the virtual area, and the boundary of the closed space is the boundary of the virtual area;
step S230', cleaning the inside of the closed space.
In step S220', if the cleaning robot encounters an obstacle while travelling around the virtual area along the boundary (for example, a collision sensor registers a collision, or a ranging sensor measures that the distance to the obstacle ahead is below a certain threshold), the cleaning robot is controlled to travel along the edge of the obstacle within the virtual area. If no obstacle is encountered for a certain continuous time after the obstacle was first met, a dynamic obstacle is assumed (e.g., the obstacle has been moved away, as shown in fig. 4D). Because the sensing range is very short, the robot can only determine that obstacles in its immediate vicinity have moved; it cannot determine the current environment beyond a predetermined threshold from its center (e.g., 30 cm; the radius of a cleaning robot does not usually exceed 15 cm). In this case, the process returns to step S210' to search for the nearest virtual-area boundary again, with the previous boundary point used as the end position (as shown in fig. 4E).
In an embodiment of the present invention, a cleaning method in a zigzag shape is adopted to clean the inside of a closed space, as shown in fig. 5, the cleaning method includes:
s510: controlling the cleaning robot to travel inside the virtual area in the first cleaning direction to meet the boundary of the virtual area or the boundary of the cleaned area, in particular to meet the boundary of the virtual area, may be understood as the cleaning robot is about to reach the boundary of the virtual area, e.g. at a distance from the boundary of the virtual area.
In practice, the first cleaning direction may be determined according to a direction of a boundary where the current position of the cleaning robot is located, and the present invention is not particularly limited thereto. Before step S510 is executed, the cleaning robot may be further controlled to move to one end of any boundary in the virtual area, and preferably, the cleaning robot is controlled to move to the nearest end of the boundary closest to the current position of the cleaning robot.
S520: controlling the cleaning robot to turn, in a first turning direction, to a first offset direction along the boundary it has met, travel a first offset length, and then turn, in the first turning direction, to a second cleaning direction parallel and opposite to the first cleaning direction.
Preferably, the first turning direction and the second turning direction each include both a turning direction and a turning angle. The first turn may be to the left or to the right; the specific direction and angle are preset by a program or determined by the distribution characteristics of the virtual area. The first offset length is the projection length of the travel track along the first offset direction onto the perpendicular of the first cleaning direction. Preferably, the first and second offset lengths equal the main-brush width, preferably 10 cm to 15 cm, which prevents the cleaning robot from leaving uncleaned strips between adjacent passes.
S530: and controlling the cleaning robot to move in the second cleaning direction inside the virtual area to meet the boundary of the virtual area or the boundary of the cleaned area.
S540: controlling the cleaning robot to turn, in a second turning direction, to a second offset direction along the boundary it has met, travel a second offset length, and then turn, in the second turning direction, back to the first cleaning direction.
Preferably, the second turning direction is opposite to the first turning direction. The first and second offset directions may be the same or different, but their components perpendicular to the cleaning direction point the same way. The first offset length equals the second offset length; the first offset length is the projection of the travel track along the first offset direction onto the perpendicular of the first cleaning direction, and the second offset length is the projection of the travel track along the second offset direction onto the perpendicular of the second cleaning direction.
S550: repeating the steps S510 to S540 until the track of the cleaning robot traverses the operable area in the virtual area.
If the cleaning robot encounters an obstacle while travelling in the virtual area along the current cleaning direction, it is controlled to travel along the edge of the obstacle, and the projection length, onto the perpendicular of the current cleaning direction, of the track travelled along the obstacle edge is calculated in real time. When the projection length equals a third offset length, the cleaning robot is controlled to turn to the cleaning direction opposite to the current cleaning direction and continue; when the projection length equals 0, the cleaning robot is controlled to turn back to the current cleaning direction and continue. It should be noted that, in the present invention, the current cleaning direction is the first cleaning direction or the second cleaning direction; the heading of the cleaning robot while moving along an obstacle edge is not the first cleaning direction, the second cleaning direction, or the current cleaning direction.
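The zigzag pass of steps S510 to S540 can be sketched on a grid map. This is a minimal sketch under simplifying assumptions: the virtual area is an obstacle-free rectangle discretized into cells, one cell width stands in for the offset length, and the obstacle edge-following described above is omitted.

```python
# Minimal boustrophedon (zigzag) sketch of steps S510-S540 on a grid:
# sweep each row in alternating directions, offsetting by one row (the
# "offset length") each time a boundary is met.

def zigzag_cells(width, height):
    """Yield grid cells in the order a zigzag pass would visit them."""
    for row in range(height):          # one offset step per row
        cols = range(width)
        if row % 2:                    # second cleaning direction:
            cols = reversed(cols)      # parallel and opposite to the first
        for col in cols:
            yield (col, row)

path = list(zigzag_cells(3, 2))
# Visits (0,0)(1,0)(2,0), offsets, then reverses: (2,1)(1,1)(0,1) -
# every cell of the operable area exactly once.
```

Each row corresponds to one run in the current cleaning direction; the direction flip between rows is the turn pair of S520/S540.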
Those skilled in the art can use other methods to clean the interior of the closed space; the invention does not limit the specific cleaning process for the interior of the closed space.
In an embodiment of the present invention, the method further includes: and generating a map during the movement of the cleaning robot, and recording the cleaned area in the map.
In an embodiment of the present invention, as shown in fig. 6, the cleaning method of the cleaning robot further includes:
Step S300: dividing the area of the actual working area outside the virtual area according to the virtual area to obtain a plurality of extended virtual areas.
Step S400: after the current virtual area is cleaned, sequentially cleaning the other extended virtual areas.
In some embodiments, step S300 may translate the virtual area over the region inside the actual working area and outside the virtual area, so that the extended virtual areas form multiple layers: the first layer of extended virtual areas, closest to the virtual area, shares a boundary or part of a boundary with the virtual area, and each outer-layer extended virtual area shares at least part of a boundary with an inner-layer extended virtual area. The actual working area is thus completely covered by the virtual area and the multi-layer extended areas. As shown in fig. 7, the solid-line region is the actual working area, the dark dense-grid region in the middle is the virtual area, and the remaining dotted-line regions are extended virtual areas, of which the oblique-grid region is the first-layer extended virtual area.
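Under the assumption that both the actual working area and the virtual area are axis-aligned rectangles, the translation in step S300 can be sketched as follows (the function name and rectangle encoding are illustrative):

```python
def tile_extended_areas(work_bbox, virtual_bbox):
    """Translate the virtual rectangle over the working rectangle so the tiles
    share boundaries and jointly cover it; returns the extended areas.

    Rectangles are (xmin, ymin, xmax, ymax).
    """
    wx0, wy0, wx1, wy1 = work_bbox
    vx0, vy0, vx1, vy1 = virtual_bbox
    w, h = vx1 - vx0, vy1 - vy0
    # Step back so the tiling starts at or before the working area's corner.
    x0, y0 = vx0, vy0
    while x0 > wx0:
        x0 -= w
    while y0 > wy0:
        y0 -= h
    tiles = []
    y = y0
    while y < wy1:
        x = x0
        while x < wx1:
            tile = (x, y, x + w, y + h)
            if tile != virtual_bbox:   # skip the original virtual area
                tiles.append(tile)
            x += w
        y += h
    return tiles

# A 4x4 working area tiled by a 2x2 virtual area leaves three extended areas.
print(tile_extended_areas((0, 0, 4, 4), (0, 0, 2, 2)))
# -> [(2, 0, 4, 2), (0, 2, 2, 4), (2, 2, 4, 4)]
```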
In step S400, sequentially cleaning the other extended virtual areas means determining the next extended virtual area to be cleaned according to the distance between the cleaning robot's current position, after the current virtual area is cleaned, and each of the other extended virtual areas. Preferably, when the cleaning robot finishes cleaning the current virtual area (including the current extended virtual area), the extended virtual area closest to the cleaning robot's current position is set as the next extended virtual area to be cleaned.
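The nearest-area rule can be sketched as follows; a hypothetical helper assuming each extended virtual area is represented by an axis-aligned bounding rectangle:

```python
import math

def point_rect_distance(p, rect):
    """Euclidean distance from point p = (x, y) to the axis-aligned
    rectangle rect = (xmin, ymin, xmax, ymax); zero if p lies inside."""
    x, y = p
    xmin, ymin, xmax, ymax = rect
    dx = max(xmin - x, 0.0, x - xmax)
    dy = max(ymin - y, 0.0, y - ymax)
    return math.hypot(dx, dy)

def next_extended_area(robot_pos, areas):
    """areas -- dict: area id -> bounding rectangle.
    Returns the id of the extended virtual area closest to the robot."""
    return min(areas, key=lambda a: point_rect_distance(robot_pos, areas[a]))

# Mirroring the worked example: area 901 is nearest, so it is cleaned first.
print(next_extended_area((0.0, 0.0), {901: (1, 0, 3, 2),
                                      902: (5, 5, 7, 7),
                                      903: (0, 3, 2, 5)}))
# -> 901
```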
By dividing the actual working area into a plurality of virtual areas that together cover the whole actual working area, and cleaning the interior of each virtual area one by one, this embodiment achieves efficient and thorough cleaning of an unknown area.
In an embodiment of the present invention, fig. 8 is a schematic view of a cleaning robot. Without an initial map, the cleaning robot provided by this embodiment achieves efficient cleaning of an unknown area by dividing the actual working area into a plurality of virtual areas that cover the whole actual working area and sequentially cleaning the interior of each virtual area.
Specifically, the cleaning robot comprises: an obstacle sensing module 810, a motion module 820, a positioning module 830, and a control module 840. The obstacle sensing module 810 is connected to the control module 840 and is used for sensing obstacle information (such as obstacle distance information, position information, and/or shape information). The motion module 820 is connected to the control module 840 and is used for driving the cleaning robot to move under the control of the control module. The positioning module 830 is connected to the control module 840 and is used for positioning the cleaning robot in the virtual area. The control module 840 is configured to perform any of the embodiments of the cleaning method of the cleaning robot described above.
In detail, the obstacle sensing module may be one or more of a collision sensor, an infrared sensor, a cliff sensor, and a TOF ranging sensor, and the positioning module may be one or more of a camera vision positioning module, a laser ranging positioning module (LDS), or an odometer. Of course, in a specific implementation, an ultrasonic sensor may further be disposed on the cleaning robot in order to position it accurately.
To more clearly illustrate the technical solution of the present invention, a detailed description is given below with a specific embodiment, the actual working area is as shown in fig. 9A, and the cleaning process of the cleaning robot includes:
1) The virtual area is set as shown by the dotted area in fig. 9B. The virtual area may be obtained from a preset area set by the user through a display, or set by preset instructions that control the movement distance and rotation angle of the cleaning robot.
2) If the virtual area is set according to a preset area set by the user, the closed space of the virtual area is obtained as follows: the cleaning robot is controlled to move along its current direction until it reaches intersection point A on the boundary of the virtual area; the cleaning robot is then controlled to travel around the virtual area along the boundary, starting from the intersection point, until it returns to the intersection point. The moving track of the cleaning robot forms a closed space, such as the area surrounded by arrows in fig. 9B.
If the virtual area is set by the command, a closed space, such as an area surrounded by arrows in fig. 9B, is obtained by executing the command.
3) After the closed space is obtained, the interior of the virtual area is cleaned (zigzag cleaning) according to steps S510 to S540 of the previous embodiment, with the cleaning trajectory of the cleaning robot as shown by the arrow direction in fig. 9C, until the next turn would take the cleaning robot across the boundary of the virtual area or into an already-cleaned area.
Each time the cleaning robot finishes a zigzag pass, it is determined whether any uncleaned area remains in the virtual area. If so, the cleaning robot is controlled to move to the position in the uncleaned area of the virtual area closest to its current position, as shown in fig. 9D and 9F, and the uncleaned area in the virtual area is then cleaned according to steps S510 to S540 of the foregoing embodiment; the cleaning trajectories are shown in fig. 9E and 9G.
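Choosing the closest uncleaned position can be sketched as a simple nearest-point search over candidate cells of the map; a minimal illustration (representing the uncleaned region as a plain list of points is an assumption, not the patent's data structure):

```python
import math

def nearest_uncleaned(robot_pos, uncleaned_points):
    """Return the uncleaned point closest to the robot's current position.

    uncleaned_points -- non-empty iterable of (x, y) points inside the virtual area
    """
    rx, ry = robot_pos
    return min(uncleaned_points, key=lambda p: math.hypot(p[0] - rx, p[1] - ry))
```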
4) After the virtual area is cleaned, the area of the actual working area outside the virtual area is divided into a plurality of extended virtual areas; as shown in fig. 9H, it is divided into three extended virtual areas 901, 902, and 903.
According to the distance between the cleaning robot's current position, after the current virtual area is cleaned, and the other extended virtual areas, the next extended virtual area to be cleaned is determined to be 901. The cleaning robot is controlled to move to one end point of the extended virtual area 901 and then to travel one full loop around it to obtain the closed space of the extended virtual area 901, such as the area surrounded by arrows in the extended virtual area 901 in fig. 9H.
5) The interior of the extended virtual area 901 is cleaned as in step 3); the cleaning trajectories are shown in fig. 9I and 9J.
6) According to the distance between the cleaning robot's current position, after the current area is cleaned, and the other extended virtual areas, the next extended virtual area to be cleaned is determined to be 902. The cleaning robot is controlled to move to one end point of the extended virtual area 902 and then to travel one full loop around it to obtain the closed space of the extended virtual area 902, such as the area surrounded by arrows in the extended virtual area 902 in fig. 9K.
7) The interior of the extended virtual area 902 is cleaned as in step 3); the cleaning trajectory is shown in fig. 9L.
8) According to the distance between the cleaning robot's current position, after the current area is cleaned, and the other extended virtual areas, the next extended virtual area to be cleaned is determined to be 903. The cleaning robot is controlled to move to one end point of the extended virtual area 903 and then to travel one full loop around it to obtain the closed space of the extended virtual area 903, such as the area surrounded by arrows in the extended virtual area 903 in fig. 9M.
9) The interior of the extended virtual area 903 is cleaned as in step 3); the cleaning trajectories are shown in fig. 9N and 9O.
Thus, according to the technical solution of the present invention, for the unknown actual working area of fig. 9A and without an initial map, the cleaning robot achieves efficient cleaning of the unknown area by dividing the actual working area into a plurality of virtual areas and cleaning the interior of each virtual area one by one.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only for the purpose of illustrating the present invention; any person skilled in the art can modify and vary the above embodiments without departing from the spirit and scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the appended claims.

Claims (12)

1. A cleaning method of a cleaning robot, characterized by comprising:
setting a virtual area, wherein a cleaning robot is positioned in the virtual area;
executing an instruction for controlling the movement distance and the rotation angle of the cleaning robot to obtain a closed space;
cleaning the interior of the enclosed space;
wherein the process of cleaning the interior of the enclosed space comprises:
s510: controlling the cleaning robot to move in the first cleaning direction inside the virtual area until it meets one of the boundary of the virtual area and the boundary of the cleaned area, and then executing step S520;
s520: controlling the cleaning robot to turn, in a first turning direction, to a first offset direction that is the same as the direction of the encountered boundary, continue traveling for a first offset length, and then turn, in the first turning direction, to a second cleaning direction parallel and opposite to the first cleaning direction;
s530: controlling the cleaning robot to travel inside the virtual area in the second cleaning direction until it meets one of the boundary of the virtual area and the boundary of the cleaned area;
s540: controlling the cleaning robot to turn, in a second turning direction, to a second offset direction that is the same as the direction of the encountered boundary, continue traveling for a second offset length, and then turn, in the second turning direction, to the first cleaning direction;
s550: repeating steps S510 to S540 until the cleaning robot next turns to cross the boundary of the virtual area or the cleaning robot next turns to repeatedly clean the cleaned area;
s560: judging whether other uncleaned areas exist in the virtual area, if so, controlling the cleaning robot to move to the position, closest to the current position of the cleaning robot, in the uncleaned area in the virtual area, and continuing to execute the step S510 and the following steps;
if the cleaning robot encounters an obstacle while traveling in the virtual area along the current cleaning direction, the cleaning robot is controlled to travel along the edge of the obstacle, and the projection length of the edge-following track onto the perpendicular of the current cleaning direction is calculated in real time; when the projection length equals a third offset length, the cleaning robot is controlled to turn to the cleaning direction opposite to the current cleaning direction and continue traveling, and when the projection length equals 0, the cleaning robot is controlled to turn to the current cleaning direction and continue traveling.
2. The method of claim 1, wherein the virtual area is set by preset instructions for controlling the movement distance and the rotation angle of the cleaning robot, and the instructions enable the virtual area to be formed by a closed space surrounded by the movement track of the cleaning robot in the actual working area.
3. The method as claimed in claim 1, wherein, in the process of obtaining the closed space by executing the instructions for controlling the movement distance and rotation angle of the cleaning robot, after an obstacle is encountered, the cleaning robot is controlled to turn in a preset turning direction and travel along the edge of the obstacle; the sum of the projection, onto the current direction traveled before the obstacle was encountered, of the track traveled along the obstacle edge and the length already traveled in the current direction is calculated in real time, and when the calculated value is greater than the preset distance for the current direction, execution of the instructions for controlling the movement distance and rotation angle of the cleaning robot continues.
4. The method of claim 1, wherein the virtual area is determined by a predetermined area and a predetermined ratio; wherein the preset area is set on the display by a user.
5. The method of claim 4, wherein the user sets the preset region by drawing a frame on the display; and/or
The user sets the preset area by inputting the length and the width of the virtual area or inputting the length or the width of the virtual area and the ratio of the length to the width; and/or
And setting the preset area by the user in a form of inputting a coordinate range.
6. The method of claim 4, wherein executing the instructions for controlling the distance of movement and the angle of rotation of the cleaning robot to obtain the enclosed space comprises:
controlling the cleaning robot to move along its current direction to an intersection point with the boundary of the virtual area;
controlling the cleaning robot to travel around the virtual area along the boundary, starting from the intersection point, until it returns to the intersection point, the virtual area being formed by the closed space surrounded by the moving track of the cleaning robot; and if the cleaning robot encounters an obstacle while traveling around the virtual area along the boundary, controlling the cleaning robot to travel along the edge of the obstacle inside the virtual area.
7. The method of claim 1, wherein the first offset length is the projection, onto the perpendicular of the first cleaning direction, of the track traveled along the first offset direction; and/or the second offset length is the projection, onto the perpendicular of the second cleaning direction, of the track traveled along the second offset direction.
8. The method of claim 2 or 3, wherein the instruction comprises:
1) controlling the cleaning robot to run for a first preset length along the current direction by taking the current position of the cleaning robot as an initial position;
2) steering preset steering;
3) running for a second preset length along the current direction;
4) steering preset steering;
5) running for a first preset length along the current direction;
6) steering preset steering;
7) and continuing to run the second preset length along the current direction to return to the starting position.
9. The method of claim 1, further comprising: dividing regions outside the virtual region in the actual working region according to the virtual region to obtain a plurality of expanded virtual regions;
and after cleaning the current virtual area, sequentially cleaning other extended virtual areas.
10. The method according to claim 9, wherein the sequentially cleaning of the other extended virtual areas is to determine the next extended virtual area to be cleaned according to the distance between the current position of the cleaning robot after the cleaning of the current virtual area and the other extended virtual areas.
11. A cleaning robot, characterized by comprising: the obstacle detection device comprises an obstacle sensing module, a motion module, a positioning module and a control module;
the obstacle sensing module is connected with the control module and used for sensing obstacle information;
the motion module is connected with the control module and is used for driving the cleaning robot to move under the control of the control module;
the positioning module is connected with the control module and used for positioning the cleaning robot in a virtual area;
the control module is configured to perform the method of any one of claims 1 to 9.
12. The cleaning robot of claim 11, wherein the obstacle sensing module is one or more of a collision sensor, an infrared sensor, a cliff sensor, and a TOF ranging sensor; and the positioning module is one or more of a camera vision positioning module, a laser ranging positioning module, or an odometer.
CN201711481866.9A 2017-12-29 2017-12-29 Cleaning robot and cleaning method thereof Active CN109984678B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711481866.9A CN109984678B (en) 2017-12-29 2017-12-29 Cleaning robot and cleaning method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711481866.9A CN109984678B (en) 2017-12-29 2017-12-29 Cleaning robot and cleaning method thereof

Publications (2)

Publication Number Publication Date
CN109984678A CN109984678A (en) 2019-07-09
CN109984678B true CN109984678B (en) 2021-08-06

Family

ID=67109259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711481866.9A Active CN109984678B (en) 2017-12-29 2017-12-29 Cleaning robot and cleaning method thereof

Country Status (1)

Country Link
CN (1) CN109984678B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108247647B (en) 2018-01-24 2021-06-22 速感科技(北京)有限公司 Cleaning robot
KR102305206B1 (en) * 2019-07-11 2021-09-28 엘지전자 주식회사 Robot cleaner for cleaning in consideration of floor state through artificial intelligence and operating method thereof
CN110456789A (en) * 2019-07-23 2019-11-15 中国矿业大学 A kind of complete coverage path planning method of clean robot
CN110731734B (en) * 2019-08-12 2021-04-30 珠海市一微半导体有限公司 Control method and chip for planning and cleaning of intelligent robot and cleaning robot
CN110477813B (en) * 2019-08-12 2021-11-09 珠海市一微半导体有限公司 Laser type cleaning robot and control method thereof
CN112716392B (en) * 2019-10-28 2022-08-23 深圳拓邦股份有限公司 Control method of cleaning equipment and cleaning equipment
CN112799389B (en) * 2019-11-12 2022-05-13 苏州宝时得电动工具有限公司 Automatic walking area path planning method and automatic walking equipment
CN111552286B (en) * 2020-04-22 2024-05-07 深圳市优必选科技股份有限公司 Robot and movement control method and device thereof
JP7335566B2 (en) * 2020-05-19 2023-08-30 ベストスキップ株式会社 Calling system and area information setting method for calling system
CN114355871A (en) * 2020-09-30 2022-04-15 好样科技有限公司 Self-walking device and control method thereof
CN114819921B (en) * 2022-06-27 2022-11-15 深圳市信润富联数字科技有限公司 Mold ex-warehouse method, device, equipment and readable storage medium
CN115251766A (en) * 2022-07-08 2022-11-01 尚科宁家(中国)科技有限公司 Cleaning method of cleaning robot and cleaning robot
CN115033005A (en) * 2022-08-10 2022-09-09 湖南朗国视觉识别研究院有限公司 Floor sweeping method, sweeping robot and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103099583A (en) * 2011-11-14 2013-05-15 三星电子株式会社 Robot cleaner and control method thereof
CN105793790A (en) * 2013-12-19 2016-07-20 伊莱克斯公司 Prioritizing cleaning areas
CN107390698A (en) * 2017-08-31 2017-11-24 珠海市微半导体有限公司 The benefit of sweeping robot sweeps method and chip

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN102541056A (en) * 2010-12-16 2012-07-04 莱克电气股份有限公司 Obstacle processing method for robot
WO2015090398A1 (en) * 2013-12-19 2015-06-25 Aktiebolaget Electrolux Robotic vacuum cleaner with side brush moving in spiral pattern
CN106998984B (en) * 2014-12-16 2021-07-27 伊莱克斯公司 Cleaning method for a robotic cleaning device
JP6429639B2 (en) * 2015-01-21 2018-11-28 シャープ株式会社 Self-propelled electronic device
CN106527423B (en) * 2015-09-15 2020-01-21 小米科技有限责任公司 Cleaning robot and control method thereof
CN107368079B (en) * 2017-08-31 2019-09-06 珠海市一微半导体有限公司 The planing method and chip in robot cleaning path


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant