CN110801180A - Operation method and device of cleaning robot


Info

Publication number
CN110801180A
Authority
CN
China
Prior art keywords
cleaning robot
cleaning
straight
area
main direction
Prior art date
Legal status
Granted
Application number
CN201810880193.2A
Other languages
Chinese (zh)
Other versions
CN110801180B (en)
Inventor
许思晨
张一茗
陈震
Current Assignee
Quick Sense Technology (Beijing) Co Ltd
Qfeeltech Beijing Co Ltd
Original Assignee
Quick Sense Technology (Beijing) Co Ltd
Priority date
Filing date
Publication date
Application filed by Quick Sense Technology (Beijing) Co Ltd
Priority to CN201810880193.2A
Publication of CN110801180A
Application granted
Publication of CN110801180B
Legal status: Active (granted)

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Abstract

The invention discloses an operation method and device for a cleaning robot. The method comprises the following steps: acquiring an environment image of the actual environment where the cleaning robot is located; extracting straight-edge features from the acquired environment image; determining, from the direction of the longest of the straight-edge features, the straight-edge direction corresponding to that feature in the actual environment, and determining a main direction from the straight-edge direction, wherein the main direction is parallel or perpendicular to the straight-edge direction; and adjusting the current orientation of the cleaning robot to the main direction. When the cleaning robot starts cleaning or moves to a new area to be cleaned, it is automatically aligned with the main direction of the actual environment, so that uncleaned areas are not missed and the technical effect of effectively improving the cleaning coverage rate is achieved.

Description

Operation method and device of cleaning robot
Technical Field
The invention relates to the field of robots, in particular to an operation method and device of a cleaning robot.
Background
In recent years, cleaning robots have developed rapidly, and the most important goal for such robots is to clean an area to be cleaned efficiently and thoroughly. Although most cleaning robots can now achieve a high cleaning rate, missed areas often remain. Fig. 1a is a schematic diagram of missed areas within a cleaning area in the prior art. As shown in fig. 1a, when a cleaning robot is placed on the floor and starts to work, it usually cleans directly along whatever orientation it happened to be placed in. Because that orientation is random, if the initial direction is not adjusted and the robot follows the usual zigzag cleaning route, corners of the room may be missed during cleaning, such as the wedge-shaped area A (at the start of cleaning) and area B (when the illustrated closed cleaning area is cleaned) shown in fig. 1a. Fig. 1b is a schematic view of the cleaning robot covering the whole area to be cleaned; as shown in fig. 1b, it is desirable to cover the area to be cleaned to the maximum extent without producing the wedge-shaped uncleaned areas of fig. 1a (areas A and B, the upper-left and lower-right corner areas, etc.).
No effective solution to the above problems has yet been proposed.
Disclosure of Invention
Embodiments of the invention provide a cleaning robot and an operation method thereof, which can automatically adjust the orientation of the cleaning robot, when it is initially placed or changes working area, to the optimal direction for covering the area to be cleaned most completely, thereby improving the robot's coverage rate of the area to be cleaned.
According to an aspect of an embodiment of the present invention, there is provided an operating method of a cleaning robot, including: acquiring an environment image of the actual environment where the cleaning robot is located; extracting straight-edge features from the acquired environment image; determining, from the direction of the longest of the straight-edge features, the straight-edge direction corresponding to that feature in the actual environment, and determining a main direction from the straight-edge direction, wherein the main direction is parallel or perpendicular to the straight-edge direction; and adjusting the current orientation of the cleaning robot to the main direction.
Optionally, the environment image is an electronic map, and determining the main direction according to the longest of the straight-edge features includes: determining the working range of the cleaning robot in the electronic map.
Optionally, the electronic map is obtained through distance measurement information measured by a distance measurement device.
Optionally, the electronic map is acquired by means of an image acquisition device, an inertial measurement unit, and an odometer.
Optionally, the environmental image is a photograph taken by an image acquisition device of the cleaning robot, and the image acquisition device includes at least one of a depth camera, a monocular camera, and a binocular camera.
Optionally, adjusting the current orientation of the cleaning robot to the main direction comprises: determining the shooting orientation at the time the image acquisition device of the cleaning robot took the picture; determining an orientation difference between that shooting orientation and the current orientation of the cleaning robot; determining, from the shooting orientation and the orientation difference, an adjustment angle from the current orientation to the main direction; and adjusting the cleaning robot from the current orientation to the main direction according to the adjustment angle.
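By way of illustration only, a minimal sketch of how this adjustment angle might be computed follows; it is not the patent's implementation, and the heading convention (degrees in the world frame) and all names are assumptions.

```python
# Hypothetical sketch of the adjustment described above; not the patent's code.
def rotation_to_main(shoot_heading_deg: float, current_heading_deg: float,
                     edge_angle_deg: float, perpendicular: bool = False) -> float:
    """edge_angle_deg: angle of the longest straight-edge feature relative to
    the shooting orientation. The orientation difference between the shooting
    orientation and the current orientation is folded in, so the returned
    rotation is taken from the robot's current orientation."""
    main_dir = shoot_heading_deg + edge_angle_deg + (90.0 if perpendicular else 0.0)
    # Directions 180 degrees apart are equivalent; fold into [-90, 90).
    return (main_dir - current_heading_deg + 90.0) % 180.0 - 90.0
```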
Optionally, the image capturing device includes a depth camera and/or a binocular camera, and acquiring the environment image of the actual environment where the cleaning robot is located and extracting straight-edge features from it includes: taking a second photograph with the depth camera and/or binocular camera, wherein the second photograph includes 3D information of the actual environment; and determining the straight-edge features in the second photograph from the acquired 3D information.
Optionally, the method further includes: dividing a cleaning area according to the main direction; and, in the divided cleaning area, cleaning along a zigzag (boustrophedon) route with the main direction as the starting direction; or, in the divided cleaning area, travelling with the main direction as the starting direction until reaching the obstacle ahead, rotating 90 degrees, and then cleaning along a zigzag route.
According to another aspect of an embodiment of the present invention, there is provided an operating method of a cleaning robot, including: taking a picture of the actual environment where the cleaning robot is located with an image acquisition device arranged on top of the cleaning robot with its lens facing vertically upward, wherein the picture shows the lower surface of a horizontal plane above the top of the cleaning robot; extracting straight-edge features from the acquired picture; determining a main direction according to the direction of the longest of the straight-edge features, wherein the main direction is parallel or perpendicular to that direction; and adjusting the current orientation of the cleaning robot to the main direction.
According to another aspect of an embodiment of the present invention, there is provided a cleaning robot including: a power supply assembly, a control assembly, a running assembly, a cleaning assembly, and a sensor, wherein the control assembly includes a processor; the power supply assembly supplies energy to the cleaning robot; the sensor acquires an environment image of the actual environment where the cleaning robot is located; the processor runs a program which, when running, executes any one of the operating methods above; and the running assembly, based on the program run by the processor, drives the cleaning robot to align with the main direction.
In the embodiments of the invention, the main direction is determined from the longest straight-edge feature in the actual environment where the cleaning robot is located, so that when the cleaning robot starts cleaning or moves to a new area to be cleaned, it is automatically aligned with the main direction of the actual environment. This avoids missing uncleaned areas, achieves the technical effect of effectively improving the cleaning coverage rate, and solves the technical problems in the related art of high cleaning cost and low working efficiency when cleaning areas are divided according to an arbitrary orientation of the cleaning robot.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1a is a schematic view of missed areas in a cleaning area in the prior art;
FIG. 1b is a schematic view of the cleaning robot covering the area to be cleaned in its entirety;
FIG. 2 is a flow chart of a method of operating a cleaning robot in accordance with an embodiment of the present invention;
FIG. 3a is a schematic view of the angle between a straight-edge feature and the orientation of the cleaning robot in accordance with one embodiment of the present invention;
FIG. 3b is a schematic view of the angle between a straight-edge feature and the orientation of the cleaning robot in accordance with another embodiment of the present invention;
FIG. 4 is a flowchart of the region division method of the embodiment shown in FIG. 3b, in which a picture of the actual environment of the cleaning robot is taken by an image capture device arranged on top of the cleaning robot with its lens facing vertically upward;
FIG. 5 is a schematic view of an initial environment of a cleaning robot according to one embodiment of the present invention;
FIG. 6 is a flowchart of a region division method for a cleaning robot according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a cleaning robot performing TOF ranging in accordance with one embodiment of the present invention;
FIG. 8 is a schematic diagram of an electronic map according to one embodiment of the invention;
FIG. 9 is a schematic illustration of determining a primary direction on an electronic map, according to one embodiment of the invention;
FIG. 10 is a schematic illustration of region division according to a primary direction according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a cleaning robot finding a boundary of a cleaning zone in accordance with one embodiment of the present invention;
FIG. 12 is a schematic view of a cleaning robot finding the boundary of the cleaning area in which it is located according to an embodiment of the present invention;
FIG. 13 is a schematic view of a cleaning robot cleaning a partial area according to an embodiment of the present invention;
FIG. 14 is a schematic view of a cleaning robot cleaning the remaining partial area according to one embodiment of the present invention;
FIG. 15 is a schematic view of a cleaning robot dividing a cleaning region according to an arbitrary robot orientation according to an embodiment of the present invention;
FIG. 16 is a flowchart of another operation method of a cleaning robot according to an embodiment of the present invention;
FIG. 17 is a schematic structural view of a cleaning robot according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of a cleaning method for a cleaning robot, where the steps shown in the flowchart of the drawings may be implemented by combining electronic components and/or devices with corresponding functions, or implemented on hardware such as a PCB by using integrated circuit technology, or implemented in a computer system capable of implementing a set of computer-executable instructions; also, while a logical order is shown in the flow diagrams, in some cases, the steps shown or described may be performed in an order different than here.
Fig. 2 is a flowchart of an operation method of a cleaning robot according to an embodiment of the present invention, as shown in fig. 2, the method including the steps of:
step S202, obtaining an environment image of the actual environment where the cleaning robot is located;
step S204, extracting straight-edge features from the acquired environment image;
in step S206, a straight-side direction corresponding to the direction of the longest straight-side feature in the actual environment is determined according to the direction of the longest straight-side feature in the straight-side features, and a main direction is determined according to the straight-side direction, wherein the main direction is parallel to or perpendicular to the straight-side direction. Parallel or perpendicular here means approximately parallel or perpendicular within a tolerance range, for example if the tolerance range is ± 2 °, the angle between the straight edge direction and the main direction may be-2 ° to 2 ° or 88 ° to 92 °. The straight-edge feature is a recognizable feature of a certain straight edge in the actual environment in the environment image.
Step S208, the current orientation of the cleaning robot is adjusted to the main direction.
Through the above steps, by determining the main direction from the longest straight edge in the actual environment where the cleaning robot is located, the cleaning robot is automatically adjusted to the direction that can cover the area to be cleaned to the maximum extent. This avoids missing uncleaned areas and effectively improves the cleaning coverage rate; it also reduces the number of collisions with obstacles, reduces damage to furniture and walls, prolongs the service life of the cleaning robot, and improves cleaning efficiency.
The main direction is parallel or perpendicular to the straight-edge direction corresponding to the direction of the longest straight-edge feature in the environment image, which reduces missed uncleaned areas such as A and B in fig. 1a and improves coverage of the area to be cleaned. In actual operation, measurement accuracy and accumulated error mean that the main direction may not be absolutely parallel or perpendicular to the straight-edge direction, so the protection scope of the present invention includes a certain error in this parallelism or perpendicularity; for example, with an error of ±2°, the included angle between the straight-edge direction and the main direction may be -2° to 2° (straight-edge direction parallel to the main direction) or 88° to 92° (straight-edge direction perpendicular to the main direction).
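As an illustration of steps S202 to S208 only, the following minimal sketch assumes OpenCV is available and that the environment image is a 2D top view in which straight-edge directions match the actual environment; all names and thresholds are assumptions, not the patent's implementation.

```python
import math
import cv2
import numpy as np

def main_direction_deg(image_gray: np.ndarray) -> float:
    """Direction, in degrees folded into [-90, 90), of the longest
    straight-edge feature; the main direction is parallel to it
    (perpendicular being the same family rotated 90 degrees)."""
    edges = cv2.Canny(image_gray, 50, 150)                  # step S204: edge pixels
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=60, minLineLength=40, maxLineGap=5)
    if segments is None:
        raise RuntimeError("no straight-edge feature found")
    # Step S206: pick the longest detected segment.
    x1, y1, x2, y2 = max((s[0] for s in segments),
                         key=lambda s: (s[2] - s[0]) ** 2 + (s[3] - s[1]) ** 2)
    angle = math.degrees(math.atan2(y2 - y1, x2 - x1))
    return (angle + 90.0) % 180.0 - 90.0                    # parallel/perpendicular fold
```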
Before or after the cleaning robot adjusts its current orientation to the main direction, the cleaning area can be divided according to the main direction, and cleaning is then performed according to the divided cleaning areas and the adjusted main direction. The cleaning area can be divided according to the main direction in various ways, for example at a fixed angle to the main direction or according to a fixed division rule; the specific manner of dividing the cleaning area is explained in detail below. Cleaning according to the divided cleaning areas may be performed by first dividing all the cleaning areas and then cleaning them. In this embodiment, one cleaning area is divided first and cleaned; after its cleaning is completed the next cleaning area is divided, and these operations are repeated until the cleaned areas cover the whole sub-area (for example, a bedroom, a living room, and the like).
In an embodiment of the present invention, obtaining the environment image of the actual environment of the cleaning robot includes obtaining a two-dimensional overhead electronic map of that environment. The electronic map may take various forms, for example an electronic map of a cleaning area comprising only one area.
In one embodiment of the present invention, the electronic map may cover a large area, for example a whole house, including several sub-areas such as bedrooms, living rooms, bathrooms, kitchens, and studies, separated by walls, doors, and windows. In this embodiment the cleaning robot cleans sub-area by sub-area. When cleaning a sub-area, the sub-area containing the robot's current working range is first determined from the larger electronic map, and the straight-edge features are then determined in the local electronic map of that sub-area. Since objects in the electronic map have the same directions and shapes as in the actual environment and differ only by a uniform scale, the longest straight-edge feature in the electronic map corresponds to the longest straight edge in the actual environment and has the same direction as it. The longest straight-edge feature can therefore be selected from the straight-edge features of the local electronic map, the corresponding straight edge in the actual environment determined, the main direction determined from the direction of that straight edge, and the subsequent steps executed. Here a straight-edge feature is an identifiable feature in the electronic map that represents a straight edge of an obstacle or an interface between sub-areas (i.e. a wall or a door separating sub-areas).
As an alternative embodiment, the electronic map may be built from ranging information measured by a ranging device. For example, while rotating once in place or once around a predetermined point, a cleaning robot equipped with a ranging device measures its actual environment and obtains ranging information between the robot and the sides of surrounding objects facing it. From this information the relative distances between the surrounding objects and the robot, and among the objects themselves, can be obtained, and an electronic map of the actual environment is established. Surrounding objects include any object within the ranging range that can block the ranging medium (for a laser radar, for example, anything that blocks the emitted laser rays), such as indoor furniture, obstacles, walls, and doors. The ranging device is mounted on the cleaning robot and may be an infrared ranging device or a laser radar (including TOF), among others. Straight-edge features are extracted from the electronic map, the longest one is selected, the corresponding straight edge in the actual environment is determined, the main direction is determined from the direction of that straight edge, and the subsequent steps are executed. Here the straight-edge feature is an identifiable feature in the electronic map.
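A hedged sketch of this map-building step follows; the grid size, resolution, and scan interface are assumptions for illustration, not the patent's specification.

```python
import math
import numpy as np

GRID = 200   # assumed 200 x 200 cells
RES = 0.05   # assumed 5 cm per cell; the robot sits at the grid center

def build_map(scan):
    """scan: iterable of (bearing_rad, range_m) pairs measured while the
    robot rotates once in place; marks the cell hit by each return."""
    grid = np.zeros((GRID, GRID), dtype=np.uint8)
    cx = cy = GRID // 2
    for bearing, rng in scan:
        if not math.isfinite(rng):
            continue  # no return within the sensor's ranging range
        x = cx + int(round(rng * math.cos(bearing) / RES))
        y = cy + int(round(rng * math.sin(bearing) / RES))
        if 0 <= x < GRID and 0 <= y < GRID:
            grid[y, x] = 255  # occupied: the object side facing the robot
    return grid
```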
As an optional embodiment, the electronic map may be obtained by combining images from the image acquisition device with motion data from the inertial measurement unit and the odometer through a SLAM (Simultaneous Localization And Mapping) algorithm. For example, the inertial measurement unit (IMU) acquires acceleration and angular-velocity data and the odometer acquires distance data; combining these yields the position of the cleaning robot. The image acquisition device then acquires image information of each object in the actual environment (such as 2D information from a monocular or binocular camera, or 3D information from a depth camera), so the positions of the objects are obtained and the electronic map is established. The image information of the objects also allows the accumulated errors of the IMU and odometer to be eliminated, yielding a more accurate electronic map; this process is called SLAM.
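The dead-reckoning part of this pipeline can be sketched as follows; this is a simplified illustration of IMU-plus-odometer pose integration whose drift the image information would correct, with an assumed interface.

```python
import math

class PoseEstimator:
    """Simplified dead reckoning from odometer distance increments and a
    heading increment integrated from IMU angular rate; both accumulate
    drift, which the visual (SLAM) correction described above removes."""
    def __init__(self):
        self.x = self.y = self.theta = 0.0  # pose in the world frame

    def update(self, d_dist: float, d_theta: float):
        self.theta += d_theta
        self.x += d_dist * math.cos(self.theta)
        self.y += d_dist * math.sin(self.theta)
        return self.x, self.y, self.theta
```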
In each of the above embodiments where the electronic map serves as the environment image, the map contains the position and direction information of every object in the actual environment where the cleaning robot is located, so the straight-edge features of the objects, and the longest among them, can be obtained. Because the electronic map of the present invention is a 2D top-view map, the direction of its longest straight-edge feature is the straight-edge direction corresponding to the longest straight edge in the actual environment, and the main direction can be determined through the relationship "the main direction is parallel or perpendicular to the straight-edge direction"; this main direction is the same in the electronic map as in the actual environment. Since the map also records the position and current orientation of the cleaning robot (identical to its orientation in the actual environment), the included angle between the current orientation and the main direction can be determined from the map, and the current orientation is then adjusted to the main direction, which again is the same whether expressed in the actual environment or in the electronic map.
It should be noted that an electronic map constructed in the above manner can also be used for repositioning the cleaning robot, where repositioning refers to the process by which the cleaning robot re-determines its own position on an existing electronic map while cruising indoors.
As an alternative embodiment, the environment image is a photograph taken by an image capturing device of the cleaning robot, the image capturing device including at least one of a depth camera, a monocular camera, and a binocular camera. The image capturing device takes a picture containing straight-edge features of the surroundings, a straight-edge feature being the recognizable trace in the picture of some optically identifiable straight edge in the actual environment, such as the straight edge of a table or a bed, a long fluorescent lamp on the ceiling, a straight edge of a ceiling pattern, or the straight junction between the ceiling and a wall. The device may be a camera taking ordinary pictures, an infrared camera taking infrared pictures, or any other photographing device able to capture straight edges in the actual surroundings. Pictures taken by a monocular or binocular camera contain 2D pixel information, while pictures taken by a depth camera contain 3D pixel information, and different calculations are used for the different types of device.
In an embodiment of the invention, the angle between the straight-edge feature and the orientation of the cleaning robot is shown in fig. 3a. If the pose of the cleaning robot changes after the picture is taken, for example the robot rotates clockwise by some angle so that its orientation changes from the shooting orientation to a new current orientation, then: the shooting orientation at the moment the image capturing device took the picture is determined; the orientation difference between the shooting orientation and the current orientation of the cleaning robot is determined; the adjustment angle from the current orientation to the main direction is determined from the shooting orientation and the orientation difference; and the cleaning robot is adjusted from the current orientation to the main direction according to that adjustment angle (in fig. 3a, D represents the main direction, which is parallel to the longest straight-edge feature L in one case and perpendicular to it in the other).
As an alternative embodiment, a picture of the actual environment where the cleaning robot is located is taken by an image acquisition device (such as a monocular camera) arranged on top of the cleaning robot with its lens facing vertically upward; the picture shows the lower surface of a horizontal plane above the top of the cleaning robot. Straight-edge features are extracted from the acquired picture; a main direction is determined according to the direction of the longest straight-edge feature, the main direction being parallel or perpendicular to that direction; and the current orientation of the cleaning robot is adjusted to the main direction. The image acquisition device may be a monocular camera, a binocular camera, or a depth camera. Because the device is mounted vertically upward on top of the cleaning robot, the optical axis of the lens is perpendicular to the ceiling, so a straight edge on the ceiling, or the straight boundary line between the ceiling and a wall surface, can be acquired simply and conveniently, and the acquired picture contains straight-edge features. This arrangement has two advantages. First, obstacles on the ground are often numerous and complex, which would increase the computational burden. Second, and more importantly, since the optical axis of the vertically upward image capturing device is perpendicular to the horizontal ceiling or bed bottom (i.e. the lower surface of the horizontal plane above the top of the cleaning robot), the direction of the longest straight-edge feature in the picture is identical to the direction of the corresponding straight edge in the actual environment. The complicated conversion from the direction of a straight-edge feature in the picture to the direction of the straight edge in the actual environment is therefore avoided: the direction of the longest straight-edge feature in the picture can be taken directly as the direction of the longest straight edge it represents, so the main direction can be determined directly from the longest straight-edge feature. Moreover, rooms are generally rectangular, so the boundary line between the ceiling and a wall surface can usually serve as the longest straight edge and its feature can be obtained simply and conveniently; the method therefore has a small data-processing load, high processing speed, low cost, and good economy.
In one embodiment of the present invention, fig. 3b is a schematic diagram of the angle between a straight-edge feature and the orientation of the cleaning robot. As shown in fig. 3b, determining the main direction according to the slope k of the longest straight-edge feature includes: calculating the slope k of the longest straight-edge feature; determining, from k, the angle α between the projection of the longest straight edge on the floor and the current orientation of the cleaning robot; and determining the main direction from α so that the main direction is parallel or perpendicular to the direction of the longest straight-edge feature. The longest straight-edge feature in the photograph satisfies the equation y = kx + b; since the optical axis of the image capturing device is perpendicular to the ceiling, the slope k of the longest straight-edge feature in the photograph is the same as the slope of the corresponding straight edge in the actual environment.
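A worked sketch of this computation follows, under the stated assumption that the image x axis is aligned with the robot's current orientation; the names are illustrative.

```python
import math

def included_angle_deg(k: float) -> float:
    """Angle alpha between the longest straight-edge feature y = k*x + b
    (image x axis assumed aligned with the robot's current orientation)
    and the robot's current orientation."""
    return math.degrees(math.atan(k))

# Example: slope k = 1 gives alpha = 45 degrees; rotating 45 degrees makes
# the robot parallel to the straight edge, rotating -45 degrees perpendicular.
alpha = included_angle_deg(1.0)  # 45.0
```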
As an alternative embodiment, when the image capturing device is a depth camera and/or a binocular camera, acquiring the environment image of the actual environment and extracting straight-edge features from it includes: taking a second photograph with the depth camera and/or binocular camera (this belongs to the aforementioned class of photographs). A second picture taken by the depth camera contains 3D information of the actual environment, so the straight-edge features can be determined directly from the 3D information in the second picture. The 3D information of a pixel determines its three-dimensional coordinates in the spatial coordinate system: the slope of a straight-edge feature can be determined from the 3D information of two pixels lying on it, and its length and direction from the pixels at its end points. Since second pictures taken by binocular and depth cameras carry 3D information themselves, the length and direction of the straight-edge features, and hence the main direction, can be determined by analyzing the second pictures directly. It should be noted that a second photo taken by a binocular camera records 2D information of the actual environment; to obtain 3D information of a pixel, the same feature point is photographed by two lenses at different positions (the binocular pair), the matching pixels of that feature point are found in the two photos, and the 3D information is then calculated using the distance between the two lenses. Photographs taken by a binocular camera are, moreover, sensitive to lighting and shooting conditions, and its pixel acquisition is slightly inferior to a depth camera's, so a depth camera, with its reliable effect and stable performance, is preferred in this embodiment.
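As a minimal sketch, assuming the camera-frame 3D coordinates of the two end-point pixels are already available:

```python
import numpy as np

def edge_length_and_direction(p1, p2):
    """Length and unit direction of a straight-edge feature from the 3D
    coordinates of its two end-point pixels (camera frame assumed)."""
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    v = p2 - p1
    length = float(np.linalg.norm(v))
    if length == 0.0:
        raise ValueError("end points coincide")
    return length, v / length
```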
As an optional embodiment, when a depth, monocular, or binocular camera is used as the image capture device, a wide-angle lens may be used so that more straight-edge features are captured at once, i.e. a wider viewing angle is photographed. Optically, however, a wide-angle lens enlarges the viewing angle at the cost of relatively severe image distortion, so during processing the curves in the picture are distortion-corrected and the corrected straight lines are determined; straight-edge features are then extracted from both the corrected lines and the lines already straight in the image. For example, a distorted line is first restored to a straight-edge feature, and the subsequent processing steps are then performed.
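A minimal sketch of the correction step, assuming the camera matrix and distortion coefficients were obtained beforehand by a standard calibration (the values below are placeholders):

```python
import cv2
import numpy as np

# Placeholder intrinsics; real values come from a prior calibration.
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.30, 0.08, 0.0, 0.0, 0.0])  # barrel distortion

def undistort(photo: np.ndarray) -> np.ndarray:
    """Straightens lines curved by the wide-angle lens so straight-edge
    features can be extracted from the corrected image."""
    return cv2.undistort(photo, camera_matrix, dist_coeffs)
```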
As an alternative embodiment, determining the main direction within the actual environment of the cleaning robot comprises: acquiring an environment image of the actual environment with the image acquisition device; extracting, from the acquired image, the straight-edge feature corresponding to the longest straight edge of the longest obstacle; and determining the main direction from that straight-edge feature. The environment image may be a 2D or 3D image, and the image acquisition device may be at least one of a depth camera, a binocular camera, or a monocular camera: it first acquires an environment image of the actual environment, from which straight-edge features are analyzed and extracted. A straight-edge feature here refers to the straight-line information, contained in the environment image, of a fixed obstacle in the actual environment around the cleaning robot, for example the boundary line between a wall and the floor or between a wall and the ceiling, a straight ceiling pattern, the straight edge of a windowsill or window frame, or an elongated fluorescent lamp on the ceiling. Such straight lines usually indicate the orientation of the room: wall-floor and wall-ceiling boundary lines are usually parallel to a direction of the room, and ceiling patterns, windowsill or window-frame edges, and elongated fluorescent lamps also often coincide with one of the room's directions of extension. After the straight-edge features are acquired from the environment image, the main direction is determined from the longest one, the main direction being perpendicular or parallel to it.
As an alternative embodiment, dividing the cleaning area according to the main direction includes: determining the shape and size of the divided cleaning areas; and dividing the cleaning area according to the main direction and the determined shape and size. Because the cleaning robot has a certain positioning accuracy and a certain working-path planning accuracy, the size of the cleaning areas is set before division is performed; it may, for example, be a set area, or a set proportion (such as a proportion of the length of the divided cleaning region, or a real-to-virtual proportion) that determines the region size. The shape of the divided cleaning areas may be a tessellation of identical shapes, common examples being the rectangle, regular triangle, square, and regular hexagon, or a tiling of several different shapes. Optionally, the cleaning areas in this embodiment are rectangular, which makes programming and computation convenient and effectively improves running speed.
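A hedged sketch of rectangular division aligned with the main direction follows; the region dimensions and the rotated-frame convention are assumptions for illustration.

```python
import math

def region_origin(px: float, py: float, main_dir_rad: float,
                  w: float = 4.0, h: float = 4.0):
    """Lower-left corner of the rectangular cleaning region containing the
    robot position (px, py), expressed in a frame whose x axis lies along
    the main direction; w and h are assumed region dimensions in meters."""
    c, s = math.cos(-main_dir_rad), math.sin(-main_dir_rad)
    u = c * px - s * py  # rotate the world point into the main-direction frame
    v = s * px + c * py
    return (math.floor(u / w) * w, math.floor(v / h) * h)
```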
As an alternative embodiment, cleaning according to the divided cleaning areas includes: determining the included angle between the current orientation of the cleaning robot and the main direction; and adjusting the current orientation to a starting orientation according to that angle, the starting orientation being the orientation in which the robot begins cleaning the area. In the present invention the starting orientation can be regarded as the orientation after the current orientation has been adjusted to the main direction, so the starting orientation is parallel to the main direction ("parallel to the main direction" in the present invention includes both the same direction as and the opposite direction to the main direction); the cleaning area is then cleaned from the adjusted starting orientation. Many cleaning patterns are possible within a cleaning area, for example gradually approaching the center, gradually moving away from the center, or an efficient zigzag (boustrophedon) pattern. In this embodiment a virtual area of the cleaning area is cleaned first, and any uncleaned portion inside it is then cleaned by a conventional method. A zigzag pattern, for example, effectively cleans the contact range between adjacent cleaning areas; that contact range is where the robot turns or reverses, which easily produces cleaning dead angles and poor cleaning quality. The starting orientation of the robot is therefore made perpendicular or parallel to the main direction, and because manual adjustment of the robot's orientation is too inaccurate, automatic adjustment is used: the included angle between the current orientation and the main direction is determined, and the robot then rotates by that angle so that its orientation is the same as, or opposite to, the main direction.
As an alternative embodiment, the cleaning robot divides the cleaning area according to the main direction before or after cleaning according to the divided cleaning area. In the divided cleaning areas, cleaning is performed in a zigzag route with the main direction as a starting direction.
As an alternative embodiment, the cleaning robot divides the cleaning area according to the main direction before or after cleaning according to the divided cleaning area. Within the divided cleaning area, it travels with the main direction as the starting direction until reaching the obstacle ahead or the boundary of the divided area, rotates 90 degrees, and then cleans along a zigzag route, as shown in fig. 11.
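By way of illustration only, the zigzag route within one divided rectangular cleaning area can be sketched as follows, taking the main direction as +x; the lane spacing and region dimensions are assumptions.

```python
def zigzag_waypoints(width: float, height: float, lane: float = 0.3):
    """Waypoints covering a width x height rectangle: straight passes along
    the main direction (+x), shifted sideways by one cleaning width (lane,
    in meters) at each boundary, reversing direction each pass."""
    points, y, forward = [], 0.0, True
    while y <= height:
        x_start, x_end = (0.0, width) if forward else (width, 0.0)
        points.append((x_start, y))
        points.append((x_end, y))
        y += lane
        forward = not forward
    return points
```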
As an alternative embodiment, before or after cleaning according to the divided cleaning areas, the method further comprises: judging whether the cleaned areas cover the whole sub-area (a sub-area in the present invention is a part of the whole cleaning area separated by walls, doors, and windows; bedrooms, living rooms, toilets, kitchens, and studies are all sub-areas); and, when the cleaned areas do not yet cover the sub-area, entering an adjacent uncleaned range and repeating the above operations until the cleaned areas cover the whole sub-area. In this embodiment one cleaning area is divided at a time, and the cleaning robot divides the next cleaning area only after the current one is finished. The robot can thus adapt at any time to the cleaning conditions within the sub-area: if the enclosed extent of the sub-area changes, pre-divided cleaning areas become meaningless, so dividing while cleaning gives the robot the ability to adapt on the fly and effectively improves its cleaning efficiency and adaptability. The "area to be cleaned" and the "cleaning area" in the present invention refer to the same area; the only difference is that the area to be cleaned is the area the cleaning robot plans to clean, while the cleaning area may already have been cleaned in whole or in part, so the two concepts coincide in most cases.
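For illustration, the divide-then-clean loop described above can be sketched as follows; the robot interface and the region sequence are hypothetical stand-ins, not the patent's modules.

```python
def clean_sub_area(robot, main_direction):
    """Hypothetical sketch: divide one cleaning area at a time and clean it,
    entering an adjacent uncleaned range until the sub-area is covered."""
    region = robot.divide_next_region(main_direction)   # first cleaning area
    while region is not None:
        robot.clean_region(region, main_direction)      # clean the divided area
        # Move into an adjacent uncleaned range and divide the next area;
        # None means the cleaned areas now cover the whole sub-area.
        region = robot.divide_next_region(main_direction)
```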
A cleaning method of the cleaning robot is described in detail below as a concrete example of this embodiment.
This embodiment provides a region division method for a cleaning robot in which obstacle information around the robot is acquired by methods such as time-of-flight (TOF) ranging, laser ranging, and feature acquisition with a camera, and an electronic map is established. Based on the direction of the wall (the longest straight edge of the longest obstacle) extracted from the electronic map before cleaning starts, the unknown environment is divided into regions more reasonably and cleaned region by region, which improves cleaning efficiency and reduces both missed regions and repeatedly covered regions. Note that TOF and lidar are ranging methods, while the camera takes pictures and is not a ranging method. Depending on the acquisition device, this embodiment has two different implementations, described below.
First implementation: fig. 4 is a flowchart of the region division method of the embodiment shown in fig. 3b, in which a picture of the actual environment of the cleaning robot is taken by an image capture device arranged on top of the robot with its lens facing vertically upward; the picture shows the lower surface of a horizontal plane above the top of the robot. As shown in fig. 4, the method comprises the following specific steps:
Step S401: an image capturing device (such as a video camera or a still camera) acquires an environment image containing information on the environment around the cleaning robot, where at least one environment image contains straight-edge features of that environment.
Step S402: the cleaning robot extracts the straight-edge features from the environment image.
Step S403: the cleaning robot calculates the angle between its current orientation and the longest straight-edge feature. Since the optical axis of the image capture device is perpendicular to the ceiling, the slope k of the longest straight-edge feature in the photograph is the same as the slope of the corresponding straight edge in the actual environment, and the longest straight edge on the ceiling is parallel to its projection onto the floor; the angle between the current orientation and the longest straight-edge feature is therefore the angle between the current orientation and the longest straight edge itself.
Step S404: the current orientation of the cleaning robot is adjusted so that it is parallel to a main direction, the main direction being co-directional with or perpendicular to the longest straight-edge feature.
Step S405: the cleaning robot performs the cleaning task along the adjusted starting orientation (i.e. the main direction). Specifically, the cleaning task may be any of the following: the robot travels along the adjusted starting orientation to the straight edge ahead, turns 90°, delimits an area along the straight edge (when the adjusted direction is at right angles to the straight-edge feature), and cleans along a zigzag path; or the robot divides the cleaning area according to the starting orientation (i.e. the main direction), then turns 90° from the adjusted starting orientation to reach the boundary of the divided cleaning area (which may be the long edge of an actual obstacle delimiting the area, or a virtual boundary, i.e. the boundary of a divided virtual area or range), and cleans along that boundary in a zigzag path; or it travels along the adjusted starting orientation to the obstacle ahead, turns 90°, first delimits an area, and then cleans the delimited area along a zigzag path; or it travels along the adjusted starting orientation to the obstacle ahead and cleans along a zigzag path (when the adjusted main direction is substantially co-directional with, or substantially perpendicular to, the straight-edge feature). Unless otherwise stated, a 90° turn or rotation may be clockwise or counterclockwise.
In order to ensure that straight-edge features usable for adjusting the orientation of the cleaning robot in the later steps S403 and S404 can be extracted in step S402, the manner of acquiring an environment image that meets the above requirements differs somewhat for the different forms of image acquisition apparatus in step S401, as described in detail below.
The distance of scenery or features in the image can be derived from a second picture taken by a depth camera (such as an RGB-D camera or a TOF camera), so 3D information of each pixel, and hence the distance and angle of the scenery or features in the second picture, can be obtained. Therefore, if the capture unit is a depth camera, the angle between the capture unit (i.e. the cleaning robot) and a straight-edge feature in the image can be obtained from a single second picture, completing the task of step S403.
For a binocular ordinary camera as the shooting unit: since a binocular camera can calculate three-dimensional distance information, 3D information can be recognized from two second pictures, taken at one position, that contain the same straight line of a fixed obstacle, and the included angle between that straight line (represented by a straight-edge feature in the second pictures) and the orientation of the cleaning robot can be calculated.
For a monocular ordinary camera serving as the shooting unit, when the camera is not mounted vertically upward on the cleaning robot as described above, the relative position and angle between a straight-edge feature in a single first picture and the actual straight line of the fixed obstacle cannot be determined, so the included angle between the straight line and the robot's orientation cannot be recovered from one first picture alone. Instead, the robot can move to several positions and/or take several first pictures of the same straight line of the fixed obstacle from several angles, and by jointly computing the robot's position and angle parameters over those positions and/or pictures, obtain the straight-line information of the fixed obstacle and hence the included angle between the straight line and the robot's orientation.
Second implementation: a region division method for a cleaning robot, comprising the following specific steps:
Step S501: the cleaning robot rotates once in place so that its ranging device (such as TOF or lidar) obtains ranging information between the robot and the nearby obstacles around it, and an electronic map is built (fig. 5 is a schematic diagram of the initial environment of the cleaning robot according to an embodiment of the invention).
Step S502: the cleaning robot extracts straight-edge features, such as the obstacle features in fig. 5, from the electronic map; a straight-edge feature may be, for example, the projection onto the floor of the side of a nearby wall facing the cleaning robot, or of the side of nearby furniture, such as a cabinet or bed, facing the robot.
Step S503: the cleaning robot calculates the angle between its current orientation and the longest straight-edge feature. Because object features in the electronic map have the same direction and shape as in the actual environment, differing only in scale, the longest straight-edge feature in the map corresponds to the longest straight edge in the actual environment and has the same direction as that straight edge.
Step S504: the cleaning robot adjusts its current orientation so that it is parallel to a main direction, the main direction being co-directional with or perpendicular to the longest straight-edge feature.
Step S505: the cleaning robot performs the cleaning task along the adjusted starting orientation (i.e. the main direction).
Specifically, the cleaning robot mainly uses a positioning sensor and an obstacle detection sensor (such as collision, infrared, ultrasonic, etc.) to sense and clean the whole unknown area, wherein the positioning sensor provides information on the position and posture (orientation) of the cleaning robot, and the obstacle detection sensor provides information on obstacles around the cleaning robot as the input of the electronic map updating and path planning module. The electronic map updating and the path planning are synchronously carried out, the cleaned area is updated to the electronic map while the area is cleaned, and the subsequent repeated cleaning is prevented.
Fig. 6 is a flowchart of a region-division method for a cleaning robot according to an embodiment of the present invention. As shown in fig. 6, for any unknown environment, the whole cleaning process of the cleaning robot mainly comprises region division, edgewise cleaning and region cleaning, and proceeds as follows:
step S601: initializing a program, and determining relevant parameters such as the size of an area, the resolution of a map, the distance between cleaning routes and the like;
step S602: executing an area dividing program to determine the position of a first cleaning area;
step S603: performing an edgewise cleaning procedure, making one lap along the cleaning-area boundary;
step S604: executing an area-cleaning program to supplementarily clean the uncleaned parts inside the cleaning area;
step S605: updating the uncleaned areas around the current area and adding them to the list; if the list contains no uncleaned area, going to step S606; if it does, returning to step S603;
step S606: finishing the whole cleaning process.
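The overall flow of steps S601-S606 can be summarised in a short control-loop sketch (all function names are hypothetical):

    def run_cleaning(robot):
        params = robot.initialize()              # S601: area size, map resolution, route spacing
        first = robot.divide_first_area(params)  # S602: region division around the start pose
        pending = [first]
        while pending:                           # S605: loop until no uncleaned area remains
            area = pending.pop()
            robot.clean_edgewise(area)           # S603: one lap along the area boundary
            robot.clean_interior(area)           # S604: supplementary interior cleaning
            pending.extend(robot.new_uncleaned_areas(area))
        robot.finish()                           # S606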
It should be noted that the method of this embodiment assumes the initial environment of the area to be cleaned is as shown in fig. 5. A coordinate system (world coordinate system) is established with the heading at the robot's initial position as the x-axis; the whole cleaning process can then be divided into the following three stages:
firstly, dividing a cleaning area.
Before the cleaning task is executed, preprocessing is performed: the whole environment is divided into areas according to the current position of the cleaning robot. Area division can be broken down into the following steps:
step S701: the cleaning robot rotates one full circle in place, obstacle information around it is acquired by the TOF ranging sensor, an electronic map is built, and the process proceeds to step S702. Fig. 7 is a schematic diagram of a cleaning robot performing TOF ranging according to an embodiment of the present invention; the ranging range of the TOF sensor is shown in fig. 7.
In this embodiment, a local grid map of the robot's surroundings can be built from the obstacle information acquired by the TOF ranging sensor. Compared with a laser radar, the TOF sensor is markedly lower in measurement accuracy, update frequency and range, yet it is much better than an ordinary infrared ranging sensor, so one full rotation suffices to place the obstacle information in the electronic map. A schematic electronic map corresponding to the environment of fig. 7 is shown in fig. 8 (fig. 8 is a schematic diagram of an electronic map according to an embodiment of the invention). In actual use the map grid is larger and of higher resolution, and it also contains some noise. Of course, other distance-measuring devices such as a laser radar may be used instead to implement the technical solution of this embodiment.
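A minimal occupancy-grid sketch of this one-rotation scan might look as follows (grid size, resolution and names are illustrative assumptions):

    import numpy as np

    def build_grid(scans, resolution=0.05, size=200):
        """scans: list of (yaw, range) pairs collected while the robot
        rotates in place; marks one occupied cell per valid return.
        The robot sits at the grid centre; resolution is metres per cell."""
        grid = np.zeros((size, size), dtype=np.uint8)
        cx = cy = size // 2
        for yaw, r in scans:
            if not np.isfinite(r):               # no return at this angle
                continue
            gx = cx + int(round(r * np.cos(yaw) / resolution))
            gy = cy + int(round(r * np.sin(yaw) / resolution))
            if 0 <= gx < size and 0 <= gy < size:
                grid[gy, gx] = 255               # occupied cell (obstacle hit)
        return grid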
Step S702: image processing (for example, a Hough transform) is performed on the electronic map built in step S701, and the obstacle edge with the longest detectable extent within the identifiable range is taken as the longest straight-edge feature; for example, the Hough transform extracts the equation, in the world coordinate system, of the straight line that best fits the obstacle grid cells. The angle between the robot's current orientation and the direction of the longest straight-edge feature is then calculated, the direction for which this angle is smaller than 90 degrees is taken as the main direction, and the process proceeds to step S703. A schematic diagram of the main direction is shown in fig. 9 (fig. 9 is a schematic diagram of determining the main direction on the electronic map according to an embodiment of the present invention). The processing method is not limited to the Hough transform; other methods capable of detecting line features, such as a gradient descent method, may also be used.
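As one possible realisation (the OpenCV calls and thresholds below are illustrative choices, not the patent's), the longest straight-edge feature and the main direction could be obtained as follows:

    import cv2
    import math
    import numpy as np

    def main_direction(grid, robot_yaw):
        """grid: uint8 occupancy image (obstacles = 255).
        Returns the main direction: the direction of the longest detected
        segment, folded so its angle to the robot heading is < 90 deg."""
        lines = cv2.HoughLinesP(grid, rho=1, theta=np.pi / 180,
                                threshold=20, minLineLength=15, maxLineGap=3)
        if lines is None:
            return robot_yaw                 # no line found: keep the heading
        # Pick the longest detected segment.
        x1, y1, x2, y2 = max(lines[:, 0, :],
                             key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
        line_yaw = math.atan2(y2 - y1, x2 - x1)
        # A line direction is defined mod 180 deg; fold it to within
        # 90 deg of the current heading.
        diff = (line_yaw - robot_yaw + math.pi / 2) % math.pi - math.pi / 2
        return robot_yaw + diff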
Step S703: a first rectangular cleaning area is delimited with the current position of the cleaning robot as its centre, its two sides parallel or perpendicular to the main direction (in other embodiments, each cleaning area has a fixed size, e.g., 4 m × 5 m). The result of dividing the above environment in this way, with the first area centred on the robot and all areas of fixed size, is shown in fig. 10 (fig. 10 is a schematic diagram of area division along the main direction according to an embodiment of the present invention).
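A minimal sketch of delimiting such a rectangle (the 4 m × 5 m size follows the example above; all names are illustrative):

    import math

    def area_corners(cx, cy, main_dir, width=4.0, height=5.0):
        """Corners of a rectangle centred at (cx, cy) whose sides are
        parallel and perpendicular to the main direction."""
        c, s = math.cos(main_dir), math.sin(main_dir)
        corners = []
        for dx, dy in [(-width / 2, -height / 2), (width / 2, -height / 2),
                       (width / 2, height / 2), (-width / 2, height / 2)]:
            # Rotate the local offset by the main direction, then translate.
            corners.append((cx + dx * c - dy * s, cy + dx * s + dy * c))
        return corners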
Secondly, according to the current position of the cleaning robot, edgewise cleaning is performed on the cleaning area corresponding to that position (such as area A in fig. 10). This can be divided into the following steps (a simplified state-machine sketch is given after these steps):
step S801: judging whether the current position of the cleaning robot is in the target cleaning area; if so, go to step S802; if not, go to step S803. The target cleaning area may be an objective cleaning area in the actual environment, or a cleaning area that the cleaning robot has autonomously delimited.
Step S802: find the nearest cleaning-area boundary and set the robot's direction of motion to the main direction that points toward, and is perpendicular to, that boundary (in this embodiment, the nearest boundary may be regarded as the longest straight edge, and the direction perpendicular to the longest straight-edge feature corresponding to that boundary in the environment image may be determined as the main direction). The robot then rotates its current orientation to this main direction as the starting orientation and moves forward along it, as shown in fig. 11 (fig. 11 is a schematic diagram of the cleaning robot finding the cleaning-area boundary according to an embodiment of the present invention). If the robot reaches the cleaning-area boundary smoothly or meets an obstacle ahead, the motion direction is updated to the direction along that boundary (i.e., the robot rotates 90 degrees clockwise or anticlockwise), the robot rotates to the new motion direction, the current position and posture are recorded as the area-division starting pose, and the process goes to step S804. If a collision occurs before the boundary is reached, the robot is rotated according to the collision-sensor signal, the current position and posture are recorded as the area-division starting pose, and the process goes to step S806.
Step S803: in this case, at least one of the four boundaries of the cleaning area has been cleaned before and is therefore a reachable position; navigate to it. If the position is reached successfully, update the robot's motion direction to the direction along that boundary and go to step S804. If a collision occurs while navigating there, move along the obstacle for 3 s and increment the collision count; if the count is below 3, return to step S801; if it is 3 or more, the boundary of the target area is deemed unreachable because of a dynamic obstacle, and the process goes to step S807.
Step S804: move along the direction corresponding to the boundary. If the next boundary of the cleaning area is reached smoothly, update the motion direction to the direction along that next boundary and go to step S805. If a collision occurs before the target boundary is reached, rotate the robot according to the collision-sensor signal and go to step S806. If the distance between the current position and the area-division starting position is detected to be below a threshold (e.g., 10 cm) and/or the change of the current orientation relative to the starting pose exceeds a certain threshold, go to step S807.
Step S805: the cleaning robot is rotated to the main direction, and the process returns to step S804.
Step S806: move along the obstacle (e.g., a wall, a chair leg, a cabinet) according to the signal of the obstacle-detection sensor. If a collision occurs during the movement, rotate the robot according to the collision-sensor signal and repeat step S806. If the robot is found to cross the cleaning-area boundary while moving, update the motion direction to the direction along that boundary and return to step S805. If the obstacle sensor detects no obstacle for a certain continuous time, return to step S802. If the distance between the current position and the area-division starting position falls below a threshold (e.g., 10 cm) and/or the change of the current orientation relative to the starting pose exceeds a certain threshold, go to step S807.
Step S807: the edgewise operation for the area is ended.
With respect to the environment shown in fig. 10, the result after the above steps is shown in fig. 12 (fig. 12 is a schematic diagram of the cleaning robot having found the boundary of the cleaning area in which it is located, according to an embodiment of the present invention).
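The sketch referred to above reduces steps S801-S807 to their state transitions (the robot API is hypothetical; sensor calls are placeholders):

    def edgewise_clean(robot, area):
        """Simplified boundary lap: head to the nearest boundary, then
        follow the boundary/obstacle contour until back near the start."""
        state = "FIND_BOUNDARY"                  # ~S802
        start_pose = None
        while state != "DONE":                   # ~S807
            if state == "FIND_BOUNDARY":
                robot.rotate_to(area.main_direction_toward_boundary())
                hit = robot.forward_until_boundary_or_bump()
                start_pose = robot.pose()
                state = ("FOLLOW_BOUNDARY" if hit == "boundary"
                         else "FOLLOW_OBSTACLE")
            elif state == "FOLLOW_BOUNDARY":     # ~S804/S805
                robot.rotate_to(area.boundary_direction(robot.pose()))
                hit = robot.forward_until_next_boundary_or_bump()
                if hit == "bump":
                    state = "FOLLOW_OBSTACLE"
            elif state == "FOLLOW_OBSTACLE":     # ~S806
                robot.follow_obstacle_step()
                if robot.crossed_boundary(area):
                    state = "FOLLOW_BOUNDARY"
                elif robot.obstacle_lost():
                    state = "FIND_BOUNDARY"
            # Loop closure: back within ~10 cm of the starting pose => done.
            if start_pose and robot.near(start_pose, tol=0.10):
                state = "DONE"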
And thirdly, after the edgewise lap is finished, the interior of the area just traced is cleaned. The main function of this stage is to perform supplementary cleaning of the unknown part inside an area whose boundary has already been cleaned (a simplified zigzag sketch is given after step S908).
Step S901: judging whether an edge still exists in the edge list; if so, go to step S902; if not, go to step S908;
step S902: traversing the edge list to obtain the midpoint of each edge, calculating the distance between the cleaning robot's current position and each midpoint, taking the edge length minus this distance as the weight, finding the target edge with the maximum weight, and selecting the endpoint of the target edge closer to the cleaning robot;
step S903: navigating to the target endpoint via a navigation algorithm. If the target endpoint is reached successfully, go to step S904. If a collision occurs on the way, move along the obstacle for 3 s and increment the collision count; if the count is below 3, repeat step S903; if it reaches 3, delete the edge from the list and return to step S901;
step S904: determining the motion direction and the cleaning direction of the cleaning robot from the direction of the edge, where each may be any of the four directions parallel or perpendicular to the edge, but the motion direction and the cleaning direction must be perpendicular to each other;
step S905: rotating the cleaning robot to the motion direction and moving forward along it. If the area ahead is detected to have been cleaned already, go to step S906. If the boundary in the motion direction is detected, judge whether the next cleaning route would cross the boundary in the cleaning direction: if so, return to step S901; otherwise, go to step S906. If a collision occurs while moving, likewise judge whether the next cleaning route would cross the boundary in the cleaning direction: if so, return to step S901; otherwise, go to step S907;
step S906: rotating the cleaning robot toward the cleaning direction and moving forward. If the robot successfully reaches the next cleaning route, reverse the motion direction and return to step S905. If a collision occurs while moving forward, an obstacle lies on the route: rotate the cleaning robot according to the collision-sensor signal, set the obstacle-following side (left or right), record the current position, and go to step S907;
step S907: moving along the obstacle. If a collision occurs during the movement, rotate the cleaning robot according to the collision-sensor signal and repeat step S907. If the robot is detected to have moved back to the previous cleaning route, return to step S901. If the robot is detected to be on the current cleaning route with its displacement along the motion direction exceeding the threshold, return to step S905. If the robot is detected to have moved to the next cleaning route, reverse the motion direction and return to step S905;
step S908: and finishing the cleaning of the area.
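A simplified sketch of the edge-weighting rule of step S902 and the zigzag pass of steps S904-S908 follows (all names are hypothetical; edges are assumed to expose endpoints p1/p2 and a length() method):

    import math

    def pick_target_edge(edges, robot_xy):
        """S902: weight = edge length - distance(robot, edge midpoint);
        pick the edge with maximum weight, then its nearer endpoint."""
        def weight(edge):
            mx = (edge.p1[0] + edge.p2[0]) / 2
            my = (edge.p1[1] + edge.p2[1]) / 2
            return edge.length() - math.dist(robot_xy, (mx, my))
        edge = max(edges, key=weight)
        return edge, min((edge.p1, edge.p2),
                         key=lambda p: math.dist(robot_xy, p))

    def zigzag(robot, area, route_spacing=0.25):
        """S904-S908: sweep back and forth along the motion direction,
        stepping one route spacing in the cleaning direction each turn."""
        forward = True
        while area.next_route_inside():          # stop before crossing the boundary
            robot.drive_route(forward)           # S905: one pass along the route
            robot.step_cleaning_direction(route_spacing)  # S906: shift to next route
            forward = not forward                # reverse the motion direction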
Fig. 13 is a schematic diagram of a partial area cleaned by the cleaning robot according to an embodiment of the present invention; fig. 14 is a schematic diagram of the remaining area cleaned by the cleaning robot according to an embodiment of the present invention. As shown in figs. 13 and 14, this achieves the effect of area cleaning.
As can be seen from the overall flowchart of this embodiment, after area A in fig. 13 is cleaned, the whole environment can be covered by repeating the edgewise cleaning and area cleaning on areas B, C and D.
Suppose instead that the main direction is not obtained first, and that a cleaning area of the same size is delimited directly around the cleaning robot's position as the centre, with its two sides parallel to the x-axis and y-axis of the world coordinate system. The result of dividing the whole environment in this way is shown in fig. 15 (fig. 15 is a schematic diagram of the cleaning robot dividing cleaning areas along an arbitrary robot orientation, according to an embodiment of the present invention; the sizes of the areas other than areas A and D are simplified in the figure). Although the area size and the division procedure are unchanged, the whole environment ends up divided into 8 cleaning areas and, worse, the effective portions of two of them are very small; running the path-planning algorithm to clean within such areas clearly reduces efficiency. This phenomenon is completely avoided by first extracting the wall direction and then rotating the cleaning areas accordingly.
If the areas are not re-divided according to the obstacle information before edgewise cleaning, too many sub-areas result, which degrades cleaning efficiency and user experience.
Fig. 16 is a flowchart of another operation method of a cleaning robot according to an embodiment of the present invention. As shown in fig. 16, according to another aspect of an embodiment of the present invention, an operation method of a cleaning robot is provided, comprising the following steps:
step S162, taking a photo of the actual environment where the cleaning robot is located with an image acquisition device that is arranged at the top of the cleaning robot and whose lens points vertically upward, wherein the photo is a picture of the lower surface of a horizontal plane above the top of the cleaning robot; the photos comprise a first photo and/or a second photo, the first photo being taken by a monocular camera and the second photo by a binocular camera and/or a depth camera.
Step S164, extracting straight edge features from the acquired photos;
step S166, determining a main direction according to the direction of the longest straight-edge feature among the straight-edge features, wherein the main direction is parallel or perpendicular to the direction of the longest straight-edge feature;
in step S168, the current orientation of the cleaning robot is adjusted to the main direction (i.e., parallel to the main direction, including the same direction or the opposite direction).
In step S162, since the lens of the image acquisition device points vertically upward from the top of the cleaning robot, it photographs the lower surface of the horizontal plane above the robot, which may be a ceiling (if the robot is operating in an open area of a room) or the underside of a table or bed (if it is operating below one).
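Because the lens points straight up and is fixed to the robot, a line's direction in the photo maps directly to a yaw offset from the robot's heading. A minimal sketch (the OpenCV calls and thresholds are illustrative assumptions) might look as follows:

    import cv2
    import math
    import numpy as np

    def yaw_adjustment_from_ceiling(photo_gray):
        """Signed rotation (radians) that aligns the robot heading with the
        nearest direction parallel or perpendicular to the longest straight
        edge seen in an upward-facing photo. The image x-axis is assumed to
        coincide with the robot heading (fixed upward-pointing camera)."""
        edges = cv2.Canny(photo_gray, 50, 150)
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                                minLineLength=60, maxLineGap=5)
        if lines is None:
            return 0.0                       # no straight edge: keep heading
        x1, y1, x2, y2 = max(lines[:, 0, :],
                             key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
        angle = math.atan2(y2 - y1, x2 - x1)
        # Fold to the nearest of {0, 90, 180, 270} deg relative to the
        # heading: the main direction may be parallel or perpendicular
        # to the detected edge.
        return (angle + math.pi / 4) % (math.pi / 2) - math.pi / 4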
Fig. 17 is a block diagram of a cleaning robot according to an embodiment of the present invention. As shown in fig. 17, according to another aspect of an embodiment of the present invention, a cleaning robot is provided comprising an operation control device 170, the operation control device 170 comprising an acquisition module 172, an extraction module 174, a determination module 176 and an adjustment module 178. The operation control device 170 is described in detail below.
The acquisition module 172 is configured to acquire an environment image of the actual environment where the cleaning robot is located. The extraction module 174, connected to the acquisition module 172, extracts straight-edge features from the acquired image. The determination module 176, connected to the extraction module 174, determines, from the direction of the longest straight-edge feature, the corresponding straight-edge direction in the actual environment and determines a main direction from it, the main direction being parallel or perpendicular to the straight-edge direction. The adjustment module 178, connected to the determination module 176, adjusts the current orientation of the cleaning robot to the main direction.
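Purely as a structural illustration (all types and methods here are assumptions, not the disclosed implementation), the four modules chain together as follows:

    from dataclasses import dataclass
    from typing import Any

    @dataclass
    class OperationControlDevice:
        """Structural sketch of device 170 and its four modules."""
        acquisition: Any    # module 172: grabs the environment image
        extraction: Any     # module 174: finds straight-edge features
        determination: Any  # module 176: longest edge -> main direction
        adjustment: Any     # module 178: rotates the robot to that direction

        def run_once(self, robot: Any) -> None:
            image = self.acquisition.acquire(robot)
            edges = self.extraction.extract(image)
            main_dir = self.determination.main_direction(edges, robot.yaw)
            self.adjustment.rotate_to(robot, main_dir)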
According to another aspect of the present invention, a cleaning robot is provided, comprising a power supply assembly, a control assembly, an operation assembly, a cleaning assembly and a sensor, wherein the control assembly comprises a processor. The power supply assembly supplies power to the cleaning robot; the sensor acquires an environment image of the actual environment where the cleaning robot is located; the processor runs a program which, when running, executes the operation method of any one of the above; and the operation assembly, based on the program run by the processor, drives the cleaning robot to adjust its orientation parallel to the main direction.
In actual use of the operation control device, that is, when the cleaning robot is running, the camera, the various sensors, the odometer, the IMU, a chip (an ARM, GPU, DSP or other processor), the wheel assembly, the power assembly and other parts need to operate in coordination.
According to another aspect of an embodiment of the present invention, a cleaning robot is provided, comprising a body, a power supply assembly, a control assembly, an operation assembly, a cleaning assembly and a sensor, wherein the control assembly comprises a processor; the processor is configured to run a program which, when running, executes the operation method described in any one of the above.
According to another aspect of the embodiments of the present invention, there is provided a storage medium including a stored program, wherein when the program is executed, a device in which the storage medium is located is controlled to perform the method of any one of the above.
According to another aspect of the embodiments of the present invention, there is provided a processor for executing a program, wherein the program executes to perform the method of any one of the above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, or a magnetic or optical disk.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the present invention, and these improvements and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A method of operating a cleaning robot, comprising:
acquiring an environment image of an actual environment where the cleaning robot is located;
extracting straight-edge features from the acquired environment image;
determining a straight edge direction corresponding to the direction of the longest straight edge feature in an actual environment according to the direction of the longest straight edge feature in the straight edge features, and determining a main direction according to the straight edge direction, wherein the main direction is parallel to or perpendicular to the straight edge direction;
adjusting a current orientation of the cleaning robot to the primary direction.
2. The method of claim 1, wherein the environmental image is an electronic map, and wherein determining the primary direction from a longest of the straight-sided features comprises:
determining a working range of the cleaning robot in the electronic map.
3. The method of claim 2, wherein the electronic map is derived from ranging information measured by a ranging device.
4. The method of claim 2, wherein the electronic map is acquired by an image acquisition device, an inertial measurement unit, and an odometer.
5. The method of claim 1, wherein the environmental image is a photograph taken by an image capture device of the cleaning robot, the image capture device comprising at least one of a depth camera, a monocular camera, and a binocular camera.
6. The method of claim 5, wherein adjusting the current orientation of the cleaning robot to the primary direction comprises:
determining the shooting orientation when the image acquisition device of the cleaning robot shoots a picture;
determining an orientation difference according to the shooting orientation and the current orientation of the cleaning robot;
determining an adjustment angle from the current orientation to the main direction according to the shooting orientation and the orientation difference;
and adjusting the cleaning robot from the current orientation to the main direction according to the adjustment angle.
7. The method according to claim 5, characterized in that the image acquisition device comprises a depth camera and/or a binocular camera,
the acquiring an environment image of an actual environment where the cleaning robot is located, and extracting straight-edge features from the acquired environment image includes:
the depth camera and/or binocular camera takes a second photograph, wherein the second photograph includes 3D information of the actual environment;
and determining straight-edge features in the second photo according to the acquired 3D information.
8. The method of any one of claims 1 to 7, further comprising:
dividing a cleaning area according to the main direction;
cleaning in a zigzag route in the divided cleaning area with the main direction as the starting direction; or, in the divided cleaning area, taking the main direction as the starting direction, running to a front obstacle or to the boundary of the divided cleaning area, rotating by 90 degrees, and then cleaning in a zigzag route.
9. A method of operating a cleaning robot, comprising:
taking, by an image acquisition device arranged at the top of the cleaning robot with a vertically upward lens, a photo of the actual environment where the cleaning robot is located, wherein the photo is a picture of the lower surface of a horizontal plane above the top of the cleaning robot;
extracting straight-edge features from the acquired photos;
determining a main direction according to a direction of a longest one of the straight-sided features, wherein the main direction is parallel or perpendicular to the direction of the longest straight-sided feature;
adjusting a current orientation of the cleaning robot to the primary direction.
10. A cleaning robot, characterized by comprising: a power supply assembly, a control assembly, an operating assembly, a cleaning assembly, and a sensor, wherein the control assembly includes a processor;
the power supply assembly is used for supplying energy to the cleaning robot;
the sensor is used for acquiring an environment image of the actual environment where the cleaning robot is located;
the processor is used for running a program, wherein the program executes the running method of any one of claims 1 to 8;
the operating assembly drives the cleaning robot, based on the program run by the processor, to adjust its orientation parallel to the main direction.
CN201810880193.2A 2018-08-03 2018-08-03 Operation method and device of cleaning robot Active CN110801180B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810880193.2A CN110801180B (en) 2018-08-03 2018-08-03 Operation method and device of cleaning robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810880193.2A CN110801180B (en) 2018-08-03 2018-08-03 Operation method and device of cleaning robot

Publications (2)

Publication Number Publication Date
CN110801180A true CN110801180A (en) 2020-02-18
CN110801180B CN110801180B (en) 2022-02-22

Family

ID=69486851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810880193.2A Active CN110801180B (en) 2018-08-03 2018-08-03 Operation method and device of cleaning robot

Country Status (1)

Country Link
CN (1) CN110801180B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6690134B1 (en) * 2001-01-24 2004-02-10 Irobot Corporation Method and system for robot localization and confinement
CN102929280A (en) * 2012-11-13 2013-02-13 朱绍明 Mobile robot separating visual positioning and navigation method and positioning and navigation system thereof
CN107003675A (en) * 2014-12-16 2017-08-01 罗伯特·博世有限公司 The method that processing surface is surveyed and drawn for autonomous robot vehicle
CN106527423A (en) * 2015-09-15 2017-03-22 小米科技有限责任公司 Cleaning robot and control method therefor
CN107943036A (en) * 2017-11-28 2018-04-20 深圳市杉川机器人有限公司 Purging zone system of selection and device

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110974091A (en) * 2020-02-27 2020-04-10 深圳飞科机器人有限公司 Cleaning robot, control method thereof, and storage medium
CN111358370A (en) * 2020-03-18 2020-07-03 五邑大学 Floor sweeping robot and walking method thereof
CN111595347A (en) * 2020-06-18 2020-08-28 上海大学 Method and system for determining position of mobile robot
CN115885231A (en) * 2020-07-09 2023-03-31 三菱电机楼宇解决方案株式会社 Cleaning system and program
CN112034837A (en) * 2020-07-16 2020-12-04 珊口(深圳)智能科技有限公司 Method for determining working environment of mobile robot, control system and storage medium
CN112180947A (en) * 2020-10-22 2021-01-05 湖南格兰博智能科技有限责任公司 Method and equipment for selecting initial traveling direction of mobile robot
CN112263188A (en) * 2020-10-22 2021-01-26 湖南格兰博智能科技有限责任公司 Correction method and device for moving direction of mobile robot
CN112180947B (en) * 2020-10-22 2023-09-12 湖南格兰博智能科技有限责任公司 Method and equipment for selecting initial travelling direction of mobile robot
WO2022143284A1 (en) * 2020-12-29 2022-07-07 深圳市杉川机器人有限公司 Method and apparatus for determining movement direction, and sweeping robot and storage medium
CN112773272A (en) * 2020-12-29 2021-05-11 深圳市杉川机器人有限公司 Moving direction determining method and device, sweeping robot and storage medium
CN113768419A (en) * 2021-09-17 2021-12-10 安克创新科技股份有限公司 Method and device for determining sweeping direction of sweeper and sweeper
CN114081400A (en) * 2021-11-16 2022-02-25 深圳市探博智能机器人有限公司 Robot return judgment method and system, cleaning equipment and storage medium
CN114081400B (en) * 2021-11-16 2023-04-14 深圳市探博智能机器人有限公司 Robot return judgment method and system, cleaning equipment and storage medium
CN114569004A (en) * 2022-02-22 2022-06-03 杭州萤石软件有限公司 Traveling direction adjustment method, mobile robot system, and electronic device
WO2023160305A1 (en) * 2022-02-22 2023-08-31 杭州萤石软件有限公司 Travelling direction adjusting method, mobile robot system and electronic device
CN114569004B (en) * 2022-02-22 2023-12-01 杭州萤石软件有限公司 Travel direction adjustment method, mobile robot system and electronic device
WO2023160369A1 (en) * 2022-02-26 2023-08-31 追觅创新科技(苏州)有限公司 Movement direction deviation rectifying method and system, and self-moving robot
CN115039561A (en) * 2022-06-30 2022-09-13 松灵机器人(深圳)有限公司 Mowing method, mowing device, mowing robot and storage medium
WO2024002061A1 (en) * 2022-06-30 2024-01-04 松灵机器人(深圳)有限公司 Mowing method and apparatus, mowing robot, and storage medium
CN115226476A (en) * 2022-07-21 2022-10-25 松灵机器人(深圳)有限公司 Mowing method, mowing device, mowing robot and storage medium

Also Published As

Publication number Publication date
CN110801180B (en) 2022-02-22

Similar Documents

Publication Publication Date Title
CN110801180B (en) Operation method and device of cleaning robot
CN110522359B (en) Cleaning robot and control method of cleaning robot
KR101725060B1 (en) Apparatus for recognizing location mobile robot using key point based on gradient and method thereof
KR101776622B1 (en) Apparatus for recognizing location mobile robot using edge based refinement and method thereof
JP6202544B2 (en) Robot positioning system
JP6732746B2 (en) System for performing simultaneous localization mapping using a machine vision system
KR101776621B1 (en) Apparatus for recognizing location mobile robot using edge based refinement and method thereof
US10060730B2 (en) System and method for measuring by laser sweeps
KR101784183B1 (en) APPARATUS FOR RECOGNIZING LOCATION MOBILE ROBOT USING KEY POINT BASED ON ADoG AND METHOD THEREOF
Akbarzadeh et al. Towards urban 3d reconstruction from video
CN110458897B (en) Multi-camera automatic calibration method and system and monitoring method and system
CN106569489A (en) Floor sweeping robot having visual navigation function and navigation method thereof
Fruh et al. Fast 3D model generation in urban environments
JP6636042B2 (en) Floor treatment method
CN111220148A (en) Mobile robot positioning method, system and device and mobile robot
Almansa-Valverde et al. Mobile robot map building from time-of-flight camera
WO2021146862A1 (en) Indoor positioning method for mobile device, mobile device and control system
TW201823768A (en) Method for creating an environment map for a processing unit
Yuan et al. Fast localization and tracking using event sensors
Duda et al. SRSL: Monocular self-referenced line structured light
CN114245091A (en) Projection position correction method, projection positioning method, control device and robot
KR100906991B1 (en) Method for detecting invisible obstacle of robot
Biber et al. 3d modeling of indoor environments for a robotic security guard
CN111571561B (en) Mobile robot
CN115855086A (en) Indoor scene autonomous reconstruction method, system and medium based on self-rotation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant