CN111700546A - Cleaning method of mobile robot and mobile robot - Google Patents

Cleaning method of mobile robot and mobile robot

Info

Publication number
CN111700546A
CN111700546A (application CN202010590225.2A)
Authority
CN
China
Prior art keywords
obstacle
mobile robot
identified
map corresponding
recognized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010590225.2A
Other languages
Chinese (zh)
Other versions
CN111700546B (en)
Inventor
缪昭侠
闫瑞君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Silver Star Intelligent Technology Co Ltd
Original Assignee
Shenzhen Silver Star Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Silver Star Intelligent Technology Co Ltd filed Critical Shenzhen Silver Star Intelligent Technology Co Ltd
Priority to CN202010590225.2A priority Critical patent/CN111700546B/en
Publication of CN111700546A publication Critical patent/CN111700546A/en
Application granted granted Critical
Publication of CN111700546B publication Critical patent/CN111700546B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a cleaning method for a mobile robot and to a mobile robot. The method comprises the following steps: controlling the mobile robot to clean a current single partition, identifying an obstacle to be identified in the single partition, and acquiring a map corresponding to the obstacle to be identified; when the mobile robot finishes cleaning the single partition, acquiring a map corresponding to the single partition; comparing the map corresponding to the obstacle to be identified with the map corresponding to the single partition to judge whether the obstacle to be identified has moved; and if so, controlling the mobile robot to execute a first supplementary scanning action according to the map corresponding to the obstacle to be identified. The cleaning method and the mobile robot can identify dynamic obstacles in the environment and promptly perform supplementary cleaning on the area a dynamic obstacle has vacated, so that the cleaning coverage of the mobile robot is wider, its working capability is improved, and the user experience is improved.

Description

Cleaning method of mobile robot and mobile robot
Technical Field
The invention relates to the technical field of robots, in particular to a cleaning method of a mobile robot and the mobile robot.
Background
Artificial intelligence technology is developing rapidly, and cleaning robots have an ever greater influence on users' daily lives. At present, cleaning robots can basically meet everyday needs, but some aspects still need improvement. For example, cleaning robots can identify obstacles, but they cannot reliably identify dynamic obstacles such as people or animals moving through the space. When such an obstacle moves away, the floor it previously covered is left uncleaned, resulting in missed cleaning.
Disclosure of Invention
An object of an embodiment of the present invention is to provide a cleaning method for a mobile robot and a mobile robot, so as to solve the technical problem of missed cleaning by the cleaning robot.
In a first aspect, an embodiment of the present invention provides a cleaning method for a mobile robot, where the method includes:
controlling a mobile robot to clean a current single partition, identifying an obstacle to be identified in the single partition, and acquiring a map corresponding to the obstacle to be identified;
when the mobile robot finishes cleaning the single partition, acquiring a map corresponding to the single partition;
comparing the map corresponding to the obstacle to be identified with the map corresponding to the single partition, and judging whether the obstacle to be identified moves;
and if so, controlling the mobile robot to execute a first supplementary scanning action according to the map corresponding to the obstacle to be recognized.
Optionally, the identifying an obstacle to be identified in the single partition includes:
when the mobile robot detects an obstacle in the single partition, controlling the mobile robot to perform an edge action based on the obstacle;
and identifying the obstacle to be identified in the single subarea according to the motion trail of the mobile robot executing the edgewise action.
Optionally, the mobile robot is provided with an edge sensor, the edge sensor is arranged at the right side of the mobile robot,
the identifying the obstacle to be identified in the single partition according to the motion trail of the mobile robot executing the edgewise action comprises the following steps:
judging whether the motion trail of the edgewise action is closed or not;
if yes, obtaining the closing direction of the motion trail, and when the closing direction is the clockwise direction, determining that the obstacle of the mobile robot executing the edgewise action is the obstacle to be identified.
Optionally, the acquiring the closing direction of the motion trajectory includes:
calculating a curve change integral corresponding to the motion trail to obtain the area of a closed region of the motion trail;
when the area is positive, the closing direction is counterclockwise;
when the area is negative, the closing direction is clockwise.
Optionally, the comparing the map corresponding to the obstacle to be identified with the map corresponding to the single partition, and determining whether the obstacle to be identified moves includes:
acquiring the position information of the obstacle to be identified according to the map corresponding to the obstacle to be identified;
searching, in the map corresponding to the single partition, whether the obstacle to be identified is still present at the position indicated by the position information;
and if not, the obstacle to be identified is a dynamic obstacle.
Optionally, the method further comprises:
and recording the stability weight of the obstacle to be identified.
Optionally, the recording the stability weight of the obstacle to be identified includes:
judging whether the mobile robot executes a cleaning action in a current single subarea for the first time;
if so, recording the stability weight of the obstacle to be recognized as 1;
if not, judging whether a historical obstacle mark corresponds to the current position of the obstacle to be recognized; if so, adding 1 to the historically recorded stability weight of the obstacle to be recognized; if no historical obstacle mark exists, setting the stability weight of the obstacle to be recognized to 1.
Optionally, after the mobile robot cleans all of the single partitions, the method further includes:
planning a recharging route, wherein the recharging route comprises an area corresponding to the obstacle to be recognized, the stability weight of which is less than a preset threshold value;
controlling the mobile robot to execute a recharging action according to the recharging route, and judging whether the obstacle to be recognized whose stability weight is smaller than the preset threshold value has moved;
and if so, controlling the mobile robot to sweep the area that the obstacle to be recognized occupied before it moved.
Optionally, the method further comprises:
when the mobile robot has cleaned all of the partitions, acquiring the maps corresponding to all of the partitions;
comparing the map corresponding to the obstacle to be identified with the maps corresponding to all the partitions, and judging whether the obstacle to be identified moves;
and if so, controlling the mobile robot to execute a second supplementary scanning action according to the map corresponding to the obstacle to be recognized.
In a second aspect, an embodiment of the present invention provides a mobile robot, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the above cleaning method of the mobile robot.
Different from the prior art, in the cleaning method of the mobile robot and the mobile robot provided by the embodiments of the present invention, while the mobile robot cleans a current single partition, the obstacle to be identified in that partition is identified and a map corresponding to the obstacle to be identified is acquired; after the mobile robot finishes cleaning the single partition, a map corresponding to the single partition is acquired; finally, the map corresponding to the obstacle to be identified is compared with the map corresponding to the single partition to determine whether the obstacle to be identified has moved, and if it has, the mobile robot is controlled to perform supplementary cleaning on the area the obstacle vacated. The cleaning method and the mobile robot can identify dynamic obstacles in the environment and promptly perform supplementary cleaning on the area a dynamic obstacle has vacated, so that the cleaning coverage of the mobile robot is wider, the working capability of the mobile robot is improved, and the user experience is improved.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which elements having the same reference numerals represent like elements; the figures are not drawn to scale unless otherwise specified.
Fig. 1 is a schematic diagram of an application scenario provided in an embodiment of the present invention;
fig. 2 is a schematic circuit structure diagram of a robot according to an embodiment of the present invention;
fig. 3 is a flowchart of a cleaning method of a mobile robot according to an embodiment of the present invention;
fig. 4 is a flowchart of a method for determining whether the obstacle to be recognized moves in a cleaning method of a mobile robot according to an embodiment of the present invention;
fig. 5 is a flowchart of a cleaning method of a mobile robot according to another embodiment of the present invention;
fig. 6 is a flowchart of a cleaning method of a mobile robot according to another embodiment of the present invention;
fig. 7 is a flowchart of a cleaning method of a mobile robot according to a further embodiment of the present invention;
fig. 8 is a schematic structural diagram of a cleaning device of a mobile robot according to an embodiment of the present invention;
fig. 9 is a schematic circuit structure diagram of a mobile robot according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, provided there is no conflict, the various features of the embodiments of the invention may be combined with each other within the protection scope of the invention. Additionally, although functional modules are divided in the device schematics and logical sequences are shown in the flowcharts, in some cases the steps shown or described may be performed with a different module division or in an order different from that in the flowcharts. The terms "first", "second", "third", and the like used in the present invention do not limit the data or the execution order, but merely distinguish identical or similar items having substantially the same function and effect.
The cleaning method and the cleaning device for the mobile robot provided by the embodiment of the invention can be applied to the application scene shown in fig. 1. The illustrated application scenario includes a robot 10. Wherein the robot 10 may be a mobile robot configured in any suitable shape to perform a particular business function operation, for example, in some embodiments, the mobile robot of the present invention includes, but is not limited to, a cleaning robot, including, but not limited to, a sweeping robot, a dust collection robot, a mopping robot, a floor washing robot, and the like.
The robot 10 may be a robot based on a SLAM system. While the robot 10 moves, its motion trajectory can be estimated with sensors such as a gyroscope. The robot 10 can detect obstacles in the environment as it moves. When the robot 10 detects an obstacle, it can be controlled to perform an edgewise action based on that obstacle, and whether the obstacle is a wall or another kind of obstacle is judged from the motion trajectory of the edgewise action. When the obstacle is not a wall, whether it is a dynamic obstacle is judged by comparing the map acquired during the edgewise action with the map acquired after the robot 10 finishes the work task of the current area, so that whether the obstacle has moved can be determined accurately. If the obstacle has moved, the robot 10 is controlled to perform a supplementary sweep of the area the obstacle occupied before it moved, ensuring that the robot 10 does not miss any area and improving its cleaning capability.
In some embodiments, referring to fig. 2, the robot 10 includes a robot main body 11 (not shown), a laser radar 12, a camera unit 13, a controller 14, a traveling mechanism 15 (not shown), and a sensing unit 16; or the robot 10 employs only one of the laser radar 12 and the camera unit 13. The robot body 11 is a main structure of the robot, and may be made of a corresponding shape and structure and a corresponding manufacturing material (such as hard plastic or metal such as aluminum or iron) according to actual needs of the robot 10, for example, the robot body is configured to be a flat cylinder shape common to sweeping robots.
The traveling mechanism 15 is a structural device provided on the robot main body 11 that gives the robot 10 the ability to travel. The traveling mechanism 15 can be realized with any type of moving means, such as rollers or tracks. The laser radar 12 is used for sensing the obstacle situation in the environment around the robot and obtaining obstacle information. The camera unit 13 is used to capture images and may be any of various types of cameras, such as a wide-angle camera mounted on the main body 11. Generally, only one of the laser radar 12 and the camera unit 13 is fitted, to reduce cost.
In some embodiments, the sensing unit 16 is used to collect some motion parameters of the robot 10 and various types of data of the environment space, and the sensing unit 16 includes various types of suitable sensors, such as a gyroscope, an infrared sensor, an odometer, a magnetic field meter, an accelerometer, a speedometer, and the like.
The controller 14 is an electronic computing core built in the robot main body 11 for executing logical operation steps to realize intelligent control of the robot 10. The controller 14 is connected to the laser radar 12, the camera unit 13 and the sensing unit 16, and is configured to execute a preset algorithm to identify a dynamic obstacle and control the robot 10 to perform supplementary scanning according to data collected by the laser radar 12, the camera unit 13 and the sensing unit 16.
It should be noted that, depending on the task to be performed, one or more other functional modules (such as a water tank or a cleaning device) may be mounted on the robot main body 11 in addition to the above functional modules, and these modules cooperate with each other to perform the corresponding task.
Fig. 3 is a flowchart of a cleaning method for a mobile robot according to an embodiment of the present invention, which can be applied to the robot 10 in the above embodiment, and the method includes:
s101, controlling a mobile robot to clean a current single partition, identifying an obstacle to be identified in the single partition, and acquiring a map corresponding to the obstacle to be identified;
s102, when the mobile robot finishes cleaning the single partition, obtaining a map corresponding to the single partition;
s103, comparing the map corresponding to the obstacle to be recognized with the map corresponding to the single partition, and judging whether the obstacle to be recognized moves;
if the obstacle to be recognized has moved, the following S104 is performed.
And S104, controlling the mobile robot to execute a first supplementary scanning action according to the map corresponding to the obstacle to be recognized.
In this embodiment, the mobile robot is specifically a cleaning robot. During cleaning it works on one single partition at a time and moves on to another partition only after the current one is finished, until all partitions have been cleaned. A partition is any one of the unit areas into which the working area of the mobile robot is divided; for example, one room may be regarded as one partition. Completing a single partition means that the mobile robot has traversed and worked over that unit area. Specifically, the mobile robot first follows the wall for one full circuit inside the partition to obtain the boundary of the whole partition, and then plans a coverage path over the partition. Obviously, the mobile robot does not always follow an actual wall. From the perspective of the mobile robot, the "wall body" includes real walls and obstacles leaning against a wall that cannot be crossed; for such an obstacle, following the wall may mean following the edge of the obstacle placed against the wall. For example, when a mobile robot walking along a wall encounters a cabinet or a mattress leaning against the wall, it cannot enter under the obstacle and cannot continue along the wall itself; it can only walk along the edge of the cabinet or mattress, and after finishing the edge of the obstacle it continues to search for the wall and walk along it. This whole process is regarded as wall following.
Wherein, the identifying the obstacle to be identified in the single partition in S101 includes:
when the mobile robot detects an obstacle in the single partition, controlling the mobile robot to perform an edge action based on the obstacle;
and identifying the obstacle to be identified in the single subarea according to the motion trail of the mobile robot executing the edgewise action.
Wherein, the mobile robot detecting an obstacle includes: when the mobile robot detects a collision, it considers that an obstacle has been detected; or, when the mobile robot detects, using an infrared sensor, a laser radar, a camera unit, or the like, that an obstacle exists within a preset distance, it may be considered to have detected an obstacle.
The edgewise action is a process in which the mobile robot moves forward while always keeping a stable distance from the detected obstacle.
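As a purely illustrative sketch of such an edgewise behaviour (not the patent's own control law), the following proportional controller keeps a side-mounted range-sensor reading near a target distance while driving forward; the target distance, speed, and gain values are assumptions chosen for the example.

```python
# Hedged sketch: one control step of a simple edgewise (edge-following) action.
# The sensor reading, target distance, speed and gain are assumed values.

def edgewise_step(side_distance_m: float,
                  target_m: float = 0.05,
                  forward_speed: float = 0.15,
                  k_p: float = 2.0) -> tuple[float, float]:
    """Return (linear_speed, angular_speed) keeping a stable distance
    to an obstacle sensed on the robot's right side."""
    error = side_distance_m - target_m      # > 0 means too far from the obstacle
    angular_speed = -k_p * error            # steer toward the obstacle when too far
    return forward_speed, angular_speed

# Example: the robot has drifted to 10 cm from the edge (target is 5 cm).
print(edgewise_step(0.10))  # -> (0.15, -0.1): keep moving forward, turn slightly right
```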
The obstacles include fixed obstacles, movable obstacles, and the "wall body". The fixed barrier refers to a barrier which cannot move by its own ability, such as a sofa, a tea table, a bed, a carton, a refrigerator, and the like. It should be noted that the fixed barrier is not movable, but can be changed into a movable barrier with the help of external force, for example, when a person moves away from the carton, the carton is a movable barrier. The movable obstacle refers to an obstacle that can move by its own ability, such as a human, an animal pet, or the like.
Wherein the obstacle to be identified refers to other obstacles except the "wall body". The obstacle to be identified in the single partition can be identified according to the position of the edge sensor and the motion track of the edge action.
For example, the mobile robot is provided with an edge sensor, the edge sensor is arranged on the right side of the mobile robot, and the identifying the obstacle to be identified in the single partition according to the motion track of the mobile robot performing the edge action includes: judging whether the motion trail of the edgewise action is closed or not; if yes, obtaining the closing direction of the motion trail, and when the closing direction is the clockwise direction, determining that the obstacle of the mobile robot executing the edgewise action is the obstacle to be identified.
It is understood that the "wall body" is a boundary of the current partition, and when a track of the mobile robot performing the edgewise action is within the obstacle, the obstacle is the "wall body"; and when the track of the edgewise action executed by the mobile robot is outside the obstacle, the obstacle is the obstacle to be identified. And the relation between the track and the obstacle can be determined according to the position of the edge sensor.
The edge sensor is arranged on the right side of the mobile robot, so the mobile robot is on the left-hand side of the obstacle while performing the edgewise action. Therefore, if the motion trajectory corresponding to the edgewise action is closed and the trajectory moving along the obstacle runs clockwise, the trajectory is outside the obstacle and the obstacle is an obstacle to be identified; if the motion trajectory corresponding to the edgewise action is closed and the trajectory moving along the obstacle runs counterclockwise, the trajectory is inside the obstacle and the obstacle is the "wall body".
The motion trajectory can be obtained from the images captured by the camera unit. Judging whether the motion trajectory is closed includes: when the mobile robot returns to its starting point, the motion trajectory is closed; otherwise it is not. Whether the motion trajectory is closed can be judged by loop-closure detection: for example, the starting position of the mobile robot is recorded, images are captured while the mobile robot moves, and each captured image is compared with the image corresponding to a preset key frame; if the similarity between the two images is high, the currently acquired image is a candidate frame for loop-closure detection, and the candidate frame is then checked against other constraints to determine whether a loop closure is detected.
Note that, in loop-closure detection, returning to the starting point does not require exactly the same point; a certain error is allowed.
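The visual loop-closure check described above can also be approximated geometrically, which is what the simplified sketch below does: the trajectory is treated as closed when its end point returns within a tolerance of its start point. This is a stand-in for illustration only; the 0.2 m tolerance is an assumed value, not taken from the patent.

```python
# Simplified geometric stand-in for the closure check: compare start and end points.
import math

def trajectory_is_closed(points: list[tuple[float, float]],
                         tolerance_m: float = 0.2) -> bool:
    """Treat the trajectory as closed if it returns near its starting point."""
    if len(points) < 3:
        return False
    (x0, y0), (xn, yn) = points[0], points[-1]
    return math.hypot(xn - x0, yn - y0) <= tolerance_m

# A loop around an obstacle that ends 5 cm from where it started.
loop = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.05, 0.0)]
print(trajectory_is_closed(loop))  # True
```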
And when the motion trail is closed, acquiring the closing direction of the motion trail. Specifically, the method comprises the following steps: calculating a curve change integral corresponding to the motion trail to obtain the area of a closed region of the motion trail; when the area is positive, the closing direction is counterclockwise; when the area is negative, the closing direction is clockwise.
Wherein the calculating a curve variation integral corresponding to the motion trajectory to obtain an area of a closed region of the motion trajectory includes:
determining at least two path points according to the motion trajectory, and calculating the curve integral along the at least two path points, wherein every two adjacent path points among the at least two path points determine a segment; for each segment, y = (y_n + y_{n+1})/2 and dx = x_{n+1} - x_n, so the curve variation contributed by the segment is -0.5·(y_{i+1} + y_i)·(x_{i+1} - x_i), where n and i are integers greater than or equal to 0.
Wherein, according to Green's formula:

∮_{L+} (P dx + Q dy) = ∬_D (∂Q/∂x - ∂P/∂y) dx dy

Green's formula relates the double integral over a plane region to the line integral along its closed boundary curve, where L+ denotes the positive direction of the boundary curve of the enclosed region. From the derivation of Green's formula, taking P = -y and Q = 0 gives:

S = ∬_D dx dy = -∮_{L+} y dx   (1)

so that equation (1) is always positive inside the region and equal to the area of the closed region when the boundary is traversed in the positive direction. Therefore, the curve integral only needs to be evaluated along the polygon edges: if the integral is positive, the trajectory follows the positive direction of the boundary curve (i.e. counterclockwise); otherwise it is clockwise, and the absolute value of the curve integral is the area of the closed region.
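The discrete form of this integral is easy to evaluate directly. The sketch below sums -0.5·(y_{i+1}+y_i)·(x_{i+1}-x_i) around the closed trajectory: a positive result indicates a counterclockwise closing direction, a negative result a clockwise one, and the absolute value is the enclosed area. With the right-side edgewise sensor described above, clockwise corresponds to an obstacle to be identified and counterclockwise to the "wall body". Function and variable names are illustrative.

```python
# Signed area of a closed trajectory via the discrete form of -∮ y dx.
# Positive: counterclockwise closing direction; negative: clockwise.

def signed_area(points: list[tuple[float, float]]) -> float:
    total = 0.0
    n = len(points)
    for i in range(n):
        x_i, y_i = points[i]
        x_j, y_j = points[(i + 1) % n]     # wrap around to close the polygon
        total += -0.5 * (y_j + y_i) * (x_j - x_i)
    return total

ccw_square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(signed_area(ccw_square))       #  1.0 -> counterclockwise ("wall body")
print(signed_area(ccw_square[::-1])) # -1.0 -> clockwise (obstacle to be identified)
```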
The above description assumes that the edgewise sensor is provided on the right side of the mobile robot; the edgewise sensor may also be provided on the left side, and the recognition result corresponding to the closing direction of the motion trajectory differs depending on where the edgewise sensor is installed.
After the obstacle to be recognized in the current single partition is determined through step S101, it is further detected whether the obstacle to be recognized has moved; if so, a supplementary sweep is performed of the area the obstacle to be recognized occupied before it moved.
The map corresponding to the obstacle to be recognized obtained in S101 is the map collected when the mobile robot executes the edgewise motion. The map of the mobile robot when performing the edgewise action may be collected by a camera unit of the mobile robot. The map includes an image within a preset range of a motion trajectory when performing the edgewise action, the image including image information of the obstacle to be recognized, such as a position, a shape, a category, a color, and the like of the obstacle to be recognized. The image information of the obstacle to be recognized may be marked on the map.
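For illustration, the information recorded for an obstacle during the edgewise pass could be held in a small record like the sketch below; the field names, types, and default values are assumptions for the example, not the patent's data format.

```python
# Illustrative record of an obstacle observed during an edgewise pass.
from dataclasses import dataclass

@dataclass
class ObstacleRecord:
    position: tuple[float, float]     # position marked on the map
    footprint_cells: frozenset        # grid cells the obstacle occupied
    category: str = "unknown"         # e.g. "carton", "pet"
    color: str = "unknown"
    stability_weight: int = 1         # see the weight-recording step later on

record = ObstacleRecord(position=(3.2, 1.5),
                        footprint_cells=frozenset({(6, 3), (6, 4)}),
                        category="carton")
print(record.category, record.stability_weight)  # carton 1
```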
When the map corresponding to the single partition is acquired in S102, the map corresponding to the partition may also be acquired by a laser radar or a camera unit of the mobile robot, where the map corresponding to the partition includes a point cloud or an image corresponding to the partition, and the point cloud or the image includes information of all obstacles in the partition, such as positions of the obstacles.
As shown in fig. 4, S103, comparing the map corresponding to the obstacle to be recognized with the map corresponding to the single partition, and determining whether the obstacle to be recognized moves includes:
s1031, obtaining the position information of the obstacle to be recognized according to the map corresponding to the obstacle to be recognized;
s1032, searching whether the position information comprises the obstacle to be identified or not in a map corresponding to the single partition;
if not, the following step S1033 is performed.
And S1033, the obstacle to be identified is a dynamic obstacle.
Determining that the obstacle to be identified is a dynamic obstacle means that the obstacle to be identified has moved. In this embodiment, whether the obstacle to be identified has moved can be determined simply by comparing the image acquired while the mobile robot performs the edgewise action with the image of the current partition. Because only two images need to be compared, the amount of computation is greatly reduced and computing resources are saved.
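A minimal sketch of this comparison, assuming (purely for the example) that the partition map is available as a set of occupied grid cells and that the obstacle's edgewise map is reduced to the cells it occupied:

```python
# Hedged sketch: the obstacle is treated as dynamic when none of the cells it
# occupied during the edgewise pass are still occupied in the map acquired
# after the single partition has been cleaned.

def obstacle_has_moved(recorded_cells: frozenset,
                       partition_occupied_cells: set) -> bool:
    return recorded_cells.isdisjoint(partition_occupied_cells)

edgewise_cells = frozenset({(6, 3), (6, 4)})   # where the obstacle was observed
final_map = {(1, 1), (9, 9)}                   # occupied cells after cleaning
print(obstacle_has_moved(edgewise_cells, final_map))  # True -> dynamic obstacle
```

Requiring every recorded cell to be free is a simplification; a real implementation might tolerate partial overlap or re-detect the obstacle by appearance.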
After movement of the obstacle to be recognized is detected, the mobile robot is controlled to perform a supplementary sweep of the area the obstacle occupied before it moved. Executing the first supplementary sweeping action means that the mobile robot sweeps that corresponding area.
After the obstacle to be recognized is recognized, the position of the obstacle to be recognized can be marked on the map, and after the obstacle to be recognized moves, the mobile robot can go to the position to execute the first supplementary scanning action.
The embodiment of the present invention provides a cleaning method for a mobile robot. While the mobile robot cleans a current single partition, the obstacle to be identified in that partition is identified and a map corresponding to the obstacle to be identified is acquired; after the mobile robot finishes cleaning the single partition, a map corresponding to the single partition is acquired; finally, the map corresponding to the obstacle to be identified is compared with the map corresponding to the single partition to determine whether the obstacle to be identified has moved, and if it has, the mobile robot is controlled to perform supplementary cleaning on the area the obstacle vacated. The method provided by the embodiment of the invention can identify dynamic obstacles in the environment and promptly perform a supplementary sweep of the area a dynamic obstacle has vacated, so that the cleaning coverage of the mobile robot is wider, the working capability of the mobile robot is improved, and the user experience is improved.
Fig. 5 is a flowchart of a cleaning method of a mobile robot according to another embodiment of the present invention. Fig. 5 differs from fig. 3 mainly in that the method further comprises:
s105, when the mobile robot cleans a complete partial area, obtaining a map corresponding to the whole area;
s106, comparing the map corresponding to the obstacle to be recognized with the maps corresponding to all the partitions, and judging whether the obstacle to be recognized moves;
if the obstacle to be recognized moves, the following step S107 is executed.
And S107, controlling the mobile robot to execute a second supplementary scanning action according to the map corresponding to the obstacle to be recognized.
In this embodiment, after the mobile robot has completed the work tasks of all the single partitions, it is further determined whether any obstacle to be identified in any of the partitions has moved.
During the process that the mobile robot executes tasks in each single partition, if an obstacle is detected, the mobile robot performs an edge action based on the obstacle to determine whether the obstacle is the obstacle to be identified, and a map containing information of the obstacle to be identified is collected during the process of executing the edge action. The map containing the information of the obstacle to be recognized may be an image containing the information of the obstacle to be recognized, and the image includes a position, a shape, a category, a color, and the like of the obstacle to be recognized.
The detailed process of detecting whether the obstacle to be identified moves according to the maps corresponding to all the partitions and the map containing the obstacle information to be identified may refer to the method embodiment described above.
After detecting that an obstacle to be recognized has moved, the mobile robot is controlled to sweep the area the obstacle occupied before it moved. If several obstacles to be recognized are detected to have moved, the mobile robot may, during the supplementary sweep, first sweep the nearest area, then sweep the area nearest to that one, and so on, until all areas requiring a supplementary sweep have been cleaned.
The embodiment of the invention provides a cleaning method of a mobile robot on the basis of the method embodiment, which further improves the cleaning coverage rate of the mobile robot and can avoid missing an uncleaned area in the environment.
Fig. 6 is a flowchart of a cleaning method for a mobile robot according to another embodiment of the present invention. The main difference between fig. 6 and fig. 5 is that the method further comprises:
and S108, recording the stability weight of the obstacle to be recognized.
It can be understood that the partition map obtained by the mobile robot may be reused many times. The mobile robot may have encountered obstacles during previous work, and the obstacles encountered each time may be marked in the map, so the probability that a currently encountered obstacle will move can be evaluated using the historical marking information of the obstacle. For this purpose a variable, the stability weight, is defined.
In the present embodiment, the stability weight expresses, as a specific numerical value, the probability that the obstacle to be recognized may move. The larger the stability weight, the less likely the obstacle to be recognized is to move; the smaller the stability weight, the more likely it is to move.
The recording of the stability weight of the obstacle to be identified may specifically include: judging whether the mobile robot is performing a cleaning task in the current partition for the first time; if so, recording the stability weight of the obstacle to be recognized as 1; if not, judging whether a historical obstacle mark corresponds to the current position of the obstacle to be recognized; if so, adding 1 to the historically recorded stability weight of the obstacle to be recognized; if no historical obstacle mark exists, setting the stability weight of the obstacle to be recognized to 1.
Whether the mobile robot is performing a cleaning task in the current partition for the first time, that is, whether it is working in the current partition for the first time, can be determined from the system records. If the mobile robot is working in the current partition for the first time, the stability weight may be recorded as 1. It should be noted that, besides 1, the initial stability weight may be any other value, such as 0.1 or 10. If the mobile robot is not working in the current partition for the first time, it is further judged whether the history of the mobile robot includes a historical obstacle mark corresponding to the current position of the obstacle to be recognized, the historical obstacle mark indicating the position information, article information, and so on of an obstacle. If a historical obstacle mark is found, the obstacle to be recognized was at this position in the past and is now detected at the same position again, so it is less likely to move; therefore 1 may be added to its historically recorded stability weight, and the result is the current stability weight of the obstacle to be recognized. Again, any other increment may be used instead of 1, but the increments are accumulated on the previous value and should be the same each time; likewise, different obstacles should be incremented by the same value so that their stability can be compared.
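The update rule described in this paragraph can be sketched as follows; keeping the history as a dictionary keyed by obstacle position is an assumption made for the example.

```python
# Sketch of the stability-weight rule: first cleaning run in the partition, or
# no historical mark at this position -> weight 1; otherwise previous weight + 1.

def update_stability_weight(first_run_in_partition: bool,
                            position: tuple[int, int],
                            history: dict) -> int:
    if first_run_in_partition or position not in history:
        weight = 1
    else:
        weight = history[position] + 1
    history[position] = weight          # mark the obstacle for future runs
    return weight

history: dict = {}
print(update_stability_weight(True, (6, 3), history))   # 1  first run in the partition
print(update_stability_weight(False, (6, 3), history))  # 2  historical mark found here
print(update_stability_weight(False, (9, 9), history))  # 1  no historical mark
```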
In the embodiment of the invention, the stability weight of the identified obstacle to be identified is recorded to preliminarily judge the probability of possible movement of the obstacle to be identified, so that reference can be provided for the identification of the subsequent dynamic obstacle, the accuracy rate of identifying the dynamic obstacle is further improved, and the obstacle which moves can be more accurately found out so as to complement the area which is not scanned in the environment.
Fig. 7 is a flowchart of a cleaning method for a mobile robot according to still another embodiment of the present invention. The main difference between fig. 7 and fig. 6 is that the method further comprises:
s109, planning a recharging route, wherein the recharging route comprises an area corresponding to the obstacle to be recognized, the stability weight of which is smaller than a preset threshold value;
s110, controlling the mobile robot to execute a recharging action according to the recharging route, and judging whether the obstacle to be recognized with the stability weight smaller than a preset threshold value moves or not;
if the obstacle to be recognized moves, the following step S111 is performed.
And S111, controlling the mobile robot to sweep the corresponding area before the obstacle to be recognized moves.
The recharging route is the route along which the mobile robot returns to its charging base after it has cleaned all the partitions; it may also be a route returning to the starting position.
The preset threshold value may be any value on the scale of the stability weight and serves as the critical value for judging whether an obstacle to be recognized is likely to move: an obstacle whose weight is below the preset threshold is more likely to move, and one whose weight is above it is less likely to move.
If there are several areas corresponding to obstacles to be identified whose stability weights are smaller than the preset threshold, the mobile robot may, when planning the recharging route, select the area closest to its current position, then select the next area closest to the selected one, and so on, to obtain the recharging route. Of course, the recharging route may also be planned in other ways.
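One possible way to realise the ordering just described is a greedy nearest-first pass over the areas whose obstacles have low stability weights, ending at the charging base; the threshold value and the straight-line distance metric are assumptions for this sketch.

```python
# Greedy sketch of recharge-route planning: visit the areas of obstacles whose
# stability weight is below the threshold, nearest-first, then the dock.
import math

def plan_recharge_route(current_pos, dock_pos, obstacle_areas, threshold=3):
    """obstacle_areas: list of (area_center, stability_weight) tuples."""
    remaining = [(c, w) for c, w in obstacle_areas if w < threshold]
    route, pos = [], current_pos
    while remaining:
        nearest = min(remaining, key=lambda cw: math.dist(pos, cw[0]))
        remaining.remove(nearest)
        route.append(nearest[0])
        pos = nearest[0]
    return route + [dock_pos]

areas = [((4.0, 1.0), 1), ((1.0, 2.0), 2), ((3.0, 3.0), 5)]   # the last one is stable
print(plan_recharge_route((0.0, 0.0), (5.0, 0.0), areas))
# -> [(1.0, 2.0), (4.0, 1.0), (5.0, 0.0)]
```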
The mobile robot moves according to the recharging route, whether the obstacle to be recognized is still in the area or not is detected again in the area corresponding to each obstacle to be recognized with the stability weight smaller than the preset threshold value, and if the obstacle to be recognized is not in the area, the obstacle to be recognized is determined to move. Whether the obstacle to be recognized is still in the area can be detected through an image shot by the camera unit, and whether the area further comprises the obstacle to be recognized can also be determined through other modes such as laser radar.
In the embodiment of the invention, in the recharging process of the mobile robot, whether the obstacle to be recognized with lower stability weight moves or not is further judged, and if the obstacle to be recognized moves, the area before the obstacle to be recognized moves is subjected to supplementary scanning again. The method improves the accuracy of identifying the dynamic barrier, can realize comprehensive cleaning of all the subareas, improves the cleaning rate and improves the user experience.
Fig. 8 is a schematic structural diagram of a cleaning device of a mobile robot according to an embodiment of the present invention. As shown in fig. 8, the apparatus 20 includes: the device comprises a first processing module 201, a first obtaining module 202, a first judging module 203 and a first complementary scanning module 204.
The first processing module 201 is configured to control the mobile robot to clean a current single partition, identify an obstacle to be identified in the single partition, and acquire a map corresponding to the obstacle to be identified; the first obtaining module 202 is configured to obtain a map corresponding to the single partition when the mobile robot cleans the single partition; the first judging module 203 is configured to compare the map corresponding to the obstacle to be identified with the map corresponding to the single partition, and judge whether the obstacle to be identified moves; the first supplementary scanning module 204 is configured to, if the obstacle to be identified moves, control the mobile robot to execute a first supplementary scanning action according to the map corresponding to the obstacle to be identified.
When identifying the obstacle to be identified, the first processing module 201 specifically includes: when the mobile robot detects an obstacle in the single partition, controlling the mobile robot to perform an edge action based on the obstacle; and identifying the obstacle to be identified in the single subarea according to the motion trail of the mobile robot executing the edgewise action.
The method for recognizing the obstacle to be recognized in the single partition according to the motion track of the mobile robot executing the edgewise action comprises the following steps: judging whether the motion trail of the edgewise action is closed or not; if yes, obtaining the closing direction of the motion trail, and when the closing direction is the clockwise direction, determining that the obstacle of the mobile robot executing the edgewise action is the obstacle to be identified.
Wherein the obtaining the closing direction of the motion trajectory comprises: calculating a curve change integral corresponding to the motion trail to obtain the area of a closed region of the motion trail; when the area is positive, the closing direction is counterclockwise; when the area is negative, the closing direction is clockwise.
The first determining module 203 is specifically configured to: acquiring the position information of the obstacle to be identified according to the map corresponding to the obstacle to be identified; searching whether the position information comprises the obstacle to be identified or not in a map corresponding to the single partition; and if not, the obstacle to be identified is a dynamic obstacle.
In some embodiments, the apparatus 20 further comprises a weight recording module 205, and the weight recording module 205 is configured to record a stability weight of the obstacle to be identified. The weight recording module 205 is specifically configured to: judging whether the mobile robot executes a cleaning action in a current single subarea for the first time; if so, recording the stability weight of the obstacle to be recognized as 1; if not, judging whether the current position of the obstacle to be recognized corresponds to a historical obstacle mark, if so, adding 1 to the weight of the stability of the obstacle to be recognized on the basis of the weight of the historical record; if no historical obstacle markers exist, the stability weight of the obstacle to be identified is 1.
In some embodiments, when the mobile robot cleans all of the single partitions, the apparatus 20 further includes a second processing module 206, where the second processing module 206 is specifically configured to: planning a recharging route, wherein the recharging route comprises an area corresponding to the obstacle to be recognized, the stability weight of which is less than a preset threshold value; controlling the mobile robot to execute a recharging action according to the recharging route, and judging whether the barrier to be recognized with the stability weight smaller than a preset threshold value moves or not; and if so, controlling the mobile robot to sweep the corresponding area before the barrier to be identified moves.
In some embodiments, the apparatus 20 further includes a second sweep module 207, and the second sweep module 207 is specifically configured to: when the mobile robot cleans all the subareas, acquiring maps corresponding to all the subareas; comparing the map corresponding to the obstacle to be identified with the maps corresponding to all the partitions, and judging whether the obstacle to be identified moves; and if so, controlling the mobile robot to execute a second supplementary scanning action according to the map corresponding to the obstacle to be recognized.
The cleaning device of the mobile robot can perform the cleaning method of the mobile robot provided by the embodiment of the invention, and has functional modules and beneficial effects corresponding to the execution method. For technical details that are not described in detail in the embodiment of the cleaning device of the mobile robot, reference may be made to the cleaning method of the mobile robot provided in the embodiment of the present invention.
Fig. 9 is a schematic circuit structure diagram of a mobile robot according to an embodiment of the present invention. Wherein the mobile robot may be any type of cleaning robot. As shown in fig. 9, the mobile robot includes one or more processors 31 and a memory 32. In fig. 9, one processor 31 is taken as an example.
The processor 31 and the memory 32 may be connected by a bus or other means, and fig. 9 illustrates the connection by a bus as an example.
The memory 32 is a non-volatile computer-readable storage medium and can be used for storing non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules corresponding to the cleaning method of the mobile robot in the embodiment of the present invention. The processor 31 executes various functional applications and data processing of the cleaning device of the mobile robot by running the nonvolatile software program, instructions and modules stored in the memory 32, that is, the functions of the cleaning method of the mobile robot and the various modules or units of the device embodiments provided by the above method embodiments are realized.
The memory 32 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory 32 may optionally include memory located remotely from the processor 31, and these remote memories may be connected to the processor 31 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The program instructions/modules are stored in the memory 32 and, when executed by the one or more processors 31, perform the cleaning method of the mobile robot in any of the method embodiments described above.
Embodiments of the present invention also provide a non-volatile computer storage medium, where the computer storage medium stores computer-executable instructions, which are executed by one or more processors, such as one processor 31 in fig. 9, so that the one or more processors may execute the cleaning method of the mobile robot in any of the above method embodiments.
An embodiment of the present invention further provides a computer program product, which includes a computer program stored on a non-volatile computer-readable storage medium, the computer program including program instructions that, when executed by the mobile robot, cause the mobile robot to perform any one of the cleaning methods of the mobile robot.
In summary, in the cleaning method of the mobile robot of the present invention, after the coverage of the current single partition is completed and before entering the next single partition, it is judged whether the obstacle to be recognized has moved; if so, the area the obstacle occupied before it moved is supplementarily swept; if not, the next single partition is covered and the same supplementary sweeping procedure is executed there. After the last single partition has been processed in this way, it is judged whether any obstacle to be identified in any of the single partitions has moved, and if so, the corresponding area is supplementarily swept. If nothing has moved, cleaning is finished.
The above-described embodiments of the apparatus or device are merely illustrative, wherein the unit modules described as separate parts may or may not be physically separate, and the parts displayed as module units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network module units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general hardware platform, and certainly can also be implemented by hardware. Based on such understanding, the above technical solutions substantially or contributing to the related art may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; within the idea of the invention, also technical features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the invention as described above, which are not provided in detail for the sake of brevity; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A cleaning method of a mobile robot, characterized in that the method comprises:
controlling a mobile robot to clean a current single partition, identifying an obstacle to be identified in the single partition, and acquiring a map corresponding to the obstacle to be identified;
when the mobile robot finishes cleaning the single partition, acquiring a map corresponding to the single partition;
comparing the map corresponding to the obstacle to be identified with the map corresponding to the single partition, and judging whether the obstacle to be identified moves;
and if so, controlling the mobile robot to execute a first supplementary scanning action according to the map corresponding to the obstacle to be recognized.
2. The method of claim 1, wherein the identifying an obstacle to be identified in the single partition comprises:
when the mobile robot detects an obstacle in the single partition, controlling the mobile robot to perform an edge action based on the obstacle;
and identifying the obstacle to be identified in the single subarea according to the motion trail of the mobile robot executing the edgewise action.
3. The method according to claim 2, characterized in that the mobile robot is provided with an edgewise sensor, which is provided at the right side of the mobile robot,
the identifying the obstacle to be identified in the single partition according to the motion trail of the mobile robot executing the edgewise action comprises the following steps:
judging whether the motion trail of the edgewise action is closed or not;
if yes, obtaining the closing direction of the motion trail, and when the closing direction is the clockwise direction, determining that the obstacle of the mobile robot executing the edgewise action is the obstacle to be identified.
4. The method of claim 3, wherein the obtaining the closing direction of the motion trajectory comprises:
calculating a curve change integral corresponding to the motion trail to obtain the area of a closed region of the motion trail;
when the area is positive, the closing direction is counterclockwise;
when the area is negative, the closing direction is clockwise.
5. The method according to any one of claims 1 to 4, wherein the comparing the map corresponding to the obstacle to be identified with the map corresponding to the single partition to determine whether the obstacle to be identified moves comprises:
acquiring the position information of the obstacle to be identified according to the map corresponding to the obstacle to be identified;
searching, in the map corresponding to the single partition, whether the obstacle to be identified is still present at the position indicated by the position information;
and if not, determining that the obstacle to be identified is a dynamic obstacle.
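The lookup in claim 5 can be expressed as a set intersection on the same grid representation used in the sketch after claim 1: if the cells recorded for the obstacle are (almost) all free in the finished partition map, the obstacle is treated as dynamic. The overlap ratio below is an assumed tuning parameter, not something recited in the claim.

```python
from typing import Set, Tuple

Cell = Tuple[int, int]  # grid cell (row, col), as in the sketch after claim 1

def is_dynamic_obstacle(obstacle_cells: Set[Cell],
                        occupied_after_cleaning: Set[Cell],
                        min_overlap: float = 0.3) -> bool:
    """Return True when the obstacle's recorded position is no longer occupied
    in the map of the finished partition, i.e. the obstacle has moved."""
    if not obstacle_cells:
        return False
    still_occupied = len(obstacle_cells & occupied_after_cleaning)
    return still_occupied / len(obstacle_cells) < min_overlap
```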
6. The method according to any one of claims 1 to 4, further comprising:
and recording the stability weight of the obstacle to be identified.
7. The method of claim 6, wherein the recording the stability weight of the obstacle to be identified comprises:
judging whether the mobile robot is executing a cleaning action in the current single partition for the first time;
if so, recording the stability weight of the obstacle to be identified as 1;
if not, judging whether the current position of the obstacle to be identified corresponds to a historical obstacle mark; if so, adding 1 to the stability weight of the obstacle to be identified on the basis of the historically recorded weight; and if no historical obstacle mark exists, recording the stability weight of the obstacle to be identified as 1.
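One plausible bookkeeping for claim 7 keeps a dictionary of historical obstacle marks keyed by position: the weight is 1 on the first cleaning of the partition or when no mark exists at the obstacle's current position, and otherwise the historical weight plus 1. The position key and dictionary layout are assumptions of this sketch.

```python
from typing import Dict, Tuple

Position = Tuple[int, int]  # e.g. the centre cell of the obstacle's footprint

def update_stability_weight(history: Dict[Position, int],
                            position: Position,
                            first_cleaning_of_partition: bool) -> int:
    """Record and return the stability weight of the obstacle at `position`."""
    if first_cleaning_of_partition or position not in history:
        weight = 1                      # no usable historical obstacle mark
    else:
        weight = history[position] + 1  # seen here before: increase stability
    history[position] = weight
    return weight

# Example: the same obstacle found at the same spot on three consecutive runs
# ends up with a stability weight of 3.
marks: Dict[Position, int] = {}
for first_run in (True, False, False):
    w = update_stability_weight(marks, (12, 7), first_run)
print(w)  # 3
```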
8. The method of claim 6, wherein when the mobile robot has cleaned all of the single partitions, the method further comprises:
planning a recharging route, wherein the recharging route comprises the area corresponding to the obstacle to be identified whose stability weight is less than a preset threshold;
controlling the mobile robot to execute a recharging action according to the recharging route, and judging whether the obstacle to be identified whose stability weight is less than the preset threshold has moved;
and if so, controlling the mobile robot to sweep the area that the obstacle to be identified occupied before it moved.
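Claim 8 ties the recharging route to the obstacles that have not yet proven stable: only obstacles whose stability weight is below the threshold are revisited on the way back to the dock, and any of them that has left its recorded position has that vacated area swept before docking. The sketch below only selects the areas to revisit and re-sweep; the actual route planning is omitted, and the threshold value and the dict/set bookkeeping are assumptions of this illustration.

```python
from typing import Dict, List, Set, Tuple

Cell = Tuple[int, int]

def resweep_areas_on_recharge(footprints: Dict[int, Set[Cell]],
                              stability_weights: Dict[int, int],
                              occupied_now: Set[Cell],
                              weight_threshold: int = 3) -> List[Set[Cell]]:
    """Among obstacles with a stability weight below the threshold (the only
    ones routed past on the recharging run), return the footprints that are no
    longer occupied and therefore need a sweep before the robot docks."""
    areas = []
    for obstacle_id, footprint in footprints.items():
        if stability_weights.get(obstacle_id, 1) >= weight_threshold:
            continue                           # stable obstacle: not revisited
        if footprint and not (footprint & occupied_now):
            areas.append(footprint)            # obstacle moved: sweep its old spot
    return areas
```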
9. The method according to any one of claims 1 to 4, further comprising:
when the mobile robot has cleaned all of the partitions, acquiring a map corresponding to the complete area;
comparing the map corresponding to the obstacle to be identified with the maps corresponding to all of the partitions, and judging whether the obstacle to be identified has moved;
and if so, controlling the mobile robot to execute a second supplementary sweeping action according to the map corresponding to the obstacle to be identified.
10. A mobile robot, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 9.
CN202010590225.2A 2020-06-24 2020-06-24 Cleaning method of mobile robot and mobile robot Active CN111700546B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010590225.2A CN111700546B (en) 2020-06-24 2020-06-24 Cleaning method of mobile robot and mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010590225.2A CN111700546B (en) 2020-06-24 2020-06-24 Cleaning method of mobile robot and mobile robot

Publications (2)

Publication Number Publication Date
CN111700546A true CN111700546A (en) 2020-09-25
CN111700546B CN111700546B (en) 2022-09-13

Family

ID=72542757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010590225.2A Active CN111700546B (en) 2020-06-24 2020-06-24 Cleaning method of mobile robot and mobile robot

Country Status (1)

Country Link
CN (1) CN111700546B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109562519A (en) * 2016-08-03 2019-04-02 Lg电子株式会社 Mobile robot and its control method
CN109891348A (en) * 2016-11-09 2019-06-14 东芝生活电器株式会社 Autonomous body
CN106863305A (en) * 2017-03-29 2017-06-20 赵博皓 A kind of sweeping robot room map creating method and device
CN107550399A (en) * 2017-08-17 2018-01-09 北京小米移动软件有限公司 timing cleaning method and device
CN108445878A (en) * 2018-02-28 2018-08-24 北京奇虎科技有限公司 A kind of obstacle processing method and sweeping robot for sweeping robot
KR20200027068A (en) * 2018-08-27 2020-03-12 엘지전자 주식회사 Robot cleaner and method for controlling the same
CN110403528A (en) * 2019-06-12 2019-11-05 深圳乐动机器人有限公司 A kind of method and system improving cleaning coverage rate based on clean robot
CN110772178A (en) * 2019-09-25 2020-02-11 深圳市无限动力发展有限公司 Sweeping method and device of sweeper, computer equipment and storage medium

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210200234A1 (en) * 2019-12-27 2021-07-01 Samsung Electronics Co., Ltd. Electronic apparatus and method of controlling thereof
US11874668B2 (en) * 2019-12-27 2024-01-16 Samsung Electronics Co., Ltd. Electronic apparatus and method of controlling thereof
CN112168066A (en) * 2020-09-30 2021-01-05 深圳市银星智能科技股份有限公司 Control method and device for cleaning robot, cleaning robot and storage medium
CN112180931A (en) * 2020-09-30 2021-01-05 小狗电器互联网科技(北京)股份有限公司 Sweeping path planning method and device of sweeper and readable storage medium
CN112180931B (en) * 2020-09-30 2024-04-12 北京小狗吸尘器集团股份有限公司 Cleaning path planning method and device of sweeper and readable storage medium
CN112826373B (en) * 2021-01-21 2022-05-06 深圳乐动机器人有限公司 Cleaning method, device, equipment and storage medium of cleaning robot
CN112826373A (en) * 2021-01-21 2021-05-25 深圳乐动机器人有限公司 Cleaning method, device, equipment and storage medium of cleaning robot
CN112971615A (en) * 2021-02-03 2021-06-18 追创科技(苏州)有限公司 Control method of intelligent cleaning equipment and intelligent cleaning equipment
CN113219985A (en) * 2021-05-27 2021-08-06 九天创新(广东)智能科技有限公司 Road planning method and device for sweeper and sweeper
CN113670292B (en) * 2021-08-10 2023-10-20 追觅创新科技(苏州)有限公司 Map drawing method and device, sweeper, storage medium and electronic device
CN113670292A (en) * 2021-08-10 2021-11-19 追觅创新科技(苏州)有限公司 Map drawing method and device, sweeper, storage medium and electronic device
WO2023019922A1 (en) * 2021-08-20 2023-02-23 北京石头创新科技有限公司 Navigation method and self-propelled apparatus
WO2023025023A1 (en) * 2021-08-23 2023-03-02 追觅创新科技(苏州)有限公司 Cleaning method and apparatus of mobile robot, and storage medium and electronic apparatus
CN113616119A (en) * 2021-08-23 2021-11-09 追觅创新科技(苏州)有限公司 Cleaning method and device for mobile robot, storage medium, and electronic device
CN113876246A (en) * 2021-08-31 2022-01-04 洁博士南京环保设备有限公司 Control method for visual obstacle avoidance of mechanical arm of intelligent cleaning robot
CN113855835A (en) * 2021-09-27 2021-12-31 丰疆智能(深圳)有限公司 Disinfection method and device, storage medium and disinfection robot
CN114569001A (en) * 2022-03-16 2022-06-03 北京石头世纪科技股份有限公司 Intelligent mobile device
CN114569001B (en) * 2022-03-16 2023-10-20 北京石头世纪科技股份有限公司 Intelligent mobile device
WO2023217190A1 (en) * 2022-05-10 2023-11-16 美智纵横科技有限责任公司 Cleaning method, cleaning apparatus, cleaning device, and storage medium
CN115040038A (en) * 2022-06-22 2022-09-13 杭州萤石软件有限公司 Robot control method and device and robot
CN115153350A (en) * 2022-07-14 2022-10-11 深圳拓邦股份有限公司 Supplementary sweeping method and device of sweeping robot, storage medium and sweeping robot

Also Published As

Publication number Publication date
CN111700546B (en) 2022-09-13

Similar Documents

Publication Publication Date Title
CN111700546B (en) Cleaning method of mobile robot and mobile robot
US10102429B2 (en) Systems and methods for capturing images and annotating the captured images with information
CN111625007A (en) Method for identifying dynamic obstacle and mobile robot
US10500722B2 (en) Localization and mapping using physical features
CN113110457B (en) Autonomous coverage inspection method for intelligent robot in indoor complex dynamic environment
CN111399516B (en) Robot path planning method and device and robot
JP6977093B2 (en) How to control a mobile robot
US20230157506A1 (en) Trajectory-based localization and mapping
CN108481327A (en) A kind of positioning device, localization method and the robot of enhancing vision
US20210213619A1 (en) Robot and control method therefor
CN113532461B (en) Robot autonomous obstacle avoidance navigation method, equipment and storage medium
CN111714028A (en) Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium
CN112015186A (en) Robot path planning method and device with social attributes and robot
CN116576857A (en) Multi-obstacle prediction navigation obstacle avoidance method based on single-line laser radar
CN208289901U (en) A kind of positioning device and robot enhancing vision
CN112033423B (en) Robot path planning method and device based on road consensus and robot
CN111700553B (en) Obstacle avoidance method, device, robot and storage medium
CN112540613A (en) Method and device for searching recharging seat position and mobile robot
CN111444852A (en) Loop detection method and device and robot
CN109512340B (en) Control method of cleaning robot and related equipment
CN111998853A (en) AGV visual navigation method and system
KR102467990B1 (en) Robot cleaner
CN114967698A (en) Cleaning method, cleaning device, electronic apparatus, and storage medium
JP7354528B2 (en) Autonomous mobile device, method and program for detecting dirt on lenses of autonomous mobile device
KR20180048088A (en) Robot cleaner and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000 1701, building 2, Yinxing Zhijie, No. 1301-72, sightseeing Road, Xinlan community, Guanlan street, Longhua District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Yinxing Intelligent Group Co.,Ltd.

Address before: 518000 building A1, Yinxing hi tech Industrial Park, Guanlan street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: Shenzhen Silver Star Intelligent Technology Co.,Ltd.

GR01 Patent grant