CN110362079B - Traversal control method and chip of robot and cleaning robot - Google Patents

Traversal control method and chip of robot and cleaning robot

Info

Publication number
CN110362079B
CN110362079B (application CN201910623713.6A)
Authority
CN
China
Prior art keywords
robot
point
parallel
edge
nth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910623713.6A
Other languages
Chinese (zh)
Other versions
CN110362079A (en)
Inventor
肖刚军
黄泰明
许登科
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN201910623713.6A
Publication of CN110362079A
Application granted
Publication of CN110362079B
Legal status: Active (current)
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a traversal control method for a robot, a chip, and a cleaning robot. In the traversal control method, the robot delineates a region to be traversed by combining positioning edges with edgewise (wall-following) paths; after traversing that region, it delineates the next region in the same way and continues, and so on, until all regions have been traversed. In this traversal mode, the area to be cleaned is divided, with the robot's starting point as the center, into several small regions, which are traversed one at a time. In addition, the robot avoids missing areas with narrow entrances along the way.

Description

Traversal control method and chip of robot and cleaning robot
Technical Field
The invention relates to the field of intelligent robots, in particular to a robot traversal control method, a chip and a cleaning robot.
Background
Existing robot SLAM algorithms mainly adopt three ground-traversal modes. In the first, the robot traverses grid regions one by one directly, completing the global traversal once all grid regions have been covered. In the second, the robot first performs a global edgewise run to delineate the overall boundary and then traverses the grid regions one by one. In the third, the robot delineates one grid region, traverses it, and then delineates and traverses the next. Each of these traversal modes has advantages and disadvantages. Because home environments are complex and diverse, the existing modes cannot meet every requirement, and the market needs more robots with different traversal modes.
Disclosure of Invention
The invention provides a robot traversal control method, a chip, and a cleaning robot that can improve the robot's traversal efficiency. The specific technical scheme of the invention is as follows:
a robot traversal control method comprises the following steps. Step S1: the robot takes its current position as the origin and moves straight forward along the Nth direction; upon detecting a wall or a wall-adjacent obstacle, it determines the current position as the Nth positioning point and proceeds to step S2, the path from the origin to the Nth positioning point being the Nth positioning edge. Step S2: the robot returns to the origin and then proceeds to step S3. Step S3: the robot turns in place at the origin and judges whether the sum of the angles rotated in the preset rotation direction, counted from the first positioning edge, equals 360 degrees; if so, it proceeds to step S8, otherwise to step S4. Step S4: the robot turns to the (N+1)th direction, the included angle between the (N+1)th direction and the Nth direction being the Nth included angle, which is smaller than 180 degrees, and then proceeds to step S5. Step S5: the robot walks along the (N+1)th direction; upon detecting a wall or a wall-adjacent obstacle, it determines the current position as the (N+1)th positioning point, the walking path from the origin to the (N+1)th positioning point being the (N+1)th positioning edge, and then proceeds to step S6. Step S6: the robot turns at the (N+1)th positioning point and walks edgewise toward the Nth positioning point; the edgewise path is the Nth edgewise edge, which lies within the range of the Nth included angle, and the robot then proceeds to step S7. Step S7: when the robot reaches the Nth positioning point, the Nth delineated region, enclosed by the Nth positioning edge, the Nth edgewise edge and the (N+1)th positioning edge, has been delineated; the robot traverses this region, increments N by 1 after the traversal is finished, and returns to step S2. Step S8: the robot stops rotating, walks along the current positioning edge to the corresponding positioning point, and then walks edgewise toward the previous adjacent positioning point; when it reaches that point, the remaining region enclosed by the current edgewise path and the positioning edges connected to its two ends has been delineated, the robot traverses the remaining region, and the traversal of all regions is finally completed. Here N is a natural number.
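For concreteness, the loop structure of steps S1 to S8 can be summarized in a small control sketch. The following Python sketch is ours, not part of the patent; the Robot class and its motion primitives (drive_until_wall, return_to_origin, follow_edge_to, traverse_region) are hypothetical stand-ins for the robot's actual sensing and motion control.

```python
# A minimal sketch, assuming a hypothetical Robot interface; the stubs below
# would be backed by real sensors and motion control in an implementation.

class Robot:
    """Stub robot exposing the primitives the traversal loop relies on."""

    def drive_until_wall(self, heading_deg):
        """Drive straight along heading_deg until a wall or wall-adjacent obstacle
        is detected; return the positioning point reached."""
        ...

    def return_to_origin(self):
        """Navigate back to the origin (the starting position)."""
        ...

    def follow_edge_to(self, positioning_point):
        """Wall-follow from the current position to positioning_point and return
        the delineated region that has just been closed."""
        ...

    def traverse_region(self, region):
        """Cover the delineated region, e.g. with the parallel-path plan."""
        ...


def traverse_all(robot, included_angle_deg=90.0):
    """Delineate and traverse fan-shaped regions around the origin (steps S1-S8).
    Positive heading increments correspond to the preset rotation direction."""
    heading = 0.0                                      # the first direction
    prev_point = robot.drive_until_wall(heading)       # S1: first positioning edge
    while True:
        robot.return_to_origin()                       # S2
        heading += included_angle_deg                  # S3/S4: pivot by the Nth included angle
        if heading >= 360.0:                           # a full circle has been swept -> S8
            break
        next_point = robot.drive_until_wall(heading)   # S5: (N+1)th positioning edge
        region = robot.follow_edge_to(prev_point)      # S6: Nth edgewise edge closes the region
        robot.traverse_region(region)                  # S7: traverse the Nth delineated region
        prev_point = next_point
    # S8: walk out along the first positioning edge, edge-follow toward the previous
    # positioning point, and traverse the remaining region.
    robot.drive_until_wall(0.0)
    remaining = robot.follow_edge_to(prev_point)
    robot.traverse_region(remaining)
```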
Further, the value of the Nth included angle is any angle greater than or equal to 60° and less than or equal to 90°.
Further, if an isolated obstacle is detected while the robot walks from the origin toward a positioning point, the following steps are executed: the robot walks edgewise around the isolated obstacle along the side facing the delineated region, and when it returns to the straight line from the origin toward the positioning point, it continues moving forward toward the positioning point along that line.
Further, the robot traversing the Nth delineated region in step S7 specifically comprises: the robot starts from the Nth positioning point and walks along the Nth edgewise edge toward the (N+1)th positioning point to a first parallel point, where the distance between the straight line through the first parallel point parallel to the Nth direction and the straight line containing the Nth positioning edge is a preset distance; the robot turns at the first parallel point to the direction parallel to the Nth direction and moves straight along that direction, the walking path being the first parallel edge; when the robot reaches the (N+1)th positioning edge, a wall or a wall-adjacent obstacle, it takes the current position as a second parallel point and walks along the (N+1)th positioning edge, the wall or the wall-adjacent obstacle toward the (N+1)th positioning point to a third parallel point, where the distance between the straight line through the third parallel point parallel to the Nth direction and the straight line containing the first parallel edge is the preset distance; the robot turns at the third parallel point to the direction parallel to the Nth direction and moves straight, the walking path being the second parallel edge; when the robot reaches the Nth edgewise edge, it takes the current position as a fourth parallel point and walks along the Nth edgewise edge toward the (N+1)th positioning point to a fifth parallel point, where the straight line through the fifth parallel point is parallel to the Nth direction and its distance from the straight line containing the second parallel edge is the preset distance; the robot turns at the fifth parallel point to the direction parallel to the Nth direction and moves straight, the walking path being the third parallel edge; and so on, until the robot completes the traversal of the Nth delineated region in this parallel-path-planning walking manner.
Further, if an isolated obstacle is detected while the robot walks along a parallel edge, the following steps are executed: the robot walks edgewise around the isolated obstacle along the side facing the Nth positioning edge, and when it returns to the straight line containing the parallel edge, it continues walking along the parallel edge.
A chip stores program instructions, and the program instructions are used to control a robot to execute the above traversal control method of the robot.
A cleaning robot comprises a main control chip, wherein the main control chip is the above chip.
According to the traversal control method of the robot, the robot delineates a region to be traversed by combining positioning edges with edgewise paths; after traversing that region, it delineates the next region in the same way and continues, and so on, until all regions have been traversed. In this traversal mode, the area to be cleaned is divided, with the robot's starting point as the center, into several small regions, which are traversed one at a time. In addition, the robot avoids missing areas with narrow entrances along the way.
Drawings
Fig. 1 is a schematic traversal diagram of a robot according to the present invention.
Fig. 2 is a schematic view of the robot walking along the positioning edge according to the present invention.
Fig. 3 is a schematic view illustrating the detection of an isolated obstacle when the robot of the present invention walks along parallel sides.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the accompanying drawings in the embodiments of the present invention. It should be understood that the following specific examples are illustrative only and are not intended to limit the invention. In the following description, specific details are given to provide a thorough understanding of the embodiments. However, it will be understood by those of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits may be shown in block diagrams in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, structures and techniques may not be shown in detail in order not to obscure the embodiments.
The first parallel point, the second parallel point, the third parallel point and so on mentioned in the following embodiments may be referred to collectively as parallel points; likewise the first, second, third, etc. parallel edges may be referred to collectively as parallel edges. In general, whenever the ordinals first, second, third, etc. precede a term, the bare term may be used as the collective name.
The wall-adjacent obstacles referred to in the following embodiments are objects placed against a wall, such as wardrobes, television cabinets and sofas. An isolated obstacle is an object not placed against a wall, around whose outer edge the robot can walk a full circle, such as a tea table or dining table in the middle of a living room.
The robot is an intelligent robot capable of moving autonomously; it can perform positioning, obstacle-type determination, obstacle avoidance and other behaviors using a camera, a laser radar and other sensors. Traversal means that the robot walks over the entire floor of the area in which it is located. If the robot is a cleaning robot, it can perform cleaning operations such as dust collection while walking, so that completing the traversal of an area completes the cleaning of that area. Cleaning is not limited to sweeping and dust collection; it may also refer to mopping, polishing, waxing and similar functions. That is, the robot can sweep, vacuum, mop, polish or wax the floor according to the traversal control method.
The traversal control method specifically comprises the following steps. In step S1, the robot takes its current position as the origin and moves straight forward along the first direction; after detecting a wall or a wall-adjacent obstacle through its camera and obstacle sensor, it determines the current position as the first positioning point. The path walked from the origin to the first positioning point is the first positioning edge, and the first direction is the direction of the straight line from the origin toward the first positioning point. When the robot reaches the first positioning point, the process proceeds to step S2. In step S2, the robot returns from the first positioning point to the origin and then proceeds to step S3. In step S3, the robot turns in place at the origin and judges whether the sum of the angles rotated in the preset rotation direction, counted from the first positioning edge, equals 360°; the preset rotation direction can be set according to specific design requirements, either clockwise or counterclockwise. If the sum of the rotated angles equals 360°, the division of the whole area around the origin is complete and only the last small region remains to be traversed, so the process proceeds to step S8 for the final traversal. If the sum has not reached 360°, the robot has only begun, or only completed part of, the division and traversal of the regions around the origin, and step S4 must be executed to continue dividing and traversing. In step S4, the robot turns to a second direction; the included angle between the second direction and the first direction is the first included angle, which is smaller than 180° and may be set to, for example, 60° or 90°. The process then proceeds to step S5. In step S5, the robot walks along the second direction; after detecting a wall or a wall-adjacent obstacle through its camera and obstacle sensor, it determines the current position as the second positioning point. The path walked from the origin to the second positioning point is the second positioning edge, and the second direction is the direction of the straight line from the origin toward the second positioning point. When the robot reaches the second positioning point, the process proceeds to step S6. In step S6, the robot turns at the second positioning point and walks edgewise toward the first positioning point; the edgewise path is the first edgewise edge, which lies within the range of the first included angle, and the process then proceeds to step S7. In step S7, when the robot reaches the first positioning point, the first delineated region, enclosed by the first positioning edge, the first edgewise edge and the second positioning edge, has been delineated, and the robot traverses it. After the traversal, the robot returns to the origin and then turns to the third direction, the included angle between the third direction and the second direction being the second included angle. Determining the included angle by first returning to the origin and then turning makes the divided angles, and therefore the divided regions, more accurate, and avoids repeated or missed sweeping of the boundary areas.
It should be noted that the second included angle equals the first included angle, and the angles are measured in the same direction: for example, if the first included angle is measured clockwise from the first positioning edge to the second positioning edge, then the second included angle is measured clockwise from the second positioning edge to the third positioning edge, the third included angle clockwise from the third positioning edge to the fourth positioning edge, and so on. The robot walks forward along the third direction; after detecting a wall or a wall-adjacent obstacle, it determines the current position as the third positioning point. The path walked from the origin to the third positioning point is the third positioning edge, and the third direction is the direction of the straight line from the origin toward the third positioning point. After reaching the third positioning point, the robot turns there and walks edgewise toward the second positioning point; the edgewise path is the second edgewise edge, which lies within the range of the second included angle. When the robot reaches the second positioning point, the second delineated region, enclosed by the second positioning edge, the second edgewise edge and the third positioning edge, has been delineated, and the robot traverses it. By analogy, when the sum of the angles the robot has rotated at the origin in the preset rotation direction equals 360°, the process proceeds to step S8. In step S8, the robot has rotated a full turn back to the first direction; it stops rotating and walks along the first positioning edge to the first positioning point. The robot turns at the first positioning point and walks edgewise toward the adjacent previous positioning point (not the second positioning point). When the robot reaches that point, the remaining region enclosed by the current edgewise path and the positioning edges connected to its two ends has been delineated; the robot traverses this remaining region, finally completing the traversal of all regions. Here N is a natural number such as one, two, three or four. According to this traversal control method, the robot delineates each region to be traversed (the delineated region) by combining positioning edges with edgewise paths; after traversing a region, it delineates the next one in the same way and continues, and so on, until all regions have been traversed. In this traversal mode, the area to be cleaned is divided, with the robot's starting point as the center, into several small regions. Unlike standard grid regions, these small regions are roughly fan-shaped, their exact shape depending on the edge profile of the whole area; if the included angle between adjacent positioning edges is set reasonably, the robot only needs to divide and traverse these few small regions to complete the traversal of the whole area.
Compared with dividing the area into grid regions, this approach is more flexible and reduces missed coverage. For example, when the entrance to a certain region is narrow, a grid-region traversal may fail to detect the entrance because of walking errors and the like, so the corresponding region is missed; the edgewise approach effectively avoids this problem. In addition, combining positioning edges with edgewise walking delineates a traversal region of more suitable extent, which is more reasonable than performing a full global edgewise run before cleaning: a robot that starts with a full edgewise run spends a long time on it, which easily leads users to misunderstand that the robot is doing nothing. This method is a compromise between traversing grid regions directly and performing a full edgewise run first; it avoids such misunderstanding while achieving a better traversal effect, so the user experience is better. The robot also avoids missing areas with narrow entrances along the way.
As one embodiment, taking the preset angle as 90 degrees, as shown in Fig. 1, the largest outer frame HGEC is the area to be cleaned, the circle in the middle represents the sweeping robot, and the robot's current position is point P. The sweeping robot starts cleaning from point P and moves forward to point A, where it detects a wall and returns to point P. Back at point P the robot is facing point D, so it turns left 90 degrees to face point B; if instead it had returned facing point A, it would turn right 90 degrees to face point B. Either way, the included angle between the robot's forward direction PB and the direction PA is 90 degrees. The robot then moves straight forward, and when it reaches point B it detects the wall again. The robot turns left at point B and walks edgewise using the edge sensor on its right side; the edge sensor may be an infrared sensor, an ultrasonic sensor or a similar sensing device. The robot walks edgewise from point B through point C and finally to point A. At this point the robot has delineated the first delineated region PBCA, which it then traverses. After the traversal, the robot returns to point P, turns toward point D and moves straight toward it; the included angle between PD and PB is also the preset 90 degrees. When the robot reaches point D and detects the wall, it turns left and walks edgewise using the edge sensor on its right side, passing point E and reaching point B. The robot has now delineated the second delineated region PDEB, which it traverses. By analogy, the robot delineates and traverses the third delineated region PFGD and the fourth delineated region PAHF in turn, finally completing the traversal cleaning of all regions.
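For orientation, with the preset included angle set to 90° the area is divided into 360/90 = 4 fan-shaped regions, matching the four delineated regions PBCA, PDEB, PFGD and PAHF in Fig. 1; at the 60° lower end of the preferred range there would be 6. A tiny illustrative calculation (the helper name is ours, not the patent's):

```python
import math

def sector_count(included_angle_deg: float) -> int:
    """Number of delineated regions needed to sweep the full 360 degrees
    around the origin for a given included angle between positioning edges."""
    return math.ceil(360.0 / included_angle_deg)

print(sector_count(90.0))  # 4 regions, as in Fig. 1 (PBCA, PDEB, PFGD, PAHF)
print(sector_count(60.0))  # 6 regions at the lower end of the preferred range
```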
Preferably, the value of the Nth included angle is any angle greater than or equal to 60 degrees and less than or equal to 90 degrees. The angle should not be set too large or too small, since either extreme affects the robot's cleaning traversal effect. Values such as 70° or 80° can be chosen, with 90° being most preferable.
As one embodiment, if an isolated obstacle is detected while the robot walks from the origin toward a positioning point, the following steps are performed: the robot walks edgewise around the isolated obstacle along the side facing the delineated region, and when it returns to the straight line from the origin toward the positioning point, it continues moving forward toward the positioning point along that line.
Specifically, as shown in Fig. 2, the robot moves straight forward from point P and detects an obstacle M at point V1. The robot recognizes the obstacle M with its camera and, in combination with its own image database, determines that M is not a wall or a wall-adjacent obstacle but an isolated object. The robot therefore turns right and walks edgewise, using the edge sensor on its left side, along the side of the obstacle M facing the delineated region PADCB. Note that the delineated region PADCB does not yet exist at this point; the robot estimates its side from the preset rotation direction. For example, if the preset rotation direction is clockwise, the delineated region must lie on the right side of direction PA; if counterclockwise, on the left side of direction PA. After bypassing the obstacle along the track V1-V2-V3-V4, the robot continues straight along direction PA until it reaches point A and detects the wall. The robot then returns from point A to point P; it can do so directly in navigation mode rather than retracing the edge. Next, the robot turns, moves straight toward point B, detects the wall at point B, and then walks along the wall B-C-D-A to point A. The robot has now delineated the delineated region PADCB and traverses it.
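One concrete piece of this detour behavior is deciding when the robot has returned to the straight line from the origin toward the positioning point, at which moment it stops wall-following and resumes the straight run. A minimal, self-contained sketch of such a check follows; the helper name, tolerance and coordinates are our own illustrative assumptions, not values from the patent.

```python
import math

def back_on_course(position, origin, target, tol=0.05):
    """True when `position` lies within `tol` of the straight line from `origin`
    toward `target` and is not behind the origin, i.e. the robot has fully
    bypassed the obstacle and may resume driving straight toward the
    positioning point."""
    (ox, oy), (tx, ty), (px, py) = origin, target, position
    dx, dy = tx - ox, ty - oy
    length = math.hypot(dx, dy)
    # perpendicular distance from the current position to the line origin -> target
    offset = abs(dx * (py - oy) - dy * (px - ox)) / length
    # signed progress along the ray (must be ahead of the origin, not behind it)
    progress = (dx * (px - ox) + dy * (py - oy)) / length
    return offset <= tol and progress >= 0.0

# While wall-following around the obstacle (track V1-V2-V3-V4 in Fig. 2), the robot
# evaluates this predicate each step and resumes the straight run toward point A
# as soon as it is true.
print(back_on_course((0.0, 2.5), (0.0, 0.0), (0.0, 5.0)))   # True: back on segment PA
print(back_on_course((0.4, 2.5), (0.0, 0.0), (0.0, 5.0)))   # False: still beside obstacle M
```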
As one embodiment, the robot traversing the Nth delineated region in step S7 specifically comprises the following steps (taking the first delineated region as an example). The robot starts from the first positioning point and walks along the first edgewise edge toward the second positioning point to a first parallel point; the distance between the straight line through the first parallel point parallel to the first direction and the straight line containing the first positioning edge is a preset distance. The preset distance can be set according to specific design requirements and is preferably the width of one robot body. The robot turns at the first parallel point to the direction parallel to the first direction and moves straight along that direction; the walking path is the first parallel edge. When the robot reaches the second positioning edge, a wall or a wall-adjacent obstacle, it takes the current position as the second parallel point and walks along the second positioning edge, the wall or the wall-adjacent obstacle toward the second positioning point to a third parallel point; the distance between the straight line through the third parallel point parallel to the first direction and the straight line containing the first parallel edge is the preset distance. The robot turns at the third parallel point to the direction parallel to the first direction and moves straight; the walking path is the second parallel edge. When the robot reaches the first edgewise edge, it takes the current position as the fourth parallel point and walks along the first edgewise edge toward the second positioning point to a fifth parallel point; the straight line through the fifth parallel point is parallel to the first direction, and its distance from the straight line containing the second parallel edge is the preset distance. The robot turns at the fifth parallel point to the direction parallel to the first direction and moves straight; the walking path is the third parallel edge. And so on: the robot completes the traversal of the first delineated region in this parallel-path-planning walking manner. Because the parallel paths are combined with edgewise spacing moves at both ends, this mode is better suited than the existing boustrophedon (zigzag) traversal to regions with irregular edge contours, and it ensures effective cleaning of edges and corners.
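To make the lane geometry concrete, the following sketch generates the parallel-path waypoints for the idealized case where the delineated region is the rectangle PBCA of Fig. 1 and the preset distance equals one robot-body width. The function name, coordinate frame and dimensions are our own assumptions; in a real region the robot wall-follows the irregular edgewise contour between lanes instead of using fixed endpoints.

```python
def sector_lanes(width, depth, body_w):
    """Waypoints of the parallel-path plan inside an idealized rectangular
    delineated region: x runs from the Nth positioning edge (x = 0) toward the
    far wall (x = width); y runs from the (N+1)th positioning edge (y = 0) to
    the wall at y = depth, which is part of the Nth edgewise edge.  Lanes are
    parallel to the Nth direction (the y axis) and spaced one body width apart."""
    points = [(0.0, depth)]              # start at the Nth positioning point (A in Fig. 1)
    x, going_down = body_w, True
    while x <= width:
        top, bottom = (x, depth), (x, 0.0)
        # alternate the lane direction; the short hops between lanes correspond to the
        # robot's edgewise moves along the edgewise edge or the positioning edge
        points += [top, bottom] if going_down else [bottom, top]
        x += body_w                      # preset distance between adjacent parallel edges
        going_down = not going_down
    return points

# Rectangle PBCA of Fig. 1 idealized as 5 m x 4 m with a 0.33 m robot-body width:
for waypoint in sector_lanes(width=5.0, depth=4.0, body_w=0.33)[:6]:
    print(waypoint)   # corresponds to points A, a, b, c, d, e in Fig. 1
```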
Similarly, taking the second delineated region as an example, the corresponding first direction becomes the second direction, the first positioning edge becomes the second positioning edge, and the second positioning edge becomes the third positioning edge.
As shown in Fig. 1, when the robot walks edgewise to point A, the first delineated region PBCA has been delineated, and the robot begins traversing it. The robot walks edgewise from point A back to the first parallel point a; it may turn around and walk back, or simply reverse. The robot turns at point a to face point b; direction ab is parallel to direction PA, and the straight-line distance between the first parallel edge ab and the first positioning edge PA is one robot-body width. After moving straight from point a to point b, the robot turns and moves to point c. The robot then turns at point c to face point d and moves to the fourth parallel point d; direction cd is parallel to direction PA, and the straight-line distance between the second parallel edge cd and the first parallel edge ab is one robot-body width. By analogy, the robot walks from point d to the fifth parallel point e and follows the route indicated by the arrows in the same parallel-path-planning manner, passing points f, g and so on until it reaches point h. The robot has then covered every position in the first delineated region, completing its traversal. Similarly, the robot passes points i, j, k, l, m and so on along the path indicated by the arrows and, after reaching point n, completes the traversal of the second delineated region PDEB. The robot then completes the traversal of the third delineated region PFGD and the fourth delineated region PAHF along the arrowed paths, finally completing the traversal of all regions.
Similarly, as shown in Fig. 2, the robot passes points a, b, c, d, e, f, g and so on in sequence along the path indicated by the arrows, completing the traversal of the current delineated region PADCB.
As one embodiment, when traversing the Nth delineated region, if the robot detects an isolated obstacle while walking along a parallel edge, the following steps are performed: the robot walks edgewise around the isolated obstacle along the side facing the Nth positioning edge, and when it returns to the straight line containing the parallel edge, it continues walking along the parallel edge. For example, suppose the robot is traversing the fourth delineated region in the parallel-path-planning manner and detects an obstacle while walking along a parallel edge; if the camera detects and identifies it as an isolated obstacle, the robot walks around the obstacle along the side facing the fourth positioning edge, and once it reaches the straight line of the parallel edge it has bypassed the obstacle and continues along the parallel edge. If it were traversing the sixth delineated region, it would walk around the isolated obstacle along the side facing the sixth positioning edge, and so on. By controlling the robot, when it detects an isolated obstacle while walking along a parallel edge, to go around the obstacle on the side of the positioning edge of the region being traversed, the planned walking of the robot is preserved, missed sweeping is avoided, and the cleaning quality of the robot is improved.
As shown in Fig. 3, the robot traverses the delineated region PADCB along the path indicated by the arrows. When the robot walks to point b and detects the isolated obstacle M, it chooses the side of M facing the positioning edge PA for its edgewise walk, follows the track b-c-d-e to point e, returns to the straight line of the parallel edge bf, and continues straight from point e to point f. The robot then continues along the arrowed path; when it reaches point g it again walks edgewise along the left side of the obstacle M, follows the track g-e-d-c-b-h to point h, returns to the straight line containing the parallel edge bg, continues straight from point h, and walks along the arrowed path until the traversal of the whole delineated region is complete.
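The rejoin test here is the same as in the earlier detour sketch, only measured against the interrupted parallel edge instead of the origin-to-positioning-point line. Reusing the hypothetical back_on_course() helper defined above, with assumed coordinates for points b, e and f of Fig. 3:

```python
# Points b, e, f of Fig. 3 with assumed coordinates; b -> f is the interrupted parallel edge.
b, e, f = (0.0, 0.33), (0.9, 0.33), (3.0, 0.33)
print(back_on_course(e, b, f))   # True: point e is back on the line of parallel edge bf,
                                 # so the robot resumes the straight run toward f
```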
A chip stores program instructions for controlling a robot to execute the traversal control method of the robot described in the above embodiments. With these program instructions, the chip enables the robot to delineate a region to be traversed by combining positioning edges with edgewise paths; after traversing that region, the robot delineates the next region in the same way and continues, and so on, until all regions have been traversed. In this traversal mode, the area to be cleaned is divided, with the robot's starting point as the center, into several small regions, which are traversed one at a time.
A cleaning robot, which may be a sweeping robot or a mopping robot, is fitted with a main control chip, the main control chip being the chip of the above embodiments. With this chip, the robot can be controlled to delineate a region to be traversed by combining positioning edges with edgewise paths; after traversing that region, it delineates the next region in the same way and continues, and so on, until all regions have been traversed. In this traversal mode, the area to be cleaned is divided, with the robot's starting point as the center, into several small regions, which are traversed one at a time.
The obstacle sensor described in each of the above embodiments refers to an infrared sensor or a mechanical collision sensor for detecting an obstacle.
In the above embodiments, directional words such as "up", "down", "left" and "right" refer to the corresponding directions in the drawings unless otherwise specified. Where a specific definition is given, that definition applies; for example, the left side of the robot refers to the left side of its forward direction, not the left side of the drawing.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware executing program instructions. Such programs may be stored in a computer-readable storage medium (such as a ROM, a RAM, a magnetic or optical disk, or other media that can store program code), and when executed they perform the steps of the method embodiments described above. Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solution of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described therein may still be modified, or some or all of their technical features may be equivalently replaced, without departing in essence from the scope of the technical solutions of the embodiments of the present invention.

Claims (7)

1. A robot traversal control method, characterized by comprising the following steps:
step S1: the robot takes its current position as the origin and moves straight forward along the Nth direction; upon detecting a wall or a wall-adjacent obstacle, it determines the current position as the Nth positioning point and proceeds to step S2, the path from the origin to the Nth positioning point being the Nth positioning edge;
step S2: the robot returns to the origin and then proceeds to step S3;
step S3: the robot turns in place at the origin and judges whether the sum of the angles rotated in the preset rotation direction, counted from the first positioning edge, equals 360 degrees; if so, it proceeds to step S8, otherwise to step S4;
step S4: the robot turns to the (N+1)th direction, the included angle between the (N+1)th direction and the Nth direction being the Nth included angle, which is smaller than 180 degrees, and then proceeds to step S5;
step S5: the robot walks along the (N+1)th direction; upon detecting a wall or a wall-adjacent obstacle, it determines the current position as the (N+1)th positioning point, the walking path from the origin to the (N+1)th positioning point being the (N+1)th positioning edge, and then proceeds to step S6;
step S6: the robot turns at the (N+1)th positioning point and walks edgewise toward the Nth positioning point; the edgewise path is the Nth edgewise edge, which lies within the range of the Nth included angle, and the robot then proceeds to step S7;
step S7: when the robot reaches the Nth positioning point, the Nth delineated region, enclosed by the Nth positioning edge, the Nth edgewise edge and the (N+1)th positioning edge, has been delineated; the robot traverses the Nth delineated region, increments N by 1 after the traversal is finished, and returns to step S2;
step S8: the robot stops rotating, walks along the current positioning edge to the corresponding positioning point, and then walks edgewise toward a certain position point; when the robot reaches that position point, the remaining region enclosed by the current edgewise path and the positioning edges connected to its two ends has been delineated, the robot traverses the remaining region, and the traversal of all regions is finally completed;
wherein N is a natural number.
2. The method of claim 1, wherein the value of the Nth included angle is any angle greater than or equal to 60° and less than or equal to 90°.
3. The method according to claim 1, characterized in that, if an isolated obstacle is detected while the robot walks from the origin toward the positioning point, the following steps are performed:
the robot walks edgewise around the isolated obstacle along the side facing the delineated region, and when it returns to the straight line from the origin toward the positioning point, it continues moving forward toward the positioning point along that line.
4. The method according to claim 1, wherein the robot traversing the Nth delineated region in step S7 specifically comprises:
the robot starts from the Nth positioning point and walks along the Nth edgewise edge toward the (N+1)th positioning point to a first parallel point, the distance between the straight line through the first parallel point parallel to the Nth direction and the straight line containing the Nth positioning edge being a preset distance;
the robot turns at the first parallel point to the direction parallel to the Nth direction and moves straight along that direction, the walking path being the first parallel edge;
when the robot reaches the (N+1)th positioning edge, a wall or a wall-adjacent obstacle, it takes the current position as a second parallel point and walks along the (N+1)th positioning edge, the wall or the wall-adjacent obstacle toward the (N+1)th positioning point to a third parallel point, the distance between the straight line through the third parallel point parallel to the Nth direction and the straight line containing the first parallel edge being the preset distance;
the robot turns at the third parallel point to the direction parallel to the Nth direction and moves straight along that direction, the walking path being the second parallel edge;
when the robot reaches the Nth edgewise edge, it takes the current position as a fourth parallel point and walks along the Nth edgewise edge toward the (N+1)th positioning point to a fifth parallel point, the straight line through the fifth parallel point being parallel to the Nth direction and its distance from the straight line containing the second parallel edge being the preset distance;
the robot turns at the fifth parallel point to the direction parallel to the Nth direction and moves straight along that direction, the walking path being the third parallel edge;
and so on, the robot completing the traversal of the Nth delineated region in this parallel-path-planning walking manner.
5. The method according to claim 4, characterized in that, if an isolated obstacle is detected while the robot walks along a parallel edge, the following steps are performed:
the robot walks edgewise around the isolated obstacle along the side facing the Nth positioning edge, and when it returns to the straight line containing the parallel edge, it continues walking along the parallel edge.
6. A chip storing program instructions, the program instructions being used to control a robot to perform the robot traversal control method of any one of claims 1 to 5.
7. A cleaning robot comprising a master control chip, wherein the master control chip is the chip of claim 6.
CN201910623713.6A 2019-07-11 2019-07-11 Traversal control method and chip of robot and cleaning robot Active CN110362079B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910623713.6A CN110362079B (en) 2019-07-11 2019-07-11 Traversal control method and chip of robot and cleaning robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910623713.6A CN110362079B (en) 2019-07-11 2019-07-11 Traversal control method and chip of robot and cleaning robot

Publications (2)

Publication Number Publication Date
CN110362079A CN110362079A (en) 2019-10-22
CN110362079B (en) 2022-07-08

Family

ID=68218785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910623713.6A Active CN110362079B (en) 2019-07-11 2019-07-11 Traversal control method and chip of robot and cleaning robot

Country Status (1)

Country Link
CN (1) CN110362079B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110815226B (en) * 2019-11-15 2022-03-01 四川长虹电器股份有限公司 Method for returning to initial position at any posture and any position of robot
CN111061263B (en) * 2019-11-27 2023-11-28 小狗电器互联网科技(北京)股份有限公司 Robot obstacle-cleaning and winding method and sweeping robot
CN111227715A (en) * 2020-01-16 2020-06-05 湖南格兰博智能科技有限责任公司 Arch-shaped cleaning method suitable for large area
CN111290388B (en) * 2020-02-25 2022-05-13 苏州科瓴精密机械科技有限公司 Path tracking method, system, robot and readable storage medium
CN113552865A (en) * 2020-04-17 2021-10-26 苏州科瓴精密机械科技有限公司 Traversal method, traversal system, robot and readable storage medium
CN111638713B (en) * 2020-05-26 2023-06-09 珠海一微半导体股份有限公司 Method for defining passable area, area calculation method, chip and robot
CN113812252B (en) * 2020-06-18 2023-03-17 未岚大陆(北京)科技有限公司 Method for controlling operation of apparatus, robot apparatus, and storage medium
CN111802978B (en) * 2020-07-15 2021-12-10 小狗电器互联网科技(北京)股份有限公司 Cleaning control method, storage medium and sweeping robot
CN111906786B (en) * 2020-08-01 2022-03-04 珠海一微半导体股份有限公司 Robot control method, chip and robot
CN112540611A (en) * 2020-09-23 2021-03-23 深圳市银星智能科技股份有限公司 Path planning method of robot, robot and master control chip
CN112650250A (en) * 2020-12-23 2021-04-13 深圳市杉川机器人有限公司 Map construction method and robot
CN115113616B (en) * 2021-03-08 2024-06-14 广东博智林机器人有限公司 Path planning method
CN113589806B (en) * 2021-07-21 2024-06-18 珠海一微半导体股份有限公司 Strategy control method for robot bow-shaped walking time
CN113974506B (en) 2021-09-23 2024-03-19 云鲸智能(深圳)有限公司 Cleaning control method, device, cleaning robot and storage medium
CN114397889B (en) * 2021-12-22 2024-03-26 深圳银星智能集团股份有限公司 Full-coverage path planning method based on unit decomposition and related equipment
CN114711668B (en) * 2022-03-31 2024-05-14 苏州三六零机器人科技有限公司 Cleaning method, cleaning device, sweeper and computer readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107544517A (en) * 2017-10-11 2018-01-05 珠海市微半导体有限公司 The control method of Intelligent cleaning robot
CN110338715A (en) * 2019-07-11 2019-10-18 珠海市一微半导体有限公司 The method and chip and clean robot on intelligent robot cleaning ground

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1098587A1 (en) * 1998-07-31 2001-05-16 Volker Sommer Household robot for the automatic suction of dust from the floor surfaces
KR101750340B1 (en) * 2010-11-03 2017-06-26 엘지전자 주식회사 Robot cleaner and controlling method of the same
CN105115490A (en) * 2015-07-16 2015-12-02 深圳前海达闼科技有限公司 Method for determining indoor active area, and apparatus thereof
CN107340768B (en) * 2016-12-29 2020-08-28 珠海市一微半导体有限公司 Path planning method of intelligent robot
CN111328386A (en) * 2017-09-12 2020-06-23 罗博艾特有限责任公司 Exploration of unknown environments by autonomous mobile robots
CN109199245A (en) * 2018-09-30 2019-01-15 江苏美的清洁电器股份有限公司 Sweeper and its control method and control device
CN109528090A (en) * 2018-11-24 2019-03-29 珠海市微半导体有限公司 The area coverage method and chip and clean robot of a kind of robot
CN109464074B (en) * 2018-11-29 2021-05-28 深圳市银星智能科技股份有限公司 Area division method, subarea cleaning method and robot thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107544517A (en) * 2017-10-11 2018-01-05 珠海市微半导体有限公司 The control method of Intelligent cleaning robot
CN110338715A (en) * 2019-07-11 2019-10-18 珠海市一微半导体有限公司 The method and chip and clean robot on intelligent robot cleaning ground

Also Published As

Publication number Publication date
CN110362079A (en) 2019-10-22

Similar Documents

Publication Publication Date Title
CN110362079B (en) Traversal control method and chip of robot and cleaning robot
US11175670B2 (en) Robot-assisted processing of a surface using a robot
CN110338715B (en) Method and chip for cleaning floor by intelligent robot and cleaning robot
AU2019382443B2 (en) Method for Controlling Cleaning of Robot, Chip, and Robot Cleaner
US11052540B2 (en) Methods and systems for complete coverage of a surface by an autonomous robot
EP3764186A1 (en) Method for controlling autonomous mobile robot to travel along edge
US9149167B2 (en) Robot cleaner and control method thereof
EP2592518B1 (en) Robot cleaner and control method thereof
CN105792721B (en) Robotic vacuum cleaner with side brush moving in spiral pattern
CN112137529B (en) Cleaning control method based on dense obstacles
TW201434546A (en) Cleaning method for edgewise-navigating and centralizedly-stretching cleaning robot
US9599987B2 (en) Autonomous mobile robot and method for operating the same
CN111248819A (en) Cleaning path execution method and cleaning robot
WO2022041236A1 (en) Traveling control method and path planning method for mobile robot, and mobile robot
CN112180924B (en) Mobile control method for navigating to dense obstacle
CN112826373A (en) Cleaning method, device, equipment and storage medium of cleaning robot
CN110597253A (en) Robot control method, chip and laser type cleaning robot
Goel et al. Systematic floor coverage of unknown environments using rectangular regions and localization certainty
CN115444326B (en) Floor medium searching method, cleaning robot and storage medium
CN117017117A (en) Control method of cleaning device, cleaning device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.

GR01 Patent grant