CN113885485A - Robot walking control method, system, robot and storage medium - Google Patents


Info

Publication number
CN113885485A
CN113885485A (application number CN202010554973.5A; granted as CN113885485B)
Authority
CN
China
Prior art keywords
robot
boundary
outer boundary
environment image
planned path
Prior art date
Legal status (assumption, not a legal conclusion)
Granted
Application number
CN202010554973.5A
Other languages
Chinese (zh)
Other versions
CN113885485B
Inventor
朱绍明
任雪
Current Assignee (the listed assignees may be inaccurate)
Suzhou Cleva Electric Appliance Co Ltd
Suzhou Cleva Precision Machinery and Technology Co Ltd
Original Assignee
Suzhou Cleva Precision Machinery and Technology Co Ltd
Priority date (assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Suzhou Cleva Precision Machinery and Technology Co Ltd
Priority to CN202010554973.5A
Priority to PCT/CN2020/123186 (published as WO2021253698A1)
Publication of CN113885485A
Application granted
Publication of CN113885485B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a robot walking control method, a system, a robot, and a storage medium. The method includes: presetting a planned path for the robot; if the planned path includes at least part of an outer boundary, where the outer boundary is the separation boundary between a working area and a non-working area, starting a camera device to capture an environment image at least when the robot reaches the outer boundary, and driving the robot, at least once along the outer boundary, to walk along the actual boundary obtained by analyzing the environment image while working synchronously. By selecting the robot's walking and working routes in combination with the images it captures, the method improves the robot's working efficiency.

Description

Robot walking control method, system, robot and storage medium
Technical Field
The invention relates to the field of intelligent control, in particular to a robot walking control method, a robot walking control system, a robot and a storage medium.
Background
Low repetition rate and high coverage rate are the goals of traversing robots such as mobile robots for vacuuming, mowing, and swimming-pool cleaning. Taking an intelligent mowing robot as an example, the robot performs the mowing operation on a lawn enclosed by a boundary, which serves as the working area; the area outside the lawn is defined as the non-working area.
In the prior art, the robot is generally driven by point-positioning methods such as UWB. Because UWB positioning has limited precision, and for safety reasons, the grass close to the boundary of the working area cannot be cut completely: during operation a certain error may exist between the planned boundary and the actual boundary, so the lawn at the edge of the working area is not mowed effectively.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a method and a system for controlling robot walking, a robot, and a storage medium.
In order to achieve one of the above objects, an embodiment of the present invention provides a robot walking control method, including: presetting a planned path for the robot;
if the planned path includes at least part of an outer boundary, where the outer boundary is the separation boundary between a working area and a non-working area, starting a camera device to capture an environment image at least when the robot reaches the outer boundary, and driving the robot, at least once along the outer boundary, to walk along the actual boundary obtained by analyzing the environment image while working synchronously.
With this method, the robot's walking and working routes can be selected in combination with the images it captures, improving its working efficiency.
As a further improvement of an embodiment of the present invention, the step of starting the camera device to capture the environment image at least when the robot reaches the outer boundary, and driving the robot to walk along the actual boundary obtained by analyzing the environment image while working synchronously, at least once along the outer boundary, includes:
judging whether the outer boundary included in the planned path is continuous, and if so, executing the following steps:
driving the robot to walk to any point on the outer boundary in the planned path;
taking that point as the starting position and starting the camera device to capture the environment image;
driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously until it returns to the starting position.
With this method, the outer boundary is worked by image recognition, improving the robot's working efficiency.
As a further improvement of an embodiment of the present invention, the step described above further includes:
judging whether the planned path contains other paths besides the outer boundary, and if so, executing the following step:
after the robot returns to the starting position, driving it to walk along the other paths and work synchronously.
With this method, the outer boundary is worked first by image recognition, and the area within it is then worked by conventional fixed-point positioning, so the robot's walking and working routes can be selected in combination with the images it captures, improving its working efficiency.
As a further improvement of an embodiment of the present invention, the step described above includes:
if the outer boundary included in the planned path is continuous and the planned path further includes other paths besides the outer boundary, executing the following steps:
driving the robot to walk along the other paths of the planned path and work synchronously;
after the robot has walked the other paths and completed that work, starting the camera device to capture the environment image, driving the robot to the actual boundary obtained by analyzing the image, and driving it to walk along that actual boundary and work synchronously.
With this method, the area within the outer boundary is worked first by conventional fixed-point positioning, and the outer boundary is then worked by image recognition, so the robot's walking and working routes can be selected in combination with the images it captures, improving its working efficiency.
As a further improvement of an embodiment of the present invention, the step described above includes:
driving the robot to the planned path and taking any point on the planned path as the starting position;
from the starting position, driving the robot along the planned path while working synchronously;
while the robot walks along the planned path and works, keeping the camera device running to capture environment images and analyzing them in real time to judge whether an actual boundary is obtained; if so, driving the robot, from the moment the boundary is obtained, to walk along the actual boundary in the planned direction and work synchronously; if not, walking and working along the portions of the planned path that do not include the outer boundary.
With this method, the path is found by combining image recognition with conventional fixed-point positioning, the two modes alternating according to specific judgment conditions to work the outer boundary and the area within it, so the robot's walking and working routes can be selected in combination with the images it captures, improving its working efficiency.
As a further improvement of an embodiment of the present invention, the step described above includes:
driving the robot to walk along the planned path and work synchronously, starting the camera device at the same time;
on reaching each pre-planned coordinate point on the planned path, driving the robot to rotate in place and capturing an environment image with the camera device;
if analyzing the environment image does not yield an actual boundary, driving the robot to continue along the planned path and work synchronously;
if analyzing the environment image yields an actual boundary, driving the robot to the actual boundary and then driving it to walk along the actual boundary and work synchronously.
As a further improvement of an embodiment of the present invention, while the robot walks along the actual boundary and works synchronously, the method further includes:
synchronously monitoring whether the distance between the current position and the nearest point of the planned path exceeds a preset threshold; if so, driving the robot back to the planned path; if not, keeping the robot on the actual boundary.
With this method, the path is found by combining image recognition with conventional fixed-point positioning, the two modes alternating according to specific judgment conditions to work the outer boundary and the area within it, so the robot's walking and working routes can be selected in combination with the images it captures, improving its working efficiency.
In order to achieve the above object, another aspect of the present invention provides a robot walking control system, including: an acquisition module for acquiring a planned path preset for the robot;
and a processing module for, when the planned path includes at least part of an outer boundary (the separation boundary between a working area and a non-working area), starting a camera device to capture the environment image at least when the robot reaches the outer boundary, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously, at least once along the outer boundary.
In order to achieve one of the above objects, an embodiment of the present invention provides a robot including a memory and a processor; the memory stores a computer program, and the processor implements the steps of the robot walking control method when executing the program.
In order to achieve one of the above objects, an embodiment of the present invention provides a readable storage medium on which a computer program is stored; when executed by a processor, the program implements the steps of the robot walking control method described above.
Compared with the prior art, the robot walking control method and system, robot, and storage medium can select the robot's walking and working routes in combination with the images it captures, improving the robot's working efficiency.
Drawings
Fig. 1 is a schematic flow chart of a robot walking control method according to an embodiment of the present invention;
fig. 2, fig. 4, fig. 5, and fig. 7 are schematic flow charts of specific implementations of one step in fig. 1;
FIGS. 3, 6, and 8 are schematic diagrams of different examples of the present invention, respectively;
fig. 9 is a schematic block diagram of a robot walking control system provided by the present invention.
Detailed Description
The present invention will be described in detail below with reference to embodiments shown in the drawings. These embodiments are not intended to limit the present invention, and structural, methodological, or functional changes made by those skilled in the art according to these embodiments are included in the scope of the present invention.
The robot system of the invention may be a mowing robot system, a sweeping robot system, a snow plough system, a leaf-vacuum system, a golf-course ball-collecting system, or the like; each can walk automatically within a working area and perform the corresponding work. In the specific examples of the invention the robot system is a mowing robot system, and the working area is accordingly a lawn.
A robot lawnmower system typically includes a mowing robot (RM), a charging station, and a boundary line. The mowing robot includes a body with a walking unit and a control unit mounted on it. The walking unit controls the robot's walking, turning, and so on; the control unit plans the robot's walking direction and route, stores external parameters obtained by the robot, processes and analyzes those parameters, and controls the robot according to the results. The control unit is, for example, an MCU or a DSP.
The mowing robot of the present invention further includes a camera device and a fixed-point positioning system. In a specific example of the invention, the camera device locates the outer boundary by image analysis, while the fixed-point positioning system locates the area enclosed by the outer boundary by finding coordinate points on the working path; the control unit combines the camera device and the fixed-point positioning device to control the robot to traverse the working area, as described in detail below.
In addition, the robot further includes various sensors, a memory module (e.g. EPROM, Flash, or an SD card), a working mechanism, and a power supply. In this embodiment the working mechanism is a mower deck, and the sensors sense the robot's walking state (tilt, ground clearance, collision, geomagnetism, gyroscope, etc.); these are not described in detail here.
As shown in fig. 1, a robot walking control method according to a first embodiment of the present invention includes:
s1, presetting a planning path for the robot;
s2, if the planned path comprises at least a part of an outer boundary, the outer boundary is a separation boundary of a working area and a non-working area; and starting the camera device to shoot the environment image at least when the robot reaches the outer boundary, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at least once at the outer boundary.
Further, the step S2 further includes: and if the planned path does not contain the outer boundary, driving the robot to walk according to the planned path and synchronously work.
In the specific embodiments of the present invention, after the working area is determined, there are various ways to obtain a planned path for the robot and to determine whether the planned path includes an outer boundary. It should also be noted that in the conventional path-planning approach, before the robot works it is usually driven along the working area, and the working path is planned by fixed-point positioning from that traveled route; the planned working path therefore usually does not include the actual outer boundary of the working area, that is, the outer boundary in the planned path is the actual outer boundary offset inward.
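The inward offset described above can be illustrated with a hedged sketch. The patent does not give the offset computation; the function below is an assumption that models the simplest case, an axis-aligned rectangular working area, where the planned boundary is the actual boundary shrunk by a safety margin. The uncut strip of width `margin` around the edge is exactly the lawn a fixed-point-only robot leaves behind.

```python
def inset_rectangle(xmin, ymin, xmax, ymax, margin):
    """Return the corners of the planned (inward-offset) boundary for an
    axis-aligned rectangular working area.  General polygon boundaries
    would need a proper polygon-offsetting algorithm; this rectangular
    case is only an illustration of the inward offset."""
    if 2 * margin >= min(xmax - xmin, ymax - ymin):
        raise ValueError("margin too large for this working area")
    return (xmin + margin, ymin + margin, xmax - margin, ymax - margin)
```

For a 10 m by 6 m lawn with a 0.5 m safety margin, the planned boundary encloses only a 9 m by 5 m inner rectangle, which is why the invention adds image recognition to reach the actual edge.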
It should be noted that the outer boundary mentioned above specifically refers to the separation boundary between the working area and the non-working area, so it can be accurately identified by image recognition.
Specifically, the camera device captures the scene in front of the robot to form the environment image; the environment here is the ground in the robot's direction of travel. After the main controller receives the environment image, it analyzes the image to judge whether an outer boundary exists within a preset distance of the robot's current position. Identifying an outer boundary from an image is a mature technology and is not described in detail here.
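As a concrete illustration of the image-analysis step, the sketch below scans a single environment image with a naive green-channel threshold to find the image row where grass gives way to non-grass ground. This is purely illustrative: the patent treats the recognition technique as mature prior art and does not specify it, so the image representation, the threshold, and the function name are all assumptions; a production mower would use a calibrated camera and a trained segmentation model.

```python
def detect_boundary_row(image, green_threshold=100):
    """Scan an RGB environment image, given as nested lists of (r, g, b)
    tuples with the top row farthest ahead of the robot, from the bottom
    (nearest) row upward.  Return the index of the first row whose
    majority of pixels is non-grass (green channel below the threshold),
    i.e. where the work/non-work boundary appears, or None if every row
    looks like grass."""
    for offset, row in enumerate(reversed(image)):
        non_grass = sum(1 for (r, g, b) in row if g < green_threshold)
        if non_grass > len(row) // 2:
            return len(image) - 1 - offset  # index in the original image
    return None
```

With the camera's calibration, the returned row index would then be converted into a ground distance to decide whether the boundary lies within the preset range.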
In a first preferred embodiment of the present invention, as shown in fig. 2, step S2 specifically includes: judging whether the outer boundary included in the planned path is continuous, and if so, executing the following steps:
driving the robot to walk to any point on the outer boundary in the planned path; taking that point as the starting position and starting the camera device to capture the environment image; and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously until it returns to the starting position.
Here, "continuous" means that all the boundary segments are connected end to end in sequence.
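The continuity condition can be made precise with a small helper. The representation below is an assumption, not from the patent: the outer boundary is modeled as an ordered list of line segments, each an ordered pair of (x, y) endpoints, and the check verifies that every segment's end coincides with the next segment's start and that the last segment closes back onto the first one's start.

```python
def is_continuous(segments, tol=1e-6):
    """Return True when the boundary segments are connected end to end
    in sequence and form a closed loop (closure is assumed here, since
    the embodiment walks the boundary back to its starting point)."""
    if not segments:
        return False

    def close(p, q):
        return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol

    for (_, end), (start, _) in zip(segments, segments[1:]):
        if not close(end, start):
            return False
    return close(segments[-1][1], segments[0][0])
```

A square boundary passes the check; dropping one side makes it discontinuous, which is the situation handled by the third preferred embodiment below.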
Further, step S2 also includes: judging whether the planned path contains other paths besides the outer boundary, and if so, executing the following step:
after the robot returns to the starting position, driving it to walk along the other paths and work synchronously.
Preferably, after "walking along the actual boundary and working synchronously until returning to the starting position" and before "after the robot returns to the starting position, driving it to walk along the other paths and work synchronously", the method further includes: immediately turning off the camera device, which saves resources and prevents the robot from working the actual outer boundary repeatedly.
For ease of understanding, a specific example of the above first preferred embodiment is described below:
referring to fig. 3, the area of fig. 3 is a working area as a whole, and the planned path constructed in a conventional manner is a path L1 formed by a solid line in a zigzag shape; and there is actually a path L2 formed by the solid rectangular box in the working area; wherein, the path L2 is a continuous outer boundary, and the arrow points to the walking direction of the robot; when mowing is performed by the conventional positioning method, the robot cannot work on the rectangular frame path L2 formed by the broken line due to factors such as UWB which do not have high precision in positioning at a fixed point.
The first preferred embodiment of the present invention can perform work mowing on the complete working area after combining with the image recognition; specifically, after the environment image is captured by the imaging device, an actual boundary, that is, a rectangular frame path L2 formed by a dotted line in the figure, can be further obtained by analyzing the environment image, and accordingly, the robot is driven by means of image recognition to move from the starting position a to the starting position B of the actual boundary, and the robot operates along the path L2 in conjunction with the image recognition from the starting position B, when the robot operates for one circle along the path L2 and returns to the starting position B, it is confirmed that the operation of the outer boundary is completed, and at this time, the imaging device is turned off, and the robot returns to the starting position a from the starting position B; further, according to the conventional fixed-point positioning mode, under the condition that the camera device is turned off, the robot can only walk along the path L1 and synchronously work; when the robot reaches position C, the work ends.
Thus, in the first preferred embodiment, in the robot working process, the outer boundary is firstly operated in the image recognition mode, and then the area within the outer boundary is operated in the traditional fixed point positioning mode, so that the walking and working route of the robot can be selected by combining the images shot by the robot, and the working efficiency of the robot is improved.
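The ordering in this first embodiment (image-guided boundary loop first, camera off, then the fixed-point inner path) can be sketched as a pure action-sequencing function. The hardware interface is deliberately abstracted away, and the action tags and point representation are illustrative assumptions, not the patent's terminology.

```python
def first_embodiment_plan(boundary_points, inner_path):
    """Return the ordered action log for the first preferred embodiment:
    turn the camera on, mow one full image-guided loop of the outer
    boundary (closing back on its start point), turn the camera off,
    then mow the inner planned path by fixed-point positioning.  Points
    are (x, y) tuples; the function only sequences actions and does not
    drive hardware."""
    log = [("camera", "on")]
    loop = list(boundary_points) + [boundary_points[0]]  # close the loop
    log += [("mow_boundary", p) for p in loop]
    log.append(("camera", "off"))  # saves resources, avoids re-working the edge
    log += [("mow_inner", p) for p in inner_path]
    return log
```

The second preferred embodiment below is the same sequence with the two mowing phases swapped.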
In a second preferred embodiment of the present invention, as shown in fig. 4, step S2 specifically includes: if the outer boundary included in the planned path is continuous and the planned path further includes other paths besides the outer boundary, executing the following steps:
driving the robot to walk along the other paths of the planned path and work synchronously;
after the robot has walked the other paths and completed that work, starting the camera device to capture the environment image, driving the robot to the actual boundary obtained by analyzing the image, and walking along that actual boundary while working synchronously.
Further, when the robot has walked one full circle along the actual boundary while working synchronously, the work ends.
This second preferred embodiment is similar to the first, except that the first works the outer boundary before the area within it, whereas the second works the area within the outer boundary first and the outer boundary afterwards.
For ease of understanding, the detailed description continues with the example shown in fig. 3.
In the second preferred embodiment, once image recognition is added, the complete working area can be mowed. Specifically, starting from position A the robot walks along path L1 and works synchronously without starting the camera device; when it reaches position C, the work on path L1 is complete. The robot is then driven to work on L2. In one embodiment of the invention, the robot may turn on the camera device at its current position C, walk to the nearest position on the actual boundary, and from there walk along path L2 and work for one full circle. Alternatively, the robot may first be driven to any point on the outer boundary according to a preset rule, then start the camera device, move to the nearest position on the actual boundary, and work one full circle along path L2 from there. In the specific example of the invention, after completing the work on path L1 the robot is driven back to position A and the camera device is started; once the camera captures the environment image, the actual boundary (the dashed rectangular path L2 in the figure) is obtained by analysis, the robot moves from A to the starting position B on the actual boundary by image recognition, and works along L2 from B. When the robot has gone one circle along L2 and returned to B, the work ends.
Thus, in the second preferred embodiment, the area within the outer boundary is worked first by conventional fixed-point positioning and the outer boundary is then worked by image recognition, so the robot's walking and working routes can be selected in combination with the images it captures, improving its working efficiency.
In a third preferred embodiment of the present invention, as shown in fig. 5, the step S2 specifically includes:
driving the robot to walk to the planned path, and taking any point of the planned path as a starting point position;
starting from the starting point position, driving the robot to walk according to the planned path and synchronously work;
while the robot walks along the planned path and works synchronously, the camera device is kept running to capture environment images, which are analyzed in real time to judge whether an actual boundary is obtained; if so, the robot is driven, from the moment the boundary is obtained, to walk along the actual boundary in the planned direction and work synchronously; if not, the robot walks and works along the portions of the planned path that do not include the outer boundary.
For ease of understanding, a specific example of the above third preferred embodiment is described below:
as shown in connection with fig. 6, the area described in fig. 6 is overall a working area in which there is actually also a path L2 formed by a broken line by conventionally constructed planned paths being a path L1 formed by a solid line in a zigzag shape and an offset path which connects L1 and is not shown; in addition, the thick solid line is a connection path when the robot moves between the path L1 and the path L2, and the path L2 is a discontinuous outer boundary; if mowing operation is performed according to a conventional positioning method, due to factors such as UWB and the like, which are low in fixed-point positioning accuracy, operation on the path L2 is missed when the robot travels along the planned path.
In the third preferred embodiment of the present invention, after the image recognition is combined, the whole working area can be mowed; specifically, starting from the starting position a, the camera is started, after the environmental image is captured by the camera, the actual boundary, that is, the section B-B1 of the path L2 and the section B1-B2 connecting the section B-B1, which are formed by the dotted line in the figure, can be further obtained by analyzing the environmental image, when the robot reaches the position B2, the robot goes to the section B3 along the solid line, and continues to operate along the section B3-C of the path L1, because the section B3-C is not an outer boundary, at this time, the camera on the robot is in a started state, but does not find a corresponding outer boundary, and thus, the robot continues to operate to the position C according to the extending direction of the section B3-C; when the robot reaches the position C, the environmental image shot by the camera device acquires the section C1-C2 of the path L2, and then the robot continues to work along the section C2-D after reaching the section C2 along the solid line from the position C1, and the work is continued until reaching the end point E.
Thus, in the third preferred embodiment, during operation the robot searches for its path by combining the image recognition mode with the traditional fixed-point positioning mode; through specific judgment conditions, the two modes alternate to work on the outer boundary and on the area within the outer boundary. The walking and working route of the robot can therefore be selected by combining the images captured by the robot, which improves the working efficiency of the robot.
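The alternation between the two modes described above can be sketched as a minimal state machine. This is an illustrative sketch only, not the patented implementation; the `Mode` names and the per-step detection flags are assumptions introduced for illustration.

```python
from enum import Enum

class Mode(Enum):
    PLANNED = "follow planned path (fixed-point positioning, e.g. UWB)"
    BOUNDARY = "follow actual boundary (image recognition)"

def next_mode(boundary_visible: bool) -> Mode:
    """Judgment condition of the alternation: whenever analyzing the current
    environment image yields an actual outer boundary, walk along it;
    otherwise fall back to the fixed-point planned path."""
    return Mode.BOUNDARY if boundary_visible else Mode.PLANNED

def simulate(detections):
    """Apply the judgment condition over a sequence of per-step image-analysis
    results and return the resulting mode trace."""
    return [next_mode(seen) for seen in detections]
```

For the fig. 6 walkthrough, this would put the robot in boundary mode on the B-B2 and C1-C2 stretches of L2 and in planned mode on section B3-C, mirroring the alternation described above.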
Referring to fig. 7, in a fourth preferred embodiment of the present invention, the step S2 specifically includes: driving the robot to walk according to the planned path and work synchronously, and starting the camera device synchronously;
when each pre-planned coordinate point on the planned path is reached, driving the robot to rotate in place, and shooting an environment image through a camera device;
if the environment image is analyzed and an actual boundary is not obtained, the robot is driven to continue to walk along the planned path and work synchronously;
and if the environment image is analyzed to obtain an actual boundary, driving the robot to walk to the actual boundary, and driving the robot to walk along the actual boundary and work synchronously.
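The four steps above can be sketched as a per-waypoint control loop. The following simulation is a hedged sketch under stated assumptions: `SimRobot`, `analyze_image`, and the prepared `boundary_map` are hypothetical stand-ins for the camera device, the image-analysis step, and the motion primitives described in the text.

```python
from dataclasses import dataclass, field

@dataclass
class SimRobot:
    """Minimal stand-in for the robot's drive and camera primitives."""
    log: list = field(default_factory=list)
    camera_on: bool = False

    def start_camera(self):
        self.camera_on = True

    def walk_and_work_to(self, point):
        self.log.append(("planned", point))      # walk planned path, work synchronously

    def follow_boundary_and_work(self, boundary):
        self.log.append(("boundary", boundary))  # walk along actual boundary, work

def analyze_image(waypoint, boundary_map):
    """Hypothetical image analysis: returns the actual boundary visible after
    rotating in place at this waypoint, or None (here looked up from a map)."""
    return boundary_map.get(waypoint)

def walk_planned_path(robot, waypoints, boundary_map):
    """Fourth-embodiment loop: at each pre-planned coordinate point the robot
    rotates in place, shoots an environment image, and either continues along
    the planned path or diverts to a recognized actual boundary."""
    robot.start_camera()                         # camera started synchronously
    for wp in waypoints:
        robot.walk_and_work_to(wp)
        boundary = analyze_image(wp, boundary_map)
        if boundary is not None:
            robot.follow_boundary_and_work(boundary)
    return robot.log
```

Running it with waypoints A1, E1, B1 and boundaries visible only at A1 and B1 reproduces the divert-then-return pattern of the fig. 8 example.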
Further, if an actual boundary is obtained by analyzing the environment image, then after the robot is driven to walk to the actual boundary, while the robot is driven to walk along the actual boundary and work synchronously, the method further comprises the following step:
and synchronously monitoring whether the distance between the current walking position and the nearest point of the planned path is greater than a preset threshold value or not, if so, driving the robot to return to the planned path, and if not, keeping the robot on the actual boundary.
The preset threshold is a set distance constant, and the size of the preset threshold can be specifically set according to needs.
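The monitored quantity, the distance from the current walking position to the nearest point of the planned path, is a standard point-to-polyline distance. The sketch below is illustrative only; the function names and the representation of the planned path as a list of 2-D waypoints are assumptions.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the segment a-b (2-D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:  # degenerate segment: a == b
        return math.hypot(px - ax, py - ay)
    # Projection parameter of p onto the line a-b, clamped to the segment
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def distance_to_planned_path(position, path_points):
    """Distance from the current position to the nearest point of the planned
    path, taken as the minimum over the path's consecutive segments."""
    return min(point_segment_distance(position, a, b)
               for a, b in zip(path_points, path_points[1:]))

def should_return_to_path(position, path_points, threshold):
    """True when the robot has strayed farther than the preset threshold and
    should be driven back to the planned path."""
    return distance_to_planned_path(position, path_points) > threshold
```

In the fig. 8 example, this check is what detects that the robot on segment A-F has exceeded the preset threshold and should return to anchor point E1 of path L1.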
For ease of understanding, a specific example of the above fourth preferred embodiment is described with reference to the accompanying drawings:
referring to fig. 8, the area shown in fig. 8 is, as a whole, a working area. Because the fixed-point positioning accuracy of UWB and similar technologies is not high, the planned path is a rectangular path L1 formed by connecting A1-B1-C1-D1 in sequence with solid lines, and the path L1 includes a discontinuous outer boundary. If the machine worked only on path L1, the actual path segments E-A and A-B-C-D-E connected by the dashed lines would be missed; whereas, mowing with the method of the fourth embodiment of the present invention, the outer boundary (i.e., the actual path) can be completely traversed.
In the fourth preferred embodiment of the present invention, once image recognition is incorporated, the planned working area can be worked and mowed. Specifically, starting from the starting position A1, the camera device is started; after the camera device captures an environment image, the actual boundary point E can be obtained by analyzing the environment image. Further, with the cooperation of the camera device, the robot walks to point A along the path E-A; because the camera device remains started, the robot rotates at point A and continues to walk along the direction A-F. During this process, the following is determined by judgment: when the robot is on the A-F segment, its shortest distance to the planned path L1 exceeds the preset threshold, so the robot returns and walks to the anchor position E1 of path L1. Further, at position E1, after the robot rotates and the environment image is analyzed, no actual path is found, so the robot continues to walk along the segment E1-B1. When the robot reaches position B1, the boundaries B-C, C-D and D-E are obtained successively through environment image analysis; the robot then returns to position E along the path, and the work ends.
Thus, in the fourth preferred embodiment, during operation the robot searches for its path by combining the image recognition mode with the traditional fixed-point positioning mode; through specific judgment conditions, the two modes alternate to work on the outer boundary and on the planned area within the outer boundary. The walking and working route of the robot can therefore be selected by combining the images captured by the robot, which improves the working efficiency of the robot.
In an embodiment of the present invention, there is further provided a robot including a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the robot walking control method according to any one of the above embodiments when executing the computer program.
In an embodiment of the present invention, a readable storage medium is further provided, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the robot walking control method according to any one of the above embodiments.
Referring to fig. 9, there is provided a robot walking control system, including: an acquisition module 100 and a processing module 200.
The acquiring module 100 is used for acquiring a planned path preset for the robot;
the processing module 200 is configured to, when at least a part of an outer boundary is included in the planned path, determine the outer boundary as a separation boundary between a working area and a non-working area; and at least when the robot reaches the outer boundary, starting a camera device to shoot the environment image, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at the outer boundary at least once.
Further, the obtaining module 100 is configured to implement step S1; the processing module 200 is configured to implement step S2; it can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the system described above may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
In summary, the robot walking control method, the robot walking control system, the robot and the storage medium of the invention can select the walking and working route of the robot by combining the images shot by the robot, thereby improving the working efficiency of the robot.
In the several embodiments provided in the present application, it should be understood that the disclosed modules, systems and methods may be implemented in other manners. The system embodiments described above are merely illustrative; for example, the division into modules is merely a logical functional division, and in actual implementation there may be other division manners: multiple modules or components may be combined or integrated into another system, or some features may be omitted or not implemented.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, that is, may be located in one place, or may also be distributed on a plurality of network modules, and some or all of the modules may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional module in the embodiments of the present application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware, or in the form of hardware plus a software functional module.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may be modified or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.

Claims (10)

1. A robot walking control method is characterized by comprising the following steps:
presetting a planning path for the robot;
if the planned path comprises at least part of an outer boundary, the outer boundary is a separation boundary of a working area and a non-working area; and starting the camera device to shoot the environment image at least when the robot reaches the outer boundary, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at least once at the outer boundary.
2. The robot walking control method according to claim 1, wherein if the planned path includes at least a part of an outer boundary, the outer boundary is a separation boundary between a working area and a non-working area; then at least when the robot reaches the outer boundary, starting a camera device to shoot the environment image, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at the outer boundary at least once, comprising:
judging whether the outer boundary included in the planned path is continuous, if so, executing the following steps:
driving the robot to walk to any point on the outer boundary in the planned path;
starting a camera device to shoot an environment image by taking any point on the outer boundary as a starting point position;
and driving the robot to walk along an actual boundary obtained by analyzing the environment image and synchronously work until the robot walks along the actual boundary and synchronously works and returns to the starting position.
3. The robot walking control method according to claim 2, wherein if the planned path includes at least a part of an outer boundary, the outer boundary is a separation boundary between a working area and a non-working area; then at least when the robot reaches the outer boundary, starting a camera device to shoot the environment image, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at the outer boundary at least once, the method also comprises the following steps:
judging whether the planned path contains other paths except the outer boundary, if so, executing the following steps:
and after the robot returns to the starting position, the robot is driven to walk according to other paths and work synchronously.
4. The robot walking control method according to claim 1, wherein if the planned path includes at least a part of an outer boundary, the outer boundary is a separation boundary between a working area and a non-working area; then at least when the robot reaches the outer boundary, starting a camera device to shoot the environment image, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at the outer boundary at least once, comprising:
if the outer boundary included in the planned path is continuous and the planned path further includes other paths except the outer boundary, executing the following steps:
driving the robot to walk and synchronously work according to the other planned paths;
and after the driving robot walks along the other paths and finishes working, starting the camera device to shoot the environment image, driving the robot to walk to an actual boundary obtained by analyzing the environment image, and walking along the actual boundary obtained by analyzing the environment image and working synchronously.
5. The robot walking control method according to claim 1, wherein if the planned path includes at least a part of an outer boundary, the outer boundary is a separation boundary between a working area and a non-working area; then at least when the robot reaches the outer boundary, starting a camera device to shoot the environment image, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at the outer boundary at least once, comprising:
driving the robot to walk to the planned path, and taking any point of the planned path as a starting point position;
starting from the starting point position, driving the robot to walk according to the planned path and synchronously work;
in the process that the robot walks along the planned path and works synchronously, a camera device is started in real time to shoot an environment image, the environment image is analyzed in real time to judge whether an actual boundary is obtained, if yes, the robot is driven to walk along the actual boundary in the planning direction of the planned path and work synchronously when the actual boundary is obtained; if not, walking according to the planned path and working synchronously on other paths of which the planned path does not contain the outer boundary.
6. The robot walking control method according to claim 1, wherein if the planned path includes at least a part of an outer boundary, the outer boundary is a separation boundary between a working area and a non-working area; then at least when the robot reaches the outer boundary, starting a camera device to shoot the environment image, and driving the robot to walk along the actual boundary obtained by analyzing the environment image and work synchronously at the outer boundary at least once, comprising:
driving the robot to walk according to the planned path and work synchronously, and starting the camera device synchronously;
when each pre-planned coordinate point on the planned path is reached, driving the robot to rotate in place, and shooting an environment image through a camera device;
if the environment image is analyzed and an actual boundary is not obtained, the robot is driven to continue to walk along the planned path and work synchronously;
and if the environment image is analyzed to obtain an actual boundary, driving the robot to walk to the actual boundary, and driving the robot to walk along the actual boundary and work synchronously.
7. The robot walking control method according to claim 6, wherein if the environment image is analyzed to obtain the actual boundary, the robot is driven to walk to the actual boundary, and then the robot is driven to walk along the actual boundary and work synchronously, and the method further comprises:
and synchronously monitoring whether the distance between the current walking position and the nearest point of the planned path is greater than a preset threshold value or not, if so, driving the robot to return to the planned path, and if not, keeping the robot on the actual boundary.
8. A robot walking control system, characterized in that the system comprises:
the acquisition module is used for acquiring a preset planning path for the robot;
the processing module is used for determining the outer boundary as a separation boundary between a working area and a non-working area when the planned path includes at least part of an outer boundary; and, at least when the robot reaches the outer boundary, starting a camera device to shoot an environment image, and driving the robot at least once at the outer boundary to walk along the actual boundary obtained by analyzing the environment image and work synchronously.
9. A robot comprising a memory and a processor, the memory storing a computer program, wherein the processor when executing the computer program performs the steps of the robot walking control method of any one of claims 1-7.
10. A readable storage medium having stored thereon a computer program for implementing the steps of the robot walking control method according to any one of claims 1-7 when the computer program is executed by a processor.
CN202010554973.5A 2020-06-17 2020-06-17 Robot walking control method, system, robot and storage medium Active CN113885485B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010554973.5A CN113885485B (en) 2020-06-17 2020-06-17 Robot walking control method, system, robot and storage medium
PCT/CN2020/123186 WO2021253698A1 (en) 2020-06-17 2020-10-23 Robot walking control method and system, robot, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010554973.5A CN113885485B (en) 2020-06-17 2020-06-17 Robot walking control method, system, robot and storage medium

Publications (2)

Publication Number Publication Date
CN113885485A true CN113885485A (en) 2022-01-04
CN113885485B CN113885485B (en) 2023-12-22

Family ID=79011867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010554973.5A Active CN113885485B (en) 2020-06-17 2020-06-17 Robot walking control method, system, robot and storage medium

Country Status (2)

Country Link
CN (1) CN113885485B (en)
WO (1) WO2021253698A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115136781A (en) * 2022-06-21 2022-10-04 松灵机器人(深圳)有限公司 Mowing method, mowing device, mowing robot and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070276541A1 (en) * 2006-05-26 2007-11-29 Fujitsu Limited Mobile robot, and control method and program for the same
CN103901890A (en) * 2014-04-09 2014-07-02 中国科学院深圳先进技术研究院 Outdoor automatic walking device based on family courtyard and system and method for controlling outdoor automatic walking device based on family courtyard
CN106998984A (en) * 2014-12-16 2017-08-01 伊莱克斯公司 Clean method for robotic cleaning device
WO2017198222A1 (en) * 2016-05-19 2017-11-23 苏州宝时得电动工具有限公司 Automatic work system, self-moving device and control method therefor
US10037029B1 (en) * 2016-08-08 2018-07-31 X Development Llc Roadmap segmentation for robotic device coordination
CN108873880A (en) * 2017-12-11 2018-11-23 北京石头世纪科技有限公司 Intelligent mobile equipment and its paths planning method, computer readable storage medium
CN109984685A (en) * 2019-04-11 2019-07-09 云鲸智能科技(东莞)有限公司 Cleaning control method, device, clean robot and storage medium
US10613541B1 (en) * 2016-02-16 2020-04-07 AI Incorporated Surface coverage optimization method for autonomous mobile machines
WO2020098520A1 (en) * 2018-11-14 2020-05-22 苏州科瓴精密机械科技有限公司 Robot control method and robot system

Also Published As

Publication number Publication date
CN113885485B (en) 2023-12-22
WO2021253698A1 (en) 2021-12-23

Similar Documents

Publication Publication Date Title
US11845189B2 (en) Domestic robotic system and method
US8666554B2 (en) System and method for area coverage using sector decomposition
CN113126613B (en) Intelligent mowing system and autonomous image building method thereof
WO2017198222A1 (en) Automatic work system, self-moving device and control method therefor
CN114937258B (en) Control method for mowing robot, and computer storage medium
CN111198559B (en) Control method and system of walking robot
CN114721385A (en) Virtual boundary establishing method and device, intelligent terminal and computer storage medium
CN113885495A (en) Outdoor automatic work control system, method and equipment based on machine vision
CN115454077A (en) Automatic lawn mower, control method thereof, and computer-readable storage medium
CN113885485A (en) Robot walking control method, system, robot and storage medium
WO2023274339A1 (en) Self-propelled working system
CN114610035A (en) Pile returning method and device and mowing robot
WO2021139683A1 (en) Self-moving device
CN113156929B (en) Self-moving equipment
US20230345864A1 (en) Smart mowing system
WO2021208352A1 (en) Traversal method and system, robot and readable storage medium
CN115291613A (en) Autonomous mobile device, control method thereof, and computer-readable storage medium
EP4235336A1 (en) Method and system for robot automatic charging, robot, and storage medium
WO2024179496A1 (en) Control method, control apparatus, storage medium, and self-moving device
CN117311367B (en) Control method and control unit of self-mobile device and self-mobile device
EP4245473A1 (en) Automatic charging method and system for robot, and robot and storage medium
CN115933681A (en) Working area delineation method based on laser and vision scheme and outdoor robot
CN113552866A (en) Method, system, robot and readable storage medium for improving traversal balance performance
CN117991766A (en) Control method of self-mobile device and self-mobile device
CN116700230A (en) Robot path finding method and device based on grid map, robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20230609

Address after: 215000 No. 8 Ting Rong Street, Suzhou Industrial Park, Jiangsu, China

Applicant after: Suzhou Cleva Precision Machinery & Technology Co.,Ltd.

Applicant after: SKYBEST ELECTRIC APPLIANCE (SUZHOU) Co.,Ltd.

Address before: 215000 Huahong street, Suzhou Industrial Park, Jiangsu 18

Applicant before: Suzhou Cleva Precision Machinery & Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant