CN113805571B - Robot walking control method, system, robot and readable storage medium - Google Patents


Info

Publication number
CN113805571B
CN113805571B (application CN202010471646.3A)
Authority
CN
China
Prior art keywords
robot
target position
rectangular outline
original image
image
Prior art date
Legal status
Active
Application number
CN202010471646.3A
Other languages
Chinese (zh)
Other versions
CN113805571A (en)
Inventor
朱绍明
任雪
Current Assignee
Suzhou Cleva Electric Appliance Co Ltd
Suzhou Cleva Precision Machinery and Technology Co Ltd
Original Assignee
Suzhou Cleva Electric Appliance Co Ltd
Suzhou Cleva Precision Machinery and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Cleva Electric Appliance Co Ltd, Suzhou Cleva Precision Machinery and Technology Co Ltd filed Critical Suzhou Cleva Electric Appliance Co Ltd
Priority to CN202010471646.3A priority Critical patent/CN113805571B/en
Priority to PCT/CN2020/118366 priority patent/WO2021238001A1/en
Publication of CN113805571A publication Critical patent/CN113805571A/en
Application granted granted Critical
Publication of CN113805571B publication Critical patent/CN113805571B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 - Simultaneous control of position or course in three dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a robot walking control method, a robot walking control system, a robot, and a readable storage medium. The method includes: controlling the robot to walk along a predetermined route; judging whether the robot needs to turn at a target position in its walking direction; and, if a turn is needed, detecting whether an obstacle exists at the target position once the robot has walked to within a preset distance of it. The invention can significantly improve the traversal efficiency of the robot.

Description

Robot walking control method, system, robot and readable storage medium
Technical Field
The invention relates to the field of intelligent control, and in particular to a robot walking control method, a robot walking control system, a robot, and a readable storage medium.
Background
A low repetition rate and high coverage are the goals pursued by self-walking robots, such as mobile robots for vacuuming, mowing, pool cleaning, and the like.
Taking an intelligent mowing robot as the example of a mobile robot: in general, the mowing robot takes the lawn enclosed by an electronic boundary as its working area and, while walking and mowing along the edge, is guided in direction by the electronic boundary. Of course, some current mowing robots instead work in an area delimited by virtual boundaries, without laying an electronic boundary line.
The area enclosed inside the boundary forms the robot's working area, i.e., the lawn area; the outside forms the robot's non-working area, i.e., the boundary area. Correspondingly, before walking control, line-riding or off-line edgewise mowing is set according to the conditions around the boundary. In general, when the boundary and the lawn lie on the same level, either line-riding or off-line edgewise mowing may be chosen; when the non-working area is higher than the working area, for example when the boundary is a step, fence, or enclosure, or when the non-working area is lower than the working area, for example when the boundary is a sunken step, off-line edgewise mowing must be chosen.
Referring to FIGS. 1A and 1B, FIG. 1A shows the robot working along the boundary line in line-riding mode, and FIG. 1B shows it working along the boundary line in off-line mode; in both examples the robot walks and works in the order a-b-c-d.
In the prior art, if the robot starts out in line-riding edgewise mode, it cannot autonomously change its working mode during operation; that is, even when there is an obstacle on the boundary line, the robot still chooses to ride the line and work along the edge, and it consequently collides with the obstacle.
To solve this problem, the prior art generally judges whether the robot is on the boundary line from the change in the magnetic field sensed by a coil, and additionally places markers at the turning points and obstacle positions on the boundary line, for example by winding a multi-turn coil there to increase the magnetic field strength; when the robot senses the increased field strength, it turns according to a preset rule. However, this scheme of judging by sensed magnetic field strength is not very accurate.
Disclosure of Invention
To solve the above technical problems, an object of the present invention is to provide a robot walking control method, a system, a robot, and a readable storage medium.
In order to achieve one of the above objects, an embodiment of the present invention provides a robot walking control method, the method including: controlling the robot to walk according to a preset route;
judging whether the robot needs to turn at a target position in its walking direction;
if steering is needed, detecting whether an obstacle exists at the target position when the robot is controlled to walk to a preset distance from the target position.
As a further improvement of an embodiment of the present invention, the method further includes:
If the target position has an obstacle, controlling the robot to turn at the current position so as to avoid the obstacle;
and if no obstacle exists at the target position, controlling the robot to turn when walking to the target position.
As a further improvement of an embodiment of the present invention, the controlling the robot to walk along a predetermined route includes:
the robot is controlled to walk along the boundary determined by the working area and the non-working area.
As a further improvement of an embodiment of the present invention, the judging whether the robot needs to turn at the target position in its walking direction includes:
acquiring an original image;
converting the original image into a binarized image, the binarized image comprising a working area having first pixel values and a non-working area having second pixel values;
obtaining, based on the binarized image, a smallest first rectangular outline enclosing the working area and a smallest second rectangular outline enclosing the non-working area;
and judging whether the robot needs to turn at the target position according to the first rectangular outline and the second rectangular outline.
As a further improvement of an embodiment of the present invention, the detecting whether the target position has an obstacle when the robot is controlled to walk to a preset distance from the target position includes:
Acquiring the distance between the current position of the robot and the target position;
judging whether the distance between the current position of the robot and the target position reaches the preset distance or not;
and if the preset distance is reached, detecting whether an obstacle exists at the target position.
As a further improvement of an embodiment of the present invention, the obtaining the distance between the current position of the robot and the target position includes:
and obtaining the mapping distance of the distance between the current position of the robot and the target position on the original image according to the first rectangular outline and the second rectangular outline, and taking the mapping distance as the distance between the current position of the robot and the target position.
As a further improvement of an embodiment of the present invention, the obtaining, according to the first rectangular outline and the second rectangular outline, a mapping distance of a distance between the current position of the robot and the target position on the original image includes:
acquiring the width of a first rectangular outline and the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image, and the width of a second rectangular outline and the relative ordinate value of the upper left corner of the second rectangular outline on the original image or the binary image;
if the width of the first rectangular outline is equal to the width of the original image or the binarized image, and the width of the second rectangular outline is smaller than the width of the original image or the binarized image, assigning the difference between the maximum ordinate value of the original image or the binarized image and the relative ordinate value of the upper-left corner of the second rectangular outline on the original image or the binarized image to a mapping length;
if the width of the first rectangular outline is smaller than the width of the original image or the binary image, if the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image is not 0, assigning a difference value between the maximum ordinate value of the original image or the binary image and the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image to a mapping length.
As a further improvement of an embodiment of the present invention, the detecting whether the target position has an obstacle includes:
transmitting a detection signal towards the target location by a ranging device;
if the ranging device receives the reply signal, judging that the target position has an obstacle;
and if the ranging device does not receive the reply signal, judging that no obstacle exists at the target position.
In order to achieve one of the above objects, an embodiment of the present invention provides a robot including a memory storing a computer program and a processor implementing the steps of the robot travel control method described above when the processor executes the computer program.
In order to achieve one of the above objects, an embodiment of the present invention provides a readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the robot walking control method as described above.
Compared with the prior art, the robot walking control method, the robot walking control system, the robot and the readable storage medium can remarkably improve the traversing efficiency of the robot.
Drawings
FIGS. 1A and 1B are schematic structural views of specific examples in the background of the invention;
FIG. 2 is a schematic structural view of a specific example of the mowing robot system of the present invention;
FIG. 3 is a schematic flow chart of a robot travel control method according to an embodiment of the present invention;
FIGS. 4, 6 and 9 are flow diagrams of a specific implementation method of one of the steps in FIG. 3;
FIGS. 5A, 5B, 5C, 5D, 5E, 5F, 5G are schematic views of the steps of the present invention in the same specific example;
FIGS. 7, 8 and 11 are schematic views of a specific example of the present invention;
FIG. 10 is a schematic view of the sensing range of the ultrasonic probe of the present invention;
fig. 12 is a schematic block diagram of a robot travel control system according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail below with reference to the embodiments shown in the drawings. These embodiments are not intended to limit the invention; structural, methodological, or functional modifications that a person of ordinary skill in the art makes according to these embodiments are included within the scope of the invention.
The robot system of the invention can be a mowing robot system, a sweeping robot system, a snowplow system, a leaf suction machine system, a golf course ball picking machine system and the like, and each system can automatically walk in a working area and perform corresponding work.
As shown in FIG. 2, the mowing robot system of the present invention includes: a mowing robot, a charging station 20, and a boundary 30.
The mowing robot includes: a body 10 and, mounted on the body 10, a walking unit, an image acquisition unit, a ranging unit, and a control unit. The walking unit includes driving wheels 111, driven wheels 113, and motors driving the driving wheels 111; each motor may be a brushless motor with a reduction gearbox and a Hall sensor. Once started, a motor drives its driving wheel 111 through the reduction gearbox, and by controlling the speed and direction of the two wheels the robot can move straight forward and backward, turn in place, travel along an arc, and so on. The driven wheels 113 may be universal wheels, typically one or two, serving mainly for support and balance.
The image acquisition unit acquires the scene within its viewing angle over a certain range. In the specific embodiment of the invention, the camera 12 is mounted on the upper part of the body 10 at an angle to the horizontal and can photograph the scene within a certain range around the mowing robot; typically, the camera 12 captures the scene ahead of the mowing robot.
The ranging unit monitors whether an obstacle exists within a target range; in the specific embodiment of the invention it is an ultrasonic probe 13. Preferably, it comprises a first ranging subunit for detecting whether an obstacle exists ahead of the robot and a second ranging subunit for detecting whether an obstacle exists to the robot's side; the number and detection directions of the ultrasonic probes can be set as needed and are not detailed further here.
The control unit is the main controller 14 that performs the image processing, for example an MCU or a DSP.
Further, the mowing robot also includes a working mechanism for performing its work and a power supply 15; in this embodiment the working mechanism is a mowing blade head. The robot also carries various sensors for sensing its walking state, for example tilt, lift-off, and collision sensors, a geomagnetic sensor, a gyroscope, and so on, which are not detailed here.
The mowing robot system is also provided with a charging station 20 for automatically recharging the mowing robot. In this embodiment the boundary 30 may be virtual, the mowing robot following the edge by means of its vision sensor; in other embodiments the boundary 30 may be a physical boundary line, with edgewise operation relying on the vision sensor combined with the boundary line. The working area A lies inside the boundary 30 and the non-working area B outside it; several steering points are also formed on the boundary line. The robot's walking and working rule is edgewise walking: while walking, if it reaches a steering point it must execute a turning operation, and if it strays into the interior of working area A or into non-working area B it must execute a rotation operation. In this example, a turn means the left and right wheels advance at different speeds along an arc; a rotation includes turning in place. Both aim to return the robot to the boundary line so it can continue to walk and work along the edge.
In addition, obstacles 50 that the mowing robot must be kept out of, such as a pool or a flower bed, are usually present in the working area; these will not be described further here.
Referring to FIG. 3, in the robot walking control method of an embodiment of the present invention, when the robot is in edgewise mode it mainly works in line-riding edgewise mode, and when an obstacle is encountered it switches to off-line edgewise mode to avoid the obstacle.
The method comprises the following steps:
s1: controlling the robot to walk according to a preset route;
s2: judging whether the target position of the robot in the walking direction needs to be turned or not;
s3: if steering is needed, detecting whether an obstacle exists at the target position when the robot is controlled to walk to a preset distance from the target position.
Preferably, after step S3 the method further includes: if an obstacle exists at the target position, controlling the robot to turn at its current position so as to avoid the obstacle;
and if no obstacle exists at the target position, controlling the robot to turn when walking to the target position.
For step S1, in a specific embodiment of the present invention, the robot is controlled to walk along the boundary determined by the working area and the non-working area.
For step S2, referring to FIG. 4, judging whether the robot needs to turn at the target position in its walking direction includes: M1, acquiring an original image; M2, converting the original image into a binarized image comprising a working area with first pixel values and a non-working area with second pixel values; M3, obtaining, based on the binarized image, a smallest first rectangular outline enclosing the working area and a smallest second rectangular outline enclosing the non-working area; M4, judging whether the robot needs to turn at the target position according to the first rectangular outline and the second rectangular outline.
For step M1: after the robot starts, its working mode is set to line-riding edgewise work. While riding the line along the edge, the camera mounted on the robot captures the scene ahead of the robot, i.e., a ground image in the advancing direction, which is referred to as the original image.
Referring to FIG. 5A, the original image is in fact a color image in RGB format. Note that the original images captured by the camera at different moments are all rectangular and of the same size. For convenience of calculation, a rectangular coordinate system is established with the upper-left corner of the original image as the origin (0, 0), the upper boundary extending rightward as the positive X axis, and the left boundary extending downward as the positive Y axis; thus, once the original image is determined, the coordinates of every element on it are determined. Of course, this choice is only for convenience of description; in practical applications, the origin and the X and Y axes can be chosen as needed. Note also that the coordinate system may equally be established on the binarized image and mapped to the original image; it is not limited to the original image.
For step M2, there are various ways in which the invention may convert the RGB original image into a binarized image. In a binarized image, each pixel takes one of only two gray values, 0 or 255; the binarization process thus renders the entire original image in stark black and white. In a specific embodiment of the invention, the working area and the non-working area are rendered as white and black relative to each other.
In a specific example of the present invention, as shown in FIGS. 5B and 5C, FIG. 5B is the HSV image converted from FIG. 5A (also a color image), and FIG. 5C is the binarized image converted from FIG. 5B. Correspondingly, step M2 specifically includes: M21, converting the original image in the RGB color space into an HSV image in the HSV color space, so that the working area and the non-working area can be distinguished by more pronounced color characteristics; M22, converting the HSV image into the binarized image by threshold matching. In this specific example, a threshold range is preset for each of the H, S, and V channels: [valueHLow, valueHHigh], [valueSLow, valueSHigh], and [valueVLow, valueVHigh]. If the H, S, and V values of a pixel in the HSV image all fall within the corresponding preset ranges, its gray value is set to one of 0 and 255; pixels whose H, S, or V value falls outside the preset ranges are set to the other of 0 and 255.
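Step M22 (threshold matching on the H, S, and V channels) can be sketched in a few lines of numpy. The bounds passed in stand for the preset ranges [valueHLow, valueHHigh], [valueSLow, valueSHigh], [valueVLow, valueVHigh]; in practice the equivalent OpenCV call is cv2.inRange, and the M21 conversion would be cv2.cvtColor with COLOR_BGR2HSV. Here the HSV conversion is assumed to have been done already.

```python
import numpy as np

def binarize_hsv(hsv, low, high):
    # A pixel whose H, S and V values all fall inside the preset ranges
    # is set to 255; any pixel with a channel outside the ranges gets 0.
    hsv = np.asarray(hsv)
    inside = np.all((hsv >= np.asarray(low)) & (hsv <= np.asarray(high)), axis=-1)
    return np.where(inside, 255, 0).astype(np.uint8)
```

Swapping which area comes out white, as the following paragraph requires, only needs the threshold ranges (or the output mask) to be inverted.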
In a preferred embodiment of the invention, if the working area is currently to be analyzed, its gray value is set to 0 and that of the non-working area to 255; if the non-working area is currently to be analyzed, the working area is set to 255 and the non-working area to 0. The main controller then only has to compute and detect a single color (white or black), which improves analysis efficiency. Note that this swap of gray values is achieved simply by inverting the preset threshold ranges, which will not be described further here.
In a preferred embodiment of the invention, to improve calculation accuracy, as shown in FIGS. 5D and 5E, FIG. 5D is the image formed by smoothing FIG. 5C, and FIG. 5E is the image formed after the dilation-erosion processing of FIG. 5D. Accordingly, after step M22 the method further includes: performing smoothing filtering and/or dilation-erosion processing on the binarized image to remove its noise. Smoothing filters include, for example, median filtering, mean filtering, and Gaussian filtering; dilation-erosion is a morphological operation.
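The smoothing plus dilation-erosion step can be sketched in numpy as a 3x3 median filter followed by a morphological open (erode, then dilate). This is a minimal illustration only; a real implementation would more likely call cv2.medianBlur and cv2.morphologyEx with cv2.MORPH_OPEN.

```python
import numpy as np

def _neigh(img):
    # 3x3 neighbourhood stack (edge-padded) for every pixel.
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])

def denoise_binary(mask):
    img = np.asarray(mask, dtype=np.uint8)
    med = np.median(_neigh(img), axis=0).astype(np.uint8)  # smoothing filter
    eroded = _neigh(med).min(axis=0)                       # erosion
    return _neigh(eroded).max(axis=0)                      # dilation
```

Isolated speckles vanish under the median and the erosion, while large connected regions such as the working area survive the open almost unchanged.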
For step M3, referring to FIG. 5E, the rectangle selected there, obtained by analysis, is the first rectangular outline; referring to FIG. 5F, the rectangle selected there, obtained by analysis, is the second rectangular outline.
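The smallest enclosing rectangles of step M3 can be sketched in numpy as the tight axis-aligned bounding box of all pixels with a given gray value; a real implementation would more likely use cv2.findContours followed by cv2.boundingRect, which returns the same (x, y, w, h) convention.

```python
import numpy as np

def min_bounding_rect(mask, value=255):
    # (x, y) is the upper-left corner, w and h the width and height of the
    # smallest upright rectangle enclosing every pixel equal to `value`.
    ys, xs = np.nonzero(np.asarray(mask) == value)
    if xs.size == 0:
        return None          # no such region in this frame
    x, y = int(xs.min()), int(ys.min())
    return (x, y, int(xs.max()) - x + 1, int(ys.max()) - y + 1)
```

Applied once to the working-area value and once to the non-working-area value, this yields the first and second rectangular outlines, whose widths and upper-left ordinates feed the distance-mapping steps that follow.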
For step S3, when the robot is controlled to walk to a preset distance from the target position, detecting whether there is an obstacle at the target position includes: acquiring the distance between the current position of the robot and the target position; judging whether the distance between the current position of the robot and the target position reaches the preset distance or not; and if the preset distance is reached, detecting whether an obstacle exists at the target position.
If the preset distance has not been reached, the robot is controlled to continue walking, and the following steps are executed in a loop until the robot reaches a position at the preset distance from the target position: acquire the distance between the robot's current position and the target position; judge whether that distance has reached the preset distance; if not, control the robot to continue walking.
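Once the preset distance is reached, the reply-signal test performed by the ranging device reduces to a simple predicate. In this sketch, FakeRanger and its send_pulse/wait_echo methods are hypothetical stand-ins for the ultrasonic probe's driver interface.

```python
class FakeRanger:
    # Hypothetical stand-in for the ultrasonic probe: `echo` is the
    # simulated round-trip time, or None when nothing reflects the pulse.
    def __init__(self, echo):
        self._echo = echo
    def send_pulse(self):
        pass
    def wait_echo(self, timeout_s):
        return self._echo

def obstacle_at_target(ranger, timeout_s=0.05):
    # Transmit a detection signal toward the target position; receiving
    # a reply within the timeout is judged as "obstacle present".
    ranger.send_pulse()
    return ranger.wait_echo(timeout_s) is not None
```

The timeout caps the distance at which an echo still counts as an obstacle at the target position, so it would be chosen from the probe's range and the preset distance.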
Preferably, for step S3, note that in this embodiment the coordinate system of the original image coincides with that of the binarized image; that is, any point of the working area has the same mapped coordinates in both. Consequently, the mapped distance between the robot's current position and the target position is the same on the original image as on the binarized image.
Preferably, the step S3 of obtaining the distance between the current position of the robot and the target position includes: and obtaining the mapping distance of the distance between the current position of the robot and the target position on the original image according to the first rectangular outline and the second rectangular outline, and taking the mapping distance as the distance between the current position of the robot and the target position.
In a preferred embodiment of the present invention, as shown in FIG. 6, for step S3, obtaining the mapped distance on the original image between the robot's current position and the target position according to the first rectangular outline and the second rectangular outline includes:
p1, acquiring a width W [ g ] of a first rectangular outline, a relative longitudinal coordinate value Y [ g ] of the left upper corner of the first rectangular outline on the original image or the binarized image, a width W [ b ] of a second rectangular outline, and a relative longitudinal coordinate value Y [ b ] of the left upper corner of the second rectangular outline on the original image or the binarized image;
if the width W [ g ] of the first rectangular outline=the width W of the original image or the binarized image, executing step P2, and if W [ g ] < W, executing step P3;
P2, if W < b > W, assigning the difference value between the maximum longitudinal coordinate value Y [ max ] and Y [ b ] of the original image or the binarized image to the mapping length;
and P3, if Y [ g ] is not equal to 0, assigning the difference value between the maximum longitudinal coordinate value Y [ max ] and Y [ g ] of the original image or the binarized image to the mapping length.
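Steps P1-P3 translate directly into a small function. Here W and Ymax are the (fixed) image width and maximum ordinate, and the rectangle parameters follow the notation above; returning None for the uncovered branches is an assumption, since the text only defines the mapping length for the two cases shown.

```python
def mapping_length(W, Ymax, Wg, Yg, Wb, Yb):
    # W, Ymax: image width and maximum ordinate; (Wg, Yg) and (Wb, Yb):
    # width and upper-left ordinate of the first (working-area) and
    # second (non-working-area) rectangular outlines.
    if Wg == W:
        # P2: the working area spans the full image width, so the second
        # rectangle's top edge marks the target position ahead.
        if Wb < W:
            return Ymax - Yb
    elif Yg != 0:
        # P3: the working area is narrower than the image and offset
        # from the top edge.
        return Ymax - Yg
    return None   # no mapping length defined for this frame
```

The mapping length shrinks as the robot approaches the target, since the relevant upper-left ordinate grows toward Ymax.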
In this implementation, while executing line-riding edgewise work the mowing robot can continually confirm whether it is still pressing the line, which prevents it from drifting out of line-riding edgewise mode under the influence of the working environment and thus improves its working efficiency. The specific method is: obtain the diagonal length maxLine[g] of the first rectangular outline and judge whether maxLine[g] is smaller than a preset second value. If so, drive the robot to change direction and re-plan its search for the edgewise working path until maxLine[g] is no longer smaller than the preset second value; if not, confirm that the robot is executing line-riding edgewise work.
In this preferred embodiment, the preset second value is a system-preset length whose size can be adjusted as needed. When maxLine[g] is smaller than the preset second value, the robot is in the non-working area, so its position must be adjusted to return it to the edgewise working path; accordingly, the robot can be driven to rotate, find the edgewise working path again, and return to it.
In addition, in this preferred embodiment, the camera on the mowing robot photographs the environment ahead of it as it advances, so the acquired original image is a rectangular area. Further, at least two adjacent sides of the first rectangular outline enclosing the working area coincide with borders of the original image, and the edgewise working path means the robot walks and works along the boundary line.
In an embodiment of the invention, as shown in FIG. 5E, the second value value2 is preset to 22.3, and maxLine[g] is computed to be 156.2; since maxLine[g] > value2, it is determined that the robot is on the edgewise working path.
Of course, the step of confirming whether the robot is executing line-riding edgewise work may instead include: obtaining, from the binarized image, the area of the working area, denoted maxArea[g]; judging whether maxArea[g] is smaller than a preset third value; if so, determining that the robot is executing line-riding edgewise work; if not, driving the robot to change direction and re-plan its search for the edgewise working path until maxArea[g] is smaller than the preset third value.
In this preferred embodiment, the preset third value is a system-preset area whose size can be adjusted as needed. When maxArea[g] is not smaller than the preset third value, the robot is inside the working area and far from the boundary line, so its position must be adjusted to return it to the edgewise working path; accordingly, the robot can be driven to rotate, find the edgewise working path again, and return to it.
In addition, when the working area is determined, its area can be calculated in various ways, for example from coordinates or by counting pixels; these are not detailed here.
In the embodiment of the present invention, as shown in fig. 5E, the third value (value3) is preset to 19150, and maxArea [ g ] is calculated to be 10655; since maxArea [ g ] < value3, it is determined that the robot is on the edgewise working path.
It should be noted that the two embodiments for confirming whether the robot performs the line-riding edgewise work may be used alternatively, simultaneously, or sequentially; when both are used, whether simultaneously or sequentially, their results are consistent.
In a preferred embodiment, the width W of the original image is a fixed value and the first rectangular outline is a portion of the original image, so when W [ g ] is compared with W, the case W [ g ] > W cannot occur. Likewise, the maximum ordinate Y [ max ] of the original image or the binarized image is a fixed value, which can be set according to the size of the original image or the like.
In a preferred embodiment of the present invention, for step P2: when the width of the first rectangular outline equals the width of the original image and the width of the second rectangular outline is smaller than the width of the original image, the mowing robot has detected the target position. In the coordinate system established on the original image or the binarized image, the mapping length corresponds to the difference between the maximum ordinate Y [ max ] of the original image or the binarized image and Y [ b ], so this difference is assigned to the mapping length. When the robot rides the line and works along the edge, the closer the robot is to the target position, the larger the relative ordinate Y [ b ] of the upper-left corner of the second rectangular outline on the original image or the binarized image, and the smaller the mapping length.
In the preferred embodiment of the present invention, for step P3: when the width of the first rectangular outline differs from the width of the original image, that is, the width of the first rectangular outline is smaller than the width of the original image, and the relative ordinate Y [ g ] of the upper-left corner of the first rectangular outline on the original image or the binarized image is not 0, the mowing robot has detected the target position. In the coordinate system established on the original image or the binarized image, the mapping length corresponds to the difference between the maximum ordinate Y [ max ] and Y [ g ], so this difference is assigned to the mapping length. When the robot rides the line and works along the edge, the closer the robot is to the target position, the larger the relative ordinate Y [ g ] of the upper-left corner of the first rectangular outline on the original image or the binarized image, and the smaller the mapping length.
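Steps P2 and P3 can be summarized in one hedged sketch. The parameter names mirror the text (W [ g ], W [ b ], Y [ g ], Y [ b ], W, Y [ max ]); returning `None` stands in for the "no target detected" cases that the text handles separately.

```python
def mapping_length(w_g, w_b, y_g, y_b, width, y_max):
    """Mapping length of the robot-to-target distance on the image.

    Sketch of steps P2/P3 as described in the text; `None` means
    neither case applies (no target position detected).
    """
    if w_g == width and w_b < width:   # P2: first outline spans the full width
        return y_max - y_b
    if w_g < width and y_g != 0:       # P3: first outline narrower than the image
        return y_max - y_g
    return None
```

With the worked example's presets (W = 160, Y [ max ] = 120), a reading of Y [ b ] = 71 under P2 yields the preset first value 49.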
In a preferred embodiment of the present invention, when the robot rides the line along the edge, the distance between the target position and the robot at the current position is obtained through the mapping length. When the mapping length equals the preset first value, the robot is relatively close to the target position, but from the current position the vision sensor cannot yet clearly distinguish whether the target ahead is a steering point or an obstacle. If the target position is an obstacle and the robot misjudges it as a steering point, the robot will collide with the obstacle when it continues riding the line and performs the turning operation, which affects its normal operation and creates a safety hazard.
In a preferred embodiment of the present invention, the steering points include a first steering point and a second steering point: the robot is driven to execute one of the clockwise and counterclockwise turning logics at the first steering point, and the other at the second steering point. Correspondingly, if the ranging device is started to measure and the target position is then confirmed to be a steering point, the robot works in the first edgewise working mode. "The robot works in the first edgewise working mode" includes: if W [ g ] = W and W [ b ] < W, the robot continues to move in the current direction until Y [ b ] equals a preset first ordinate value; when Y [ b ] equals the preset first ordinate value, the robot is at the first steering point and executes one of the clockwise and counterclockwise turning logics. If W [ g ] < W, the robot continues to move in the current direction until Y [ g ] equals a preset second ordinate value; when Y [ g ] equals the preset second ordinate value, the robot is at the second steering point and executes the other of the clockwise and counterclockwise turning logics.
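The first edgewise working mode can be sketched as a small decision function. The default ordinate values (80 and 75) and the assignment of clockwise to the first steering point and counterclockwise to the second are taken from the worked example around figs. 7 and 8; both are assumptions of this sketch, not fixed by the method.

```python
def turning_logic(w_g, w_b, y_g, y_b, width, first_y=80, second_y=75):
    """Sketch of the turn decision once the target is confirmed a
    steering point (defaults follow the worked example)."""
    if w_g == width and w_b < width:
        # approaching the first steering point
        return "clockwise" if y_b >= first_y else "continue"
    if w_g < width:
        # approaching the second steering point
        return "counterclockwise" if y_g >= second_y else "continue"
    return "continue"
```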
Specifically, in some embodiments, if the preset first value of the mapping length is set larger, the robot is still far from the steering point when the ranging device is started to measure and the target position is confirmed to be a steering point. To avoid turning ahead of time and thereby affecting normal operation, the robot continues to move in the current direction, gradually approaching the steering point. By presetting the first and second ordinate values, the robot steers only when it is close enough to the steering point, rather than performing the steering operation in advance. Of course, in other embodiments, if the preset first value of the mapping length is set smaller, the robot may perform the steering operation immediately once the ranging device is started to measure and the target position is confirmed to be a steering point.
In a specific example of the present invention, the presets are: value2 = 22.3, value3 = 19150, W = 160, the maximum ordinate Y [ max ] of the original image or the binarized image = 120, the first value = 49, the first ordinate value = 80, and the second ordinate value = 75.
Referring to fig. 7, the binarized images on the left and right sides of fig. 7 are identical images; they differ in the rectangular frame selections drawn on them, the left selection being the first rectangular outline and the right selection being the second rectangular outline. In addition, in the left binarized image the gray value of the working area is set to 0 and that of the non-working area to 255; in the right binarized image the gray value of the working area is set to 255 and that of the non-working area to 0. Analyzing the original image yields the binarized image, and from the first and second rectangular outlines maxLine [ g ] = 200 and maxArea [ g ] = 14382 are obtained; since maxLine [ g ] > value2 and maxArea [ g ] < value3, the robot is determined to be on the edgewise working path at this moment. Since W [ g ] = W and W [ b ] < W, step P2 is performed, and the difference between the maximum ordinate Y [ max ] of the original image or the binarized image and Y [ b ] is assigned to the mapping length. Since in this embodiment the preset first value = 49, the mapping length becomes equal to the preset first value; at that moment the robot detects that the target position contains no obstacle and continues working in the current direction, and when Y [ b ] = 80 the robot is at the first steering point and executes the clockwise turning logic.
Referring to fig. 8, the binarized images on the left and right sides of fig. 8 are identical images; they differ in that the left selection is the first rectangular outline and the right selection is the second rectangular outline. In addition, in the left binarized image the gray value of the working area is set to 0 and that of the non-working area to 255; in the right binarized image the gray value of the working area is set to 255 and that of the non-working area to 0. Analyzing the original image yields the binarized image, and from the first and second rectangular outlines maxLine [ g ] = 1024 and maxArea [ g ] = 4407 are obtained; since maxLine [ g ] > value2 and maxArea [ g ] < value3, the robot is determined to be on the edgewise working path at this moment. Since Y [ g ] = 56 at this time, step P3 is performed, and the difference between the maximum ordinate Y [ max ] of the original image or the binarized image and Y [ g ] is assigned to the mapping length. Since in this embodiment the preset first value = 49, the robot still needs to continue moving forward until the mapping length equals the preset first value, i.e., until Y [ g ] = 71; at that moment the robot detects that the target position contains no obstacle and continues working in the current direction, and when Y [ g ] = 75 the robot is at the second steering point and executes the counterclockwise turning logic.
Further, it should be noted that, in step S2, if the robot detects that the target position in the traveling direction does not require steering, the robot continues to travel along the preset route. Specifically, in the preferred embodiment of the present invention, there are the following two cases. First, if the width of the first rectangular outline equals the width of the original image and the width of the non-working area also equals the width of the original image, that is, W [ b ] = W, a possible situation is that the robot has deviated from its original working direction due to the influence of the working environment, and its position needs to be readjusted onto the edgewise working path. Specifically, in this embodiment, the mapping length is assigned 0 or another value unequal to the preset first value. When the mapping length is unequal to the preset first value, in a preferred embodiment of the present invention, the following steps are repeated until the mapping length equals the preset first value or until the robot stops the edgewise work: the robot performs the line-riding edgewise work in the current direction and acquires an original image in front of it in real time; the original image is analyzed to obtain the mapping length, on the original image, of the distance between the current position of the robot and the target position, the target position being a steering point and/or an obstacle on the boundary; and whether the mapping length equals the preset first value is detected again. Thus, the robot continues the edgewise operation according to the set edgewise working plan and tries to return to the original working path.
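The repeated steps above amount to a loop that advances, re-images, and re-checks the mapping length. A minimal sketch, with the image-capture/analysis and drive steps abstracted into hypothetical callables (the step budget stands in for "until the robot stops the edgewise work"):

```python
def ride_line_until(first_value, get_mapping_length, step_forward, max_steps=10000):
    """Repeat the line-riding edgewise step until the mapping length
    equals the preset first value, or the step budget runs out."""
    for _ in range(max_steps):
        if get_mapping_length() == first_value:
            return True
        step_forward()  # one edgewise step + fresh image acquisition/analysis
    return False
```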
Second, in step P3, if Y [ g ] equals 0, the robot is in the normal edgewise working process at this time; the mapping length is then still assigned 0 or another value unequal to the preset first value, so that the robot performs the line-riding edgewise work in the current direction until the mapping length equals the preset first value or until the robot stops the edgewise work.
In a preferred embodiment of the present invention, as shown in fig. 9, for step S3, the "detecting whether the target position has an obstacle" includes: transmitting a detection signal toward the target position by a ranging device; if the ranging device receives a reply signal, determining that an obstacle exists at the target position; and if the ranging device does not receive a reply signal, determining that no obstacle exists at the target position.
The ranging device is, for example, an ultrasonic detection device, an infrared detection device, or the like.
Referring to fig. 10, the ranging device is an ultrasonic probe, and the range within which it can receive a reply signal is the sector area shown in fig. 9. In this example, θi (i = 1, 2, 3) is the angular range detected by the ultrasonic probe; Si (i = 1, 2, 3) is the distance between the machine and the obstacle determined from the echo signal received by the ultrasonic probe; diMax (i = 1, 2, 3) is the maximum threshold of the safe distance between the ultrasonic probe and the obstacle; diMin (i = 1, 2, 3) is the minimum threshold of that safe distance; and diMax > diMin. When the ultrasonic probe receives a reply signal, the target position detected from the robot's current position is an obstacle; that is, when the detected distance value Si lies between the safety distance thresholds diMin and diMax, the machine is determined to have detected the obstacle without striking it; when the distance Si is greater than the safety distance threshold diMax, the robot is deemed not to have received a reply signal, that is, no obstacle is detected.
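The threshold test just described can be sketched as follows. The three-way split is an assumption of this sketch: the text only defines the diMin–diMax band and the above-diMax case, so the treatment of a below-diMin reading (labeled "too_close" here) is illustrative.

```python
def classify_echo(s_i, d_min, d_max):
    """Interpret one ultrasonic distance reading Si against the safe
    distance thresholds diMin and diMax (with diMax > diMin)."""
    if s_i is None or s_i > d_max:
        return "no_obstacle"       # treated as no reply signal received
    if s_i >= d_min:
        return "obstacle"          # detected the obstacle without striking it
    return "too_close"             # below the minimum safe distance (assumption)
```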
In a preferred embodiment of the present invention, the distance measuring device includes: a first ranging subunit detecting whether an obstacle exists in front of the robot and a second ranging subunit detecting whether an obstacle exists on the side of the robot;
the robot performing obstacle avoidance operation includes:
controlling the robot to turn toward the working area; when the first ranging subunit no longer receives a reply signal while the second ranging subunit does, driving the robot to perform the off-line edgewise work; and, while the robot performs the off-line edgewise work, if the time during which neither the first nor the second ranging subunit receives a reply signal exceeds a preset fourth value, driving the robot to return to the boundary line and resume the line-riding edgewise work.
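The avoidance sequence can be read as a small decision taken each control tick. This is a hedged sketch under two assumptions not fixed by the text: each subunit reports a boolean "reply received", and `silent_time` tracks how long neither subunit has replied.

```python
def avoidance_step(front_reply, side_reply, silent_time, fourth_value):
    """One tick of the obstacle-avoidance logic for the two ranging
    subunits (front and side); sketch only."""
    if front_reply:
        return "turn_toward_working_area"   # obstacle still ahead: keep turning
    if side_reply:
        return "offline_edgewise"           # obstacle beside: skirt it off-line
    if silent_time > fourth_value:
        return "return_to_boundary"         # clear long enough: resume line-riding
    return "offline_edgewise"               # keep skirting for now
```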
With reference to fig. 11, the robot performs the line-riding edgewise work while turning from working path a to working path b; in the process, a target position on the boundary line is detected. Further, through the judgment of the method described above, the target position can be predicted to be an obstacle, at which point the robot switches to the off-line edgewise working mode, i.e., from working path b to working path c. When the time during which neither the first nor the second ranging subunit receives a reply signal exceeds the preset fourth value, the robot is driven back to the working path to resume the line-riding edgewise work, i.e., the robot is driven from working path c to working path d and from working path d to working path f.
In a preferred embodiment of the present invention, before step S1, the method further includes: after the robot is initialized, dividing the binarized image obtained from the first analysis into a left image and a right image, counting the number of pixel points in each, and judging whether the number of pixel points in the left image is larger than that in the right image; if so, driving the robot to work in a counterclockwise sequence, and if not, driving the robot to work in a clockwise sequence.
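The initialization choice can be sketched as below. Which pixels are counted is an illustrative reading of "the number of pixel points": here, working-area pixels at an assumed gray value of 255.

```python
def initial_direction(binary, working_value=255):
    """Choose the working sequence from the first binarized image by
    comparing working-area pixel counts in its left and right halves."""
    mid = len(binary[0]) // 2
    left = sum(row[:mid].count(working_value) for row in binary)
    right = sum(row[mid:].count(working_value) for row in binary)
    return "counterclockwise" if left > right else "clockwise"
```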
In addition, when it is determined that the robot is on the edgewise working path and walks along it, the specific edgewise working path may be detected from the binarized image by an edge detection method. Referring to fig. 5G, the edge detection method adopted in a specific example of the present invention is the Canny algorithm. The detection principle is as follows: count the offset x between the edge line and the center line, and adjust the robot's walking path according to x. The center line is the vertical straight line at 1/2 of the width of the binarized image; the edge line is the intersection line of the working area and the non-working area. In the statistical process, the method is not limited to taking the mean of the edge line in the x-axis direction; the linear equation of the edge line may also be solved using Hough line detection, and the robot's walking direction and path adjusted through the included angle between the edge line and the center line, which is not described further here.
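The mean-offset variant of this adjustment can be sketched without the Canny step itself: given the x-coordinates of detected edge pixels, the offset is their mean minus the center line at width/2. The sign convention (positive when the edge lies right of center) is an assumption of the sketch.

```python
def edge_offset(edge_xs, image_width):
    """Offset x between the detected edge line and the center line:
    positive when the edge lies right of center, negative when left."""
    center = image_width / 2.0
    return sum(edge_xs) / float(len(edge_xs)) - center
```

The steering controller would then nudge the robot toward zero offset each frame.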
In an embodiment of the present invention, a robot is further provided, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the robot walking control method when executing the computer program.
In an embodiment of the present invention, there is also provided a readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the robot walking control method described above.
Referring to fig. 12, there is provided a robot walking control system, the system including: a driving module 100, and a parsing module 200.
The driving module 100 is used for controlling the robot to walk according to a preset route;
the analysis module 200 is used for judging whether the target position of the robot in the walking direction needs to be turned; if steering is needed, detecting whether an obstacle exists at the target position when the robot is controlled to walk to a preset distance from the target position.
Further, the parsing module 200 is further configured to: if the target position has an obstacle, controlling the robot to turn at the current position so as to avoid the obstacle; and if no obstacle exists at the target position, controlling the robot to turn when walking to the target position.
In addition, it should be noted that, in the preferred embodiment of the present invention, the driving module 100 in the robot travel control system is used to implement step S1; the parsing module 200 is used for implementing steps S2 and S3; it will be clear to those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding process in the foregoing method embodiment for the specific working process of the above-described system, which is not repeated here.
In summary, according to the robot walking control method, system, robot and readable storage medium of the present invention, by combining the image analysis and the ranging device, whether there is an obstacle at the target position on the boundary line is accurately distinguished; and then driving the robot to execute different working modes according to the analysis result.
In the several embodiments provided in this application, it should be understood that the disclosed modules, systems, and methods may be implemented in other manners. The system embodiments described above are merely illustrative, and the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed.
The modules described as separate components may or may not be physically separate, and components displayed as modules may or may not be physical modules, that is, may be located in one place, or may be distributed on a plurality of network modules, and some or all of the modules may be selected according to actual needs to achieve the purpose of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or 2 or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in hardware plus software functional modules.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (7)

1. A robot travel control method, the method comprising:
controlling the robot to walk according to a preset route;
judging whether the target position of the robot in the walking direction needs to be turned or not;
if steering is needed, detecting whether an obstacle exists at the target position when the robot is controlled to walk to a preset distance from the target position;
wherein, when the robot is controlled to walk to a preset distance from the target position, detecting whether the target position has an obstacle comprises:
acquiring the distance between the current position of the robot and the target position;
judging whether the distance between the current position of the robot and the target position reaches the preset distance or not;
if the preset distance is reached, detecting whether an obstacle exists at the target position;
the obtaining the distance between the current position of the robot and the target position includes:
acquiring an original image;
converting the original image into a binarized image, the binarized image comprising a working area having first pixel values and a non-working area having second pixel values;
based on the binarized image, a first rectangular outline which encloses the working area and is the smallest and a second rectangular outline which encloses the non-working area and is the smallest are obtained;
Obtaining a mapping distance of the distance between the current position of the robot and the target position on the original image according to the first rectangular outline and the second rectangular outline, and taking the mapping distance as the distance between the current position of the robot and the target position;
the obtaining, according to the first rectangular outline and the second rectangular outline, a mapping distance between a current position of the robot and the target position on the original image includes:
acquiring the width of a first rectangular outline and the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image, and the width of a second rectangular outline and the relative ordinate value of the upper left corner of the second rectangular outline on the original image or the binary image;
if the width of the first rectangular outline is equal to the width of the original image or the binary image, assigning a difference value between the maximum longitudinal coordinate value of the original image or the binary image and the relative longitudinal coordinate value of the upper left corner of the second rectangular outline on the original image or the binary image to a mapping length when the width of the second rectangular outline is smaller than the width of the original image or the binary image;
If the width of the first rectangular outline is smaller than the width of the original image or the binary image, if the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image is not 0, assigning a difference value between the maximum ordinate value of the original image or the binary image and the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image to a mapping length.
2. The robot walking control method of claim 1, characterized in that the method further comprises:
if the target position has an obstacle, controlling the robot to turn at the current position so as to avoid the obstacle;
and if no obstacle exists at the target position, controlling the robot to turn when walking to the target position.
3. The robot walking control method of claim 1, wherein said determining whether the target position of the robot in the walking direction requires steering comprises:
and judging whether the robot needs to turn at the target position according to the first rectangular outline and the second rectangular outline.
4. A robot walking control method of any of claims 1-3, wherein said detecting whether an obstacle is present at said target location comprises:
transmitting a detection signal towards the target location by a ranging device;
if the ranging device receives the reply signal, judging that the target position has an obstacle;
and if the ranging device does not receive the reply signal, judging that no obstacle exists at the target position.
5. A robot travel control system, the system comprising:
the driving module is used for controlling the robot to walk according to a preset route;
the analysis module is used for judging whether the target position of the robot in the walking direction needs to be turned, and if so, detecting whether an obstacle exists in the target position when the robot is controlled to walk to a preset distance from the target position;
wherein, when the robot is controlled to walk to a preset distance from the target position, detecting whether the target position has an obstacle comprises:
acquiring the distance between the current position of the robot and the target position;
Judging whether the distance between the current position of the robot and the target position reaches the preset distance or not;
if the preset distance is reached, detecting whether an obstacle exists at the target position;
the obtaining the distance between the current position of the robot and the target position includes:
acquiring an original image;
converting the original image into a binarized image, the binarized image comprising a working area having first pixel values and a non-working area having second pixel values;
based on the binarized image, a first rectangular outline which encloses the working area and is the smallest and a second rectangular outline which encloses the non-working area and is the smallest are obtained;
obtaining a mapping distance of the distance between the current position of the robot and the target position on the original image according to the first rectangular outline and the second rectangular outline, and taking the mapping distance as the distance between the current position of the robot and the target position;
the obtaining, according to the first rectangular outline and the second rectangular outline, a mapping distance between a current position of the robot and the target position on the original image includes:
Acquiring the width of a first rectangular outline and the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image, and the width of a second rectangular outline and the relative ordinate value of the upper left corner of the second rectangular outline on the original image or the binary image;
if the width of the first rectangular outline is equal to the width of the original image or the binary image, assigning a difference value between the maximum longitudinal coordinate value of the original image or the binary image and the relative longitudinal coordinate value of the upper left corner of the second rectangular outline on the original image or the binary image to a mapping length when the width of the second rectangular outline is smaller than the width of the original image or the binary image;
if the width of the first rectangular outline is smaller than the width of the original image or the binary image, if the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image is not 0, assigning a difference value between the maximum ordinate value of the original image or the binary image and the relative ordinate value of the upper left corner of the first rectangular outline on the original image or the binary image to a mapping length.
6. A robot comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the robot walking control method of any of claims 1-4.
7. A readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the robot walking control method of any of claims 1-4.
CN202010471646.3A 2020-05-29 2020-05-29 Robot walking control method, system, robot and readable storage medium Active CN113805571B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010471646.3A CN113805571B (en) 2020-05-29 2020-05-29 Robot walking control method, system, robot and readable storage medium
PCT/CN2020/118366 WO2021238001A1 (en) 2020-05-29 2020-09-28 Robot travelling control method and system, robot, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010471646.3A CN113805571B (en) 2020-05-29 2020-05-29 Robot walking control method, system, robot and readable storage medium

Publications (2)

Publication Number Publication Date
CN113805571A CN113805571A (en) 2021-12-17
CN113805571B true CN113805571B (en) 2024-03-12

Family

ID=78745532

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010471646.3A Active CN113805571B (en) 2020-05-29 2020-05-29 Robot walking control method, system, robot and readable storage medium

Country Status (2)

Country Link
CN (1) CN113805571B (en)
WO (1) WO2021238001A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116257068A (en) * 2022-08-05 2023-06-13 智橙动力(北京)科技有限公司 Pool wall obstacle avoidance moving method and device of swimming pool cleaning robot and electronic equipment
CN118012031A (en) * 2022-10-28 2024-05-10 苏州科瓴精密机械科技有限公司 Control method and device of robot, robot and storage medium
CN116719326A (en) * 2023-07-24 2023-09-08 国广顺能(上海)能源科技有限公司 Robot obstacle avoidance method, system and storage medium
CN117733819B (en) * 2024-02-21 2024-05-14 太原工业学院 Operation method and device of intelligent inspection robot for power plant

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003323621A (en) * 2002-05-08 2003-11-14 Nec Corp Image processor and protrusion detecting method for the image processor and its program
JP2007265343A (en) * 2006-03-30 2007-10-11 Tottori Univ Follow-up device of mobile object and electric wheelchair having the same
CN105159319A (en) * 2015-09-29 2015-12-16 广州极飞电子科技有限公司 Spraying method of unmanned plane and unmanned plane
CN106155053A (en) * 2016-06-24 2016-11-23 桑斌修 A kind of mowing method, device and system
CN107454969A (en) * 2016-12-19 2017-12-08 深圳前海达闼云端智能科技有限公司 Obstacle detection method and device
CN107463166A (en) * 2016-06-03 2017-12-12 苏州宝时得电动工具有限公司 Automatic running device and its control traveling method
CN108021130A (en) * 2017-11-07 2018-05-11 北京勇搏科技有限公司 A kind of unpiloted harvester
CN109602341A (en) * 2019-01-23 2019-04-12 珠海市微半导体有限公司 A kind of clean robot based on virtual boundary falls control method and chip
CN110919653A (en) * 2019-11-29 2020-03-27 深圳市优必选科技股份有限公司 Stair climbing control method and device for robot, storage medium and robot
CN110928283A (en) * 2018-09-18 2020-03-27 深圳市优必选科技有限公司 Robot and intelligent moving method and device thereof
CN111123905A (en) * 2018-10-31 2020-05-08 苏州科瓴精密机械科技有限公司 Control method and system of walking robot

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5959168B2 (en) * 2011-08-31 2016-08-02 オリンパス株式会社 Image processing apparatus, operation method of image processing apparatus, and image processing program
CN102520721B (en) * 2011-12-08 2015-05-27 北京控制工程研究所 Autonomous obstacle-avoiding planning method of tour detector based on binocular stereo vision
CN104111651A (en) * 2013-04-22 2014-10-22 苏州宝时得电动工具有限公司 Automatic walking equipment and method for automatic walking equipment to return to stop station
CN104965514B (en) * 2015-06-23 2017-09-05 高世恒 Intelligent mobile computer
US9828094B2 (en) * 2015-07-26 2017-11-28 John B. McMillion Autonomous cleaning system
CN106416587A (en) * 2015-08-13 2017-02-22 苏州宝时得电动工具有限公司 Mower positioning error eliminating method, device and system
CN109015643A (en) * 2018-08-17 2018-12-18 徐润秋 Walking route control method for a walking robot
CN110393482A (en) * 2019-09-03 2019-11-01 深圳飞科机器人有限公司 Map processing method and cleaning robot

Also Published As

Publication number Publication date
CN113805571A (en) 2021-12-17
WO2021238001A1 (en) 2021-12-02

Similar Documents

Publication Publication Date Title
CN113805571B (en) Robot walking control method, system, robot and readable storage medium
US20220324112A1 (en) Domestic robotic system and method
US8736820B2 (en) Apparatus and method for distinguishing ground and obstacles for autonomous mobile vehicle
CN111035327A (en) Cleaning robot, carpet detection method, and computer-readable storage medium
CN112327878B (en) Obstacle classification and obstacle avoidance control method based on TOF camera
CN107315410B (en) Automatic obstacle removing method for robot
US20140115797A1 (en) Self-driven floor cleaning device
US10068141B2 (en) Automatic operation vehicle
GB2313971A (en) Obstacle tracking by moving vehicle
JP2012244708A5 (en)
CN115328175B (en) Logistics robot system
US20230031843A1 (en) Cleaning machine for a road or pavement or gutter
CN211933898U (en) Cleaning robot
CN112720408B (en) Visual navigation control method for all-terrain robot
CN113807118B (en) Robot edge working method, system, robot and readable storage medium
CN116466724A (en) Mobile positioning method and device of robot and robot
JPH0436405B2 (en)
CN114937258B (en) Control method for mowing robot, and computer storage medium
Aggarwal et al. Vision based collision avoidance by plotting a virtual obstacle on depth map
WO2020100264A1 (en) Autonomous work machine, method for controlling autonomous work machine, and program
CN111731324A (en) Control method and system for guiding AGV intelligent vehicle based on vision
CN105922258B Autonomous navigation method for an omnidirectional mobile manipulator based on iGPS
EP4328698A1 (en) Self-moving device, moving trajectory adjusting method, and computer-readable storage medium
Yin et al. Development of an obstacle avoidance system for a field robot using a 3D camera
CN113848872B (en) Automatic walking device, control method thereof and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230601

Address after: 215000 No. 8 Ting Rong Street, Suzhou Industrial Park, Jiangsu, China

Applicant after: Suzhou Cleva Precision Machinery & Technology Co.,Ltd.

Applicant after: SKYBEST ELECTRIC APPLIANCE (SUZHOU) Co.,Ltd.

Address before: 215000 No. 18 Huahong Street, Suzhou Industrial Park, Jiangsu, China

Applicant before: Suzhou Cleva Precision Machinery & Technology Co.,Ltd.

GR01 Patent grant