CN111949021B - Self-propelled robot and control method thereof - Google Patents


Info

Publication number
CN111949021B
CN111949021B (application CN202010747842.9A)
Authority
CN
China
Prior art keywords: self-propelled robot, obstacle, curvature, control method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010747842.9A
Other languages
Chinese (zh)
Other versions
CN111949021A (en)
Inventor
王旭宁
孙斌
Current Assignee
Sharkninja China Technology Co Ltd
Original Assignee
Sharkninja China Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Sharkninja China Technology Co Ltd filed Critical Sharkninja China Technology Co Ltd
Priority to CN202010747842.9A
Publication of CN111949021A
Application granted
Publication of CN111949021B
Legal status: Active

Classifications

    • G05D1/0219: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G05D1/0223: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0238: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, e.g. obstacle or wall sensors
    • G05D1/0257: Control of position or course in two dimensions, specially adapted to land vehicles, using a radar
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/4011: Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • A47L11/4061: Steering means; means for avoiding obstacles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure provides a control method for a self-propelled robot provided with an obstacle sensor, the control method comprising: detecting an obstacle by the obstacle sensor; extracting at least three feature points of the obstacle; fitting a curvature of a contour line of at least part of the corresponding obstacle based on the at least three feature points; and controlling the self-propelled robot to travel according to the curvature. The disclosed control method enables the self-propelled robot to acquire the curvature of an obstacle in time, judge whether the obstacle has protrusions and recesses, and adopt a corresponding strategy to prevent the self-propelled robot from colliding with a protrusion on the obstacle.

Description

Self-propelled robot and control method thereof
Technical Field
The disclosure belongs to the technical field of artificial intelligence, and particularly provides a self-propelled robot and a control method thereof.
Background
With the improvement of living standards, intelligent sweeping robots are entering more and more households.
In the prior art, when a sweeping robot cleans along edges, its traveling direction must be adjusted so as to keep a constant distance between the robot and the wall. During edge sweeping, the robot generally advances at a constant speed, which causes the following problem: when there is a protrusion on the wall, the sweeping robot is liable to hit it because it does not decelerate in time.
To solve this technical problem, an existing sweeping robot can acquire a map of a room from the user's mobile phone, or automatically scan the indoor environment through SLAM (simultaneous localization and mapping) to build such a map, and then perform the edge sweeping operation according to the map. However, the above-described problem remains when the user rearranges the room before the sweeping robot has updated its map.
Disclosure of Invention
In order to solve the above-mentioned problem in the prior art, namely that an existing sweeping robot easily collides with a protrusion on a wall, the present disclosure provides a control method for a self-propelled robot provided with an obstacle sensor, the control method comprising:
detecting an obstacle by the obstacle sensor;
extracting at least three characteristic points of the obstacle;
fitting a curvature of a contour line of at least part of the corresponding obstacle based on the at least three feature points;
and controlling the traveling of the self-propelled robot according to the curvature.
Optionally, the step of "controlling the travel of the self-propelled robot according to the curvature" includes:
controlling the traveling speed of the self-propelled robot according to the curvature; and/or
determining the shape of the contour line according to the curvature, and controlling the traveling direction of the self-propelled robot according to the shape.
Optionally, the step of "determining the shape of the contour line according to the curvature and controlling the traveling direction of the self-propelled robot according to the shape" further includes:
in response to the shape of the contour line being a straight-line structure, causing the self-propelled robot to keep its current traveling direction;
in response to the shape of the contour line being a convex structure, turning the self-propelled robot in a direction away from the obstacle;
and in response to the shape of the contour line being a concave structure, turning the self-propelled robot in a direction approaching the obstacle.
Optionally, the step of "controlling the traveling speed of the self-propelled robot according to the curvature" (in which the greater the curvature, the smaller the traveling speed) includes:
causing the self-propelled robot to compare the curvature with a pre-stored curvature-speed lookup table and thereby obtain a target travel speed corresponding to the curvature; or, causing the self-propelled robot to substitute the curvature into a pre-edited curvature-speed formula and thereby obtain the target travel speed;
changing the self-propelled robot from its current travel speed to the target travel speed.
Optionally, the aforementioned at least three feature points satisfy the following relationship:
denoting the at least three feature points, in order from front to back, as D1, D2, …, Dn, wherein n ≥ 3 and n is a natural number;
a straight line connecting D1 and the obstacle sensor is denoted as L1, and an acute angle between L1 and the traveling direction of the self-propelled robot is denoted as β1;
a straight line connecting D2 and the obstacle sensor is denoted as L2, and an acute angle between L2 and the traveling direction of the self-propelled robot is denoted as β2;
……
a straight line connecting Dn and the obstacle sensor is denoted as Ln, and an acute angle between Ln and the traveling direction of the self-propelled robot is denoted as βn;
wherein β1 < β2 < … < βn.
Optionally, the step of "extracting at least three feature points of the aforementioned obstacle" includes: extracting three characteristic points of the obstacle;
the step of fitting the curvature of the contour line of at least part of the corresponding obstacle based on the aforementioned at least three feature points includes:
acquiring the arc on which the three feature points lie;
obtaining the curvature of the arc, and thereby the curvature of the contour line of at least part of the obstacle.
Optionally, the step of "extracting at least three feature points of the aforementioned obstacle" includes: extracting at least four characteristic points of the obstacle;
the step of fitting the curvature of the contour line of at least part of the corresponding obstacle based on the aforementioned at least three feature points includes:
fitting the at least four characteristic points into a curve;
the average curvature of the aforesaid curve is obtained and thus the curvature of the contour line of at least part of the aforesaid obstacle is obtained.
Optionally, the step of "extracting at least three feature points of the obstacle" is performed by an upper computer of the self-propelled robot;
the step of fitting the curvature of the contour line of at least part of the corresponding obstacle based on the at least three feature points is performed by a lower computer of the self-propelled robot.
Optionally, the obstacle sensor includes a laser radar, an edge sensor disposed at a side of the self-propelled robot, and/or an image acquisition device.
In addition, the present disclosure also provides a self-propelled robot, which includes a processor, a memory, and executable instructions stored in the memory, where the instructions, when executed by the processor, cause the self-propelled robot to execute the control method of any one of the foregoing technical solutions.
As can be appreciated by those skilled in the art, the self-propelled robot and the control method thereof disclosed in the present disclosure have at least the following advantages:
1. By extracting at least three feature points on the obstacle, fitting the curvature of the contour line of at least part of the corresponding obstacle based on these feature points, and then controlling the travel of the self-propelled robot according to the curvature, the self-propelled robot can acquire the curvature of the obstacle in time, judge whether the obstacle has protrusions and recesses, and adopt a corresponding strategy to prevent itself from colliding with a protrusion on the obstacle.
2. Controlling the traveling speed of the self-propelled robot through the curvature allows the robot to match a different speed to each curvature on the obstacle, avoiding the situation where the robot, traveling too fast, cannot decelerate in time and collides with the obstacle.
3. By causing the self-propelled robot to compare the curvature with a pre-stored curvature-speed lookup table and thereby obtain a target travel speed corresponding to the curvature, or to substitute the curvature into a pre-edited curvature-speed formula and thereby obtain the target travel speed, the robot can change from its current travel speed to the target travel speed, reducing its speed where the curvature is large and preventing it from running into the obstacle at excessive speed.
4. Determining the shape of the contour line through the curvature allows the self-propelled robot to change its traveling direction according to the shape, avoiding collisions with the obstacle.
5. When the shape of the contour line is a straight-line structure, keeping the current traveling direction ensures the edge-following efficiency of the self-propelled robot. When the contour line is a convex structure, turning away from the obstacle prevents the robot, should it be traveling too fast, from failing to decelerate and colliding with the obstacle. When the contour line is a concave structure, turning toward the obstacle allows the robot to traverse the concave region and avoids leaving it uncovered.
6. By extracting at least four feature points of the obstacle, fitting them to a curve, and then obtaining the average curvature of the curve and thereby the curvature of at least part of the contour line of the obstacle, the self-propelled robot can perform edge-following operations on protrusions or recesses of various shapes.
7. Using the upper computer to extract at least three feature points of the obstacle, and the lower computer to fit the curvature of at least part of the contour line of the corresponding obstacle based on those feature points, improves the computation rate of the self-propelled robot, shortens the computation cycle, and improves the reliability of the obtained curvature, further preventing the self-propelled robot from colliding with an obstacle or missing a traversal area.
Drawings
Some embodiments of the present disclosure are described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic illustration of a wall with a rugged surface in the present disclosure;
FIG. 2 is a schematic illustration of a wall with only a raised surface in the present disclosure;
FIG. 3 is a schematic illustration of a wall with only right angles in the present disclosure;
FIG. 4 is a schematic illustration of an obstacle with a cylindrical cross section in the present disclosure;
fig. 5 is a flowchart of main steps of a control method of the self-propelled robot in the first embodiment of the present disclosure;
FIG. 6 is a schematic view showing the effects of the self-propelled robot during the edge-following operation in the first embodiment of the present disclosure;
fig. 7 is a flowchart of main steps of a control method of the self-propelled robot in the second embodiment of the present disclosure;
FIG. 8 is a schematic view of a first effect during an edge operation of the self-propelled robot in a second embodiment of the present disclosure;
FIG. 9 is a schematic view showing a second effect of the self-propelled robot during the edgewise operation in the second embodiment of the present disclosure;
fig. 10 is a schematic structural view of a control module of the self-propelled robot in the third embodiment of the present disclosure.
List of reference numerals:
1. a self-propelled robot; 11. a laser radar; 12. an edge sensor; 13. a side brush; 14. a traveling wheel;
2. a wall body;
3. cylindrical obstacles.
Detailed Description
It should be understood by those skilled in the art that the embodiments described below are only some, not all, of the embodiments of the present disclosure; they are intended to explain the technical principles of the present disclosure and not to limit its scope. All other embodiments that a person of ordinary skill in the art can obtain without undue burden, based on the embodiments provided by the present disclosure, still fall within the scope of the present disclosure.
It should be noted that, in the description of the present disclosure, terms such as "center", "upper", "lower", "top", "bottom", "left", "right", "vertical", "horizontal", "inner", "outer", and the like indicate directions or positional relationships, which are based on the directions or positional relationships shown in the drawings, are merely for convenience of description, and do not indicate or imply that the foregoing devices or elements must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present disclosure. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
As shown in fig. 1 to 4, the self-propelled robot 1 of the present disclosure is provided with a laser radar 11, an edge sensor 12, an edge brush 13, and a traveling wheel 14. The laser radar 11 is disposed on top of the self-propelled robot 1, and the laser radar 11 can acquire the outline of the obstacle around the self-propelled robot 1 in a scanning manner. The edge sensor 12 is provided in front of the side of the self-propelled robot 1, and the edge sensor 12 is configured to provide an edge signal to the self-propelled robot 1 so that the self-propelled robot 1 can perform an edge operation based on the edge signal. The side brush 13 is used for sweeping garbage on the ground to the front side of the self-propelled robot 1, and then is picked up by the self-propelled robot 1, so that the purpose of cleaning the ground is achieved. The travelling wheel 14 is disposed at the bottom of the self-propelled robot 1 and is used for supporting the self-propelled robot 1 to walk.
As will be appreciated by those skilled in the art, edge sensor 12 may be an infrared pair-tube along-wall sensor, an infrared TOF along-wall sensor, or an infrared triangulation along-wall sensor.
It will be appreciated by those skilled in the art that the self-propelled robot 1 of the present disclosure is not limited to the floor sweeping robot shown in figs. 1 to 4, but may be any feasible walking device such as a floor mopping robot, a floor sweeping-and-mopping robot, a navigation robot, etc. Likewise, the self-propelled robot 1 described in the later embodiments of the present disclosure may be any viable apparatus such as a sweeping robot, a mopping robot, a sweeping-and-mopping robot, a navigation robot, and the like.
As shown in figs. 1 to 4, the object followed by the self-propelled robot 1 during an edge-following operation mainly falls into four cases. In the first case, as shown in fig. 1, the surface of the wall 2 has a continuous rugged structure; in the second case, as shown in fig. 2, a protruding structure exists on the surface of the wall 2; in the third case, as shown in fig. 3, the wall 2 has a right angle; in the fourth case, as shown in fig. 4, the obstacle is small, for example a cylindrical obstacle 3.
The self-propelled robot 1 of the present disclosure can recognize the surface curvature of the wall 2 shown in figs. 1 to 3 as well as the curvature of the cylindrical obstacle 3 shown in fig. 4, thereby allowing the self-propelled robot 1 to travel along the curved contour of the obstacle. This prevents the self-propelled robot 1 from colliding with the convex structures in figs. 1 and 2, from being unable to clean the groove structure in fig. 1, and from rotating in place, as in figs. 3 and 4, because the edge sensor 12 cannot effectively detect the obstacle.
The self-propelled robot of the present disclosure and the control method thereof will be described in detail with reference to specific embodiments.
In a first embodiment of the present disclosure:
The self-propelled robot of the present embodiment is provided with an obstacle sensor (not shown).
As shown in fig. 5, the control method of the self-propelled robot of the present embodiment includes:
step S110, detecting an obstacle through an obstacle sensor;
the obstacle sensor may be at least one of a laser radar, an edge sensor provided at a side of the self-propelled robot, and an image acquisition device.
As example one: the obstacle sensor is a laser radar, and the laser radar detects whether an obstacle exists around the self-propelled robot by scanning.
As example two: the obstacle sensor is an edge sensor (e.g., an infrared pair-tube wall sensor, an infrared TOF wall sensor, or an infrared triangulation wall sensor) that determines that an obstacle is detected when it receives a signal or is triggered.
As an example three: the obstacle sensor is an image pickup device that acquires an image of the surroundings of the self-propelled robot, and then determines whether an obstacle exists around the self-propelled robot from the acquired image.
Step S120, extracting at least three characteristic points of the obstacle;
wherein the at least three feature points all lie in, or almost in, the same plane, and this plane is parallel to the horizontal plane.
Specifically, any feasible number of feature points such as 3 feature points, 4 feature points, 5 feature points, and the like on the aforementioned obstacle may be extracted.
Further specifically, as shown in fig. 6, the aforementioned at least three feature points are sequentially denoted, in order from front to back, as D1 (point A in fig. 6), D2 (point B in fig. 6), …, Dn (point N in fig. 6), where n ≥ 3 and n is a natural number;
a straight line connecting D1 and the obstacle sensor is denoted as L1 (a straight line segment passing through point a as shown in fig. 6), and an acute angle between L1 and the traveling direction of the self-propelled robot is denoted as β1 (not shown);
a straight line connecting D2 and the obstacle sensor is denoted as L2 (a straight line segment passing through the point B as shown in fig. 6), and an acute angle between L2 and the traveling direction of the self-propelled robot is denoted as β2 (not shown);
……
a straight line connecting Dn and the obstacle sensor is denoted as Ln (a straight line segment passing through N points as shown in fig. 6), and an acute angle between Ln and the traveling direction of the self-propelled robot is denoted as βn (not shown);
wherein β1 < β2 < … < βn < 90°.
It will be appreciated by those skilled in the art that requiring β1 < β2 < … < βn < 90° places most of the feature points on the front side of the self-propelled robot 1, giving it sufficient reaction time to reduce its forward speed and to control its rotation angle so as to change traveling direction.
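As an illustrative check of this angle condition, the acute angle between the sensor-to-point line and the travel direction can be computed directly. The coordinates, heading, and function name below are hypothetical, not taken from the patent:

```python
import math

def bearing_angle(sensor, point, heading_deg):
    """Acute angle (degrees) between the sensor->point line and the heading."""
    dx, dy = point[0] - sensor[0], point[1] - sensor[1]
    # Fold the difference between the two directions into [0, 90] degrees.
    ang = abs(math.degrees(math.atan2(dy, dx)) - heading_deg) % 180.0
    return min(ang, 180.0 - ang)

# Robot at the origin heading along +x; points D1..D3, ordered front to
# back, sweep increasingly to the side, so beta_1 < beta_2 < beta_3 < 90.
angles = [bearing_angle((0, 0), p, 0.0) for p in [(3, 0.5), (2, 1), (1, 2)]]
print(angles)
```

Points far ahead of the robot yield small angles; points abreast of it approach 90°, which is why the condition keeps most feature points in front.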
Step S130, fitting out the curvature of the contour line of at least part of the corresponding obstacle based on the at least three characteristic points;
specifically, the aforementioned at least three feature points are fitted to a curve, and then the average curvature of the aforementioned curve is obtained, and thus the curvature of the contour line of at least part of the aforementioned obstacle is obtained.
It should be noted that, since the technical means for fitting a plurality of discrete points into a curve is known to those skilled in the art, the description thereof is omitted herein.
More specifically, the average curvature is obtained by dividing the fitted curve into several curve segments according to actual needs, calculating the curvature of each segment separately, summing all the curvatures, and taking their average. Alternatively, a person skilled in the art may, as desired, choose the minimum of these curvatures as the curvature of the contour line of at least part of the obstacle.
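The segment-wise averaging described above can be sketched as follows: take each run of three consecutive feature points, compute the curvature of the circle through them, and average the results. This is a minimal illustration; the function name and sample points are assumptions, not from the patent:

```python
import math

def average_contour_curvature(points):
    """Average curvature over a polyline of four or more feature points.

    Each run of three consecutive points defines a circle; the mean of the
    per-triple circle curvatures approximates the contour's average curvature.
    """
    def circle_curvature(p1, p2, p3):
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        # Cross product of the two edge vectors = twice the triangle area.
        cross = (x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)
        d = math.dist(p1, p2) * math.dist(p2, p3) * math.dist(p3, p1)
        return 2.0 * abs(cross) / d if d else 0.0

    ks = [circle_curvature(points[i], points[i + 1], points[i + 2])
          for i in range(len(points) - 2)]
    return sum(ks) / len(ks)

# Four points sampled on a unit circle: every triple gives curvature 1,
# so the average is (numerically) 1 as well.
pts = [(math.cos(t), math.sin(t)) for t in (0.0, 0.4, 0.8, 1.2)]
print(average_contour_curvature(pts))
```

Collinear feature points produce zero curvature for every triple, matching the straight-line contour case.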
Step S140, controlling the self-propelled robot to travel according to the curvature.
Specifically, the shape of the contour line is determined according to the curvature, and the traveling direction of the self-propelled robot is controlled according to the shape.
More specifically, when the shape of the contour line is a straight-line structure, the self-propelled robot maintains its current traveling direction; when the shape of the contour line is a convex structure, the self-propelled robot turns in a direction away from the obstacle; when the shape of the contour line is a concave structure, the self-propelled robot turns in a direction approaching the obstacle. The self-propelled robot can thus always advance along the aforementioned contour; in other words, it can always walk along the edge of the obstacle, i.e., perform an edge-following operation on the obstacle.
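A minimal sketch of this three-way decision, assuming the obstacle is followed on the robot's right and the feature points are ordered front to back; the sign convention and the straightness threshold `eps` are illustrative assumptions, not taken from the patent:

```python
def contour_action(p1, p2, p3, eps=1e-3):
    """Pick a steering action from three contour feature points.

    The sign of the cross product of the two edge vectors distinguishes a
    bulge (convex) from a recess (concave); values near zero mean the three
    points are (almost) collinear, i.e. a straight contour.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    cross = (x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)
    if abs(cross) < eps:
        return "keep current heading"      # straight-line structure
    if cross > 0:
        return "turn away from obstacle"   # convex structure (bulge)
    return "turn toward obstacle"          # concave structure (recess)

print(contour_action((0, 0), (1, 0), (2, 0)))  # straight wall section
```

A real controller would feed the chosen action into the wheel-speed loop; here the string return values simply label the three branches from the text.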
Further, the traveling speed of the self-propelled robot is controlled according to the magnitude of the curvature.
More specifically, the self-propelled robot compares the curvature with a pre-stored curvature-speed lookup table and thereby obtains a target travel speed corresponding to the curvature, then changes from its current travel speed to the target travel speed. The curvature-speed table may be obtained through repeated experiments or from empirical values in the art. Those skilled in the art will appreciate that in the curvature-speed lookup table, a larger curvature corresponds to a smaller speed value, so as to reserve enough turning time for the self-propelled robot and avoid it reacting too late, or colliding with a protrusion on the obstacle due to inertia, when traveling too fast. Illustratively, a multi-level curvature threshold is set, such as 0 < K1 < K2 < … < Kn, with corresponding multi-level linear speeds v1 > v2 > … > vn > v0.
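The multi-level threshold scheme above can be sketched as a simple lookup; every threshold and speed value here is an invented example number, not a value from the patent:

```python
# Illustrative multi-level curvature thresholds 0 < K1 < K2 < K3 with
# matching speeds v1 > v2 > v3 > v0; larger curvature -> smaller speed.
CURVATURE_SPEED_TABLE = [
    (0.5, 0.30),  # curvature below 0.5 1/m -> 0.30 m/s
    (1.0, 0.20),  # curvature below 1.0 1/m -> 0.20 m/s
    (2.0, 0.10),  # curvature below 2.0 1/m -> 0.10 m/s
]
FLOOR_SPEED = 0.05  # v0, used for the sharpest bends

def target_speed(curvature):
    """Look up the target travel speed for a fitted contour curvature."""
    for threshold, speed in CURVATURE_SPEED_TABLE:
        if curvature < threshold:
            return speed
    return FLOOR_SPEED

print(target_speed(0.2))  # gentle contour -> fastest table speed
```

In practice the table entries would come from the repeated experiments the text mentions; the lookup itself is just a first-match scan over ascending thresholds.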
Alternatively, the self-propelled robot substitutes the curvature into a pre-edited curvature-speed formula and thereby obtains the target travel speed, then changes from its current travel speed to the target travel speed. The curvature-speed formula may be formulated based on empirical values in the art, for example a speed inversely proportional to the curvature (speed = preset base speed / curvature). Similarly, in the result of the curvature-speed formula, a larger curvature yields a smaller speed value, so as to reserve enough turning time for the self-propelled robot and avoid it reacting too late, or colliding with a protrusion on the obstacle due to inertia, when traveling too fast.
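One hedged reading of such a formula, with a clamp to keep the result inside the robot's feasible speed range; the constants and function name are illustrative assumptions, not from the patent:

```python
def target_speed_formula(curvature, base_speed=0.3, min_speed=0.05):
    """Speed inversely proportional to curvature, clamped to a feasible range.

    Implements speed = base_speed / curvature, never exceeding base_speed
    (for near-zero curvature) and never dropping below min_speed.
    """
    if curvature <= 0.0:
        return base_speed  # straight contour: travel at the base speed
    return max(min_speed, min(base_speed, base_speed / curvature))
```

For a curvature of 2 1/m this yields half the base speed, and very sharp bends bottom out at `min_speed` rather than stalling the robot.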
Based on the foregoing description, it can be understood by those skilled in the art that the control method of this embodiment acquires several feature points on an obstacle (e.g., a wall), fits them into a curve, calculates the curvature of the curve, and controls the traveling speed of the self-propelled robot according to the magnitude of that curvature. In other words, the control method of this embodiment makes the traveling speed of the self-propelled robot smaller where the curvature of the obstacle surface is larger, providing enough turning time for the self-propelled robot and preventing it from colliding with a protrusion on the obstacle surface.
In a second embodiment of the present disclosure:
As shown in figs. 8 to 10, a laser radar 11 serving as an obstacle sensor is provided on the top of the self-propelled robot 1, and an edge sensor 12 serving as an obstacle sensor is provided at the front of the self-propelled robot 1.
As shown in fig. 7, the control method of the self-propelled robot 1 according to the present embodiment includes:
step S210 of detecting an obstacle by the laser radar 11;
specifically, the self-propelled robot 1 is caused to detect whether or not an obstacle exists around the self-propelled robot 1 by scanning with the laser radar 11.
Step S220, extracting three characteristic points of the obstacle;
wherein the three feature points all lie in, or nearly in, the same plane, and that plane is parallel to the horizontal plane.
Specifically, the lidar 11 is configured to emit three laser beams simultaneously; when the beams strike an obstacle, they are reflected back and received by the lidar, from which the feature points on the obstacle are acquired.
Step S230, fitting the curvature of the contour line of at least part of the corresponding obstacle based on the three characteristic points;
specifically, the arc where the three feature points are located is obtained; the curvature of the aforementioned arc is obtained and thus the curvature of the contour line of at least part of the aforementioned obstacle is obtained.
Step S240, controlling the self-propelled robot 1 to travel according to the curvature.
Specifically, the shape of the contour line is determined from the curvature, and the traveling direction of the self-propelled robot 1 is controlled according to that shape.
More specifically, when the shape of the contour line is a straight-line structure, the self-propelled robot 1 maintains its current traveling direction; when the contour line has a convex structure, the self-propelled robot 1 turns in a direction away from the obstacle; when the contour line has a concave structure, the self-propelled robot 1 turns in a direction approaching the obstacle. In this way the self-propelled robot 1 can always advance along the aforementioned contour line; in other words, it can always walk along the edge of the obstacle, i.e., perform edge-following work along the obstacle.
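A minimal sketch of the direction rule above. The signed-curvature convention and the flatness threshold flat_eps are assumptions of this sketch, not values from the patent:

```python
def steering_action(signed_curvature, flat_eps=0.01):
    """Map the fitted contour shape to a turn command.

    Sign convention (an assumption of this sketch): positive means the
    contour bulges toward the robot (convex), negative means it recedes
    (concave); magnitudes below flat_eps are treated as a straight wall.
    """
    if abs(signed_curvature) < flat_eps:
        return "keep_heading"          # straight contour: hold course
    if signed_curvature > 0:
        return "turn_away_from_wall"   # convex: steer away from the obstacle
    return "turn_toward_wall"          # concave: steer toward the obstacle
```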
Further, the traveling speed of the self-propelled robot 1 is controlled according to the magnitude of the curvature.
More specifically, the self-propelled robot 1 compares the curvature against a curvature-speed comparison table stored in advance to obtain the target travel speed corresponding to the curvature, and then shifts from the current travel speed to that target travel speed. The curvature-speed comparison table may be obtained through repeated experiments or from empirical values in the art. As those skilled in the art can understand, in the curvature-speed comparison table a larger curvature corresponds to a smaller speed value, so that enough time is reserved for the self-propelled robot 1 to turn, avoiding the situation where, at excessive speed, it fails to react in time or bumps into a protrusion on the obstacle due to inertia. Illustratively, a multi-level curvature threshold is set, such as 0 < K1 < K2 < … < Kn, with corresponding multi-level linear speeds v1 > v2 > … > vn > v0.
Alternatively, the self-propelled robot 1 may substitute the curvature into a pre-edited curvature-speed formula to obtain the target travel speed, and then shift from the current travel speed to the target travel speed. The curvature-speed formula may be formulated based on empirical values in the art, for example, speed = (1/curvature) × set speed. Similarly, in the result calculated by the curvature-speed formula, a larger curvature corresponds to a smaller speed value, so that enough time is reserved for the self-propelled robot to turn, avoiding the situation where, at excessive speed, the self-propelled robot fails to react in time or collides with a protrusion on the obstacle due to inertia.
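Both speed-selection variants described above can be sketched as follows; the threshold and speed numbers used below are illustrative assumptions, not values from the patent:

```python
def speed_from_table(curvature, thresholds, speeds, v_min):
    """Tiered lookup: thresholds K1 < K2 < ... < Kn map to speeds
    v1 > v2 > ... > vn; any curvature beyond Kn falls back to v_min (v0)."""
    for k_limit, v in zip(thresholds, speeds):
        if curvature <= k_limit:
            return v
    return v_min

def speed_from_formula(curvature, base_speed, v_max):
    """Formula variant: speed = base_speed / curvature, capped at v_max
    so a near-flat wall (curvature -> 0) does not demand infinite speed."""
    if curvature <= 0:
        return v_max
    return min(v_max, base_speed / curvature)
```

For example, with thresholds [0.1, 0.5, 1.0] (1/m) and speeds [0.4, 0.25, 0.1] (m/s), a curvature of 0.7 selects 0.1 m/s, and anything sharper than 1.0 falls back to the minimum speed.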
Based on the foregoing description, those skilled in the art can understand that the control method of the present embodiment acquires three feature points on an obstacle (e.g., a wall), fits them into a curve, calculates the curvature of that curve, and then controls the traveling speed of the self-propelled robot according to the magnitude of the curvature. In other words, the larger the curvature of the obstacle surface, the lower the traveling speed of the self-propelled robot, so as to provide enough turning time and avoid collisions with protrusions on the obstacle surface.
In order to enable those skilled in the art to more clearly understand the technical solution of the present embodiment, the technical solution of the present embodiment will be exemplified below with reference to specific scenarios shown in fig. 8 to 9.
First, the self-propelled robot 1 establishes a planar coordinate system whose Y-axis is parallel to the advancing direction of the self-propelled robot 1, as shown in figs. 8 to 9.
Then, the self-propelled robot 1 is caused to scan the surroundings in real time by the laser radar 11.
When the laser radar 11 receives the returned laser signal, it is determined that an obstacle exists around the self-propelled robot 1.
As shown in fig. 8, when there is an uneven surface on the wall 2, the three laser beams will inevitably fall into the same groove on the wall 2 as the self-propelled robot 1 moves. The three feature points are marked A, B and C in order from front to back. The straight line connecting A and the laser radar 11 (i.e., the first laser beam) is denoted L_A (the straight line segment passing through point A in fig. 8), and the acute angle between L_A and the traveling direction of the self-propelled robot 1 is denoted β_A (not labeled in the figure). Likewise, the straight line connecting B and the laser radar 11 (i.e., the second laser beam) is denoted L_B (the straight line segment passing through point B in fig. 8), with its acute angle to the traveling direction denoted β_B (not labeled in the figure); and the straight line connecting C and the laser radar 11 (i.e., the third laser beam) is denoted L_C (the straight line segment passing through point C in fig. 8), with its acute angle to the traveling direction denoted β_C (not labeled in the figure). Wherein β_A < β_B < β_C < 90°.
Those skilled in the art will appreciate that the purpose of requiring β_A < β_B < β_C < 90° is to place all three feature points on the front side of the self-propelled robot 1, so as to give the self-propelled robot 1 enough reaction time to reduce its forward speed or to turn.
Further, the coordinates of the three feature points A, B and C in fig. 8 are obtained from the line segments L_A, L_B and L_C as:

A(l_A*sinβ_A, l_A*cosβ_A), B(l_B*sinβ_B, l_B*cosβ_B), C(l_C*sinβ_C, l_C*cosβ_C)

The curvature k of the arc through the three points A, B and C is then k = 1/r, where r is the radius of the circle through A, B and C, i.e. the distance from the circle center (H, I) to any of the three points, with:

a = 2*|l_B*sinβ_B - l_A*sinβ_A|
b = 2*|l_B*cosβ_B - l_A*cosβ_A|
c = (l_B*sinβ_B)² + (l_B*cosβ_B)² - (l_A*sinβ_A)² - (l_A*cosβ_A)²
d = 2*|l_C*sinβ_C - l_B*sinβ_B|
e = 2*|l_C*cosβ_C - l_B*cosβ_B|
f = (l_C*sinβ_C)² + (l_C*cosβ_C)² - (l_B*sinβ_B)² - (l_B*cosβ_B)²
H = (b*f - e*c)/(b*d - e*a)
I = (d*c - a*f)/(b*d - e*a)
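The worked example can be sketched as follows: convert the three (range, bearing) returns to Cartesian coordinates, solve the two perpendicular-bisector equations for the center (H, I), and take k = 1/r. This sketch uses the signed form of the coefficients a..f (without absolute-value bars), which is the standard derivation and reproduces the same H and I expressions; it is an illustration, not the patent's exact implementation:

```python
import math

def curvature_from_lidar(l_a, b_a, l_b, b_b, l_c, b_c):
    """Curvature of the circle through three lidar returns (range l, bearing beta).

    Bearings are measured from the travel direction (the Y axis of the
    robot's plane frame), so x = l*sin(beta) and y = l*cos(beta).
    The center (H, I) solves the two perpendicular-bisector equations
    a*H + b*I = c and d*H + e*I = f.
    """
    xa, ya = l_a * math.sin(b_a), l_a * math.cos(b_a)
    xb, yb = l_b * math.sin(b_b), l_b * math.cos(b_b)
    xc, yc = l_c * math.sin(b_c), l_c * math.cos(b_c)
    a, b = 2 * (xb - xa), 2 * (yb - ya)
    c = xb**2 + yb**2 - xa**2 - ya**2
    d, e = 2 * (xc - xb), 2 * (yc - yb)
    f = xc**2 + yc**2 - xb**2 - yb**2
    den = b * d - e * a
    if abs(den) < 1e-12:
        return 0.0  # collinear returns: flat wall, zero curvature
    H = (b * f - e * c) / den
    I = (d * c - a * f) / den
    r = math.hypot(H - xa, I - ya)  # radius = distance from center to A
    return 1.0 / r
```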
From the above formulas, the arc through A, B and C and its curvature can be calculated, and the self-propelled robot 1 can adjust its travel speed according to that curvature. During walking, the distance between the self-propelled robot 1 and the wall 2 is controlled through the edge sensor 12: when the edge sensor 12 detects that the distance between the self-propelled robot 1 and the wall 2 has increased, the self-propelled robot 1 is rotated by a certain angle toward the wall 2; when the edge sensor 12 detects that the distance has decreased, the self-propelled robot 1 is rotated by a certain angle away from the wall 2. In this way, a constant distance can be maintained between the self-propelled robot 1 and the wall 2.
As will be appreciated by those skilled in the art, when one or more of L_A, L_B and L_C exceeds a predetermined range (for example, any feasible value such as 10 cm, 25 cm or 30 cm), this indicates that the heading of the self-propelled robot 1 has deviated greatly or that it has encountered a turn. As shown in fig. 9, when the self-propelled robot 1 walks to a corner, the laser beam corresponding to point A no longer falls onto the wall 2, so the length of the line segment L_A corresponding to A exceeds the set range. At this time, the controller of the self-propelled robot 1 determines that the self-propelled robot 1 has reached the corner position, rotates the self-propelled robot 1 by 90° at the corner, and continues the edge-following work along the next wall surface.
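The edge-keeping and corner rules above can be sketched as a single decision function; all thresholds and return labels are illustrative assumptions of this sketch:

```python
def edge_step(beams, side_gap, target_gap, max_range, tol=0.02):
    """One edge-following decision from the latest sensor readings.

    beams: the (l_A, l_B, l_C) beam lengths; side_gap: the edge-sensor
    reading.  If any forward beam exceeds max_range, the wall has ended
    and the robot should rotate 90 degrees at the corner; otherwise it
    trims its heading to keep side_gap near target_gap.
    """
    if any(length > max_range for length in beams):
        return "rotate_90_at_corner"   # a beam missed the wall: corner reached
    if side_gap > target_gap + tol:
        return "turn_toward_wall"      # drifting away: close the gap
    if side_gap < target_gap - tol:
        return "turn_away_from_wall"   # too close: open the gap
    return "hold_course"               # gap within tolerance
```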
Based on the foregoing description, those skilled in the art can understand that the control method of this embodiment not only enables the self-propelled robot 1 to adjust its speed and traveling direction according to the uneven surface of the wall 2, but also lets it detect a corner in time and rotate through a certain angle there to continue traveling along the next wall.
On the premise of controlling production cost, in order to increase the computation rate of the self-propelled robot 1 and reduce its reaction time, the step of acquiring the feature points on the obstacle is performed by an upper computer on the self-propelled robot 1, and the step of calculating the curvature of the surface profile of the obstacle is performed by a lower computer on the self-propelled robot 1. The upper computer may be any feasible device, such as an algorithm circuit board. The lower computer may be any feasible device, such as a micro control unit (MCU).
In a third embodiment of the present disclosure:
as shown in fig. 10, the present disclosure also provides a self-propelled robot. At the hardware level, the self-propelled robot comprises a processor, optionally a memory and a bus, and may further include the hardware required for other services.
The memory is used for storing execution instructions, which are specifically computer programs that can be executed. Further, the memory may include internal memory and non-volatile memory, and provides the processor with execution instructions and data. By way of example, the internal memory may be a random-access memory (RAM), and the non-volatile memory may be at least one disk storage.
Wherein the bus is used to interconnect the processor, the memory, and the network interface. The bus may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. Buses may be classified into address buses, data buses, control buses, etc. For ease of illustration, only one double-headed arrow is shown in fig. 10, but this does not mean that there is only one bus or one type of bus.
In one possible implementation manner of the self-propelled robot, the processor may first read the corresponding execution instruction from the nonvolatile memory to the memory for execution, or may first obtain the corresponding execution instruction from another device for execution. The processor, when executing the execution instructions stored in the memory, can implement the control method in any one of the control method embodiments of the present disclosure.
Those skilled in the art will appreciate that the control method described above may be applied to, or implemented by, the processor. Illustratively, the processor is an integrated circuit chip with signal processing capability. When the processor executes the control method, each step of the control method can be completed by an integrated logic circuit in hardware form or by instructions in software form within the processor. Further, the processor may be a general-purpose processor such as a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, a microprocessor, or any other conventional processor.
Those skilled in the art will also appreciate that the steps of the above embodiments of the control method of the present disclosure may be performed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software modules may be located in well-known storage media such as RAM, flash memory, ROM, EEPROM, or registers. The storage medium is located in the memory; after reading the information in the memory, the processor completes the steps of the above control method embodiments in combination with its hardware.
Thus far, the technical solution of the present disclosure has been described in connection with the foregoing embodiments, but it is easily understood by those skilled in the art that the protective scope of the present disclosure is not limited to only these specific embodiments. The technical solutions in the above embodiments may be split and combined by those skilled in the art without departing from the technical principles of the present disclosure, and equivalent modifications or substitutions may be made to related technical features, which all fall within the scope of the present disclosure.

Claims (9)

1. A control method of a self-propelled robot, the self-propelled robot being provided with an obstacle sensor, the control method comprising:
detecting an obstacle by the obstacle sensor;
extracting at least three feature points of the obstacle;
fitting a curvature of a contour line of at least part of the respective obstacle based on the at least three feature points;
controlling the self-propelled robot to travel according to the curvature;
the step of controlling travel of the self-propelled robot according to the curvature includes:
causing the self-propelled robot to compare the curvature with a curvature-speed comparison table stored in advance, and thus obtaining a target walking speed corresponding to the curvature; alternatively, the self-propelled robot is caused to substitute the curvature into a curvature-speed formula edited in advance and thus obtain a target walking speed;
the self-propelled robot is shifted from a current travel speed to the target travel speed.
2. The control method according to claim 1, characterized in that the step of controlling travel of the self-propelled robot according to the curvature includes:
controlling the walking speed of the self-propelled robot according to the curvature; and/or,
and determining the shape of the contour line according to the curvature, and controlling the walking direction of the self-propelled robot according to the shape.
3. The control method according to claim 2, characterized in that the step of determining the shape of the contour line according to the curvature and controlling the traveling direction of the self-propelled robot according to the shape further comprises:
in response to the shape of the contour line being a straight line structure, maintaining the current walking direction of the self-propelled robot;
turning the self-propelled robot in a direction away from the obstacle in response to the shape of the contour line being a convex structure;
and turning the self-propelled robot toward a direction approaching the obstacle in response to the shape of the contour line being a concave structure.
4. The control method according to claim 1, characterized in that the at least three feature points satisfy the following relationship:
sequentially marking the at least three characteristic points as D1, D2, …, Dn in order from front to back, wherein n ≥ 3 and n is a natural number;
a straight line connecting D1 and the obstacle sensor is denoted as L1, and an acute angle between L1 and a traveling direction of the self-propelled robot is denoted as β1;
the straight line connecting D2 and the obstacle sensor is denoted as L2, and the acute angle between L2 and the traveling direction of the self-propelled robot is denoted as β2;
……
a straight line connecting Dn and the obstacle sensor is denoted as Ln, and an acute angle between Ln and the traveling direction of the self-propelled robot is denoted as βn;
wherein β1 < β2 < … < βn.
5. The control method according to claim 1, characterized in that the step of extracting at least three feature points of the obstacle includes: extracting three characteristic points of the obstacle;
the step of "fitting the curvature of the contour line of at least part of the respective obstacle based on the at least three feature points" includes:
acquiring arcs where the three feature points are located;
the curvature of the arc is obtained and thus the curvature of the contour of at least part of the obstacle is obtained.
6. The control method according to claim 1, characterized in that the step of extracting at least three feature points of the obstacle includes: extracting at least four feature points of the obstacle;
the step of "fitting the curvature of the contour line of at least part of the respective obstacle based on the at least three feature points" includes:
fitting the at least four feature points into a curve;
the average curvature of the curve is obtained and thus the curvature of the contour line of at least part of the obstacle is obtained.
7. The control method according to claim 1, characterized in that the step of "extracting at least three feature points of the obstacle" is performed by an upper computer of the self-propelled robot;
the step of fitting the curvature of the contour line of at least part of the corresponding obstacle based on the at least three feature points is performed by a lower computer of the self-propelled robot.
8. The control method according to any one of claims 1 to 7, characterized in that the obstacle sensor comprises a laser radar, an edge sensor arranged on a lateral side of the self-propelled robot, and/or an image acquisition device.
9. A self-propelled robot, characterized in that it comprises a processor, a memory and execution instructions stored on the memory, the execution instructions being arranged, when executed by the processor, to enable the self-propelled robot to perform the control method of any of claims 1 to 8.
CN202010747842.9A 2020-07-30 2020-07-30 Self-propelled robot and control method thereof Active CN111949021B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010747842.9A CN111949021B (en) 2020-07-30 2020-07-30 Self-propelled robot and control method thereof


Publications (2)

Publication Number Publication Date
CN111949021A CN111949021A (en) 2020-11-17
CN111949021B true CN111949021B (en) 2024-02-09

Family

ID=73338528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010747842.9A Active CN111949021B (en) 2020-07-30 2020-07-30 Self-propelled robot and control method thereof

Country Status (1)

Country Link
CN (1) CN111949021B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112308039A (en) * 2020-11-25 2021-02-02 珠海市一微半导体有限公司 Obstacle segmentation processing method and chip based on TOF camera
CN115202330A (en) * 2021-04-09 2022-10-18 美智纵横科技有限责任公司 Control method for cleaning robot to move along obstacle and cleaning robot
CN116058726A (en) * 2021-10-29 2023-05-05 追觅创新科技(苏州)有限公司 Obstacle identification method and device applied to cleaning equipment and cleaning equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009026161A (en) * 2007-07-23 2009-02-05 Panasonic Corp Self-propelled apparatus and program thereof
CN108196555A (en) * 2018-03-09 2018-06-22 珠海市微半导体有限公司 The control method that autonomous mobile robot is walked along side
CN108209743A (en) * 2017-12-18 2018-06-29 深圳市奇虎智能科技有限公司 A kind of fixed point clean method, device, computer equipment and storage medium
CN109696913A (en) * 2018-12-13 2019-04-30 中国航空工业集团公司上海航空测控技术研究所 A kind of sweeping robot intelligent barrier avoiding system and method based on deep learning
CN110794831A (en) * 2019-10-16 2020-02-14 深圳乐动机器人有限公司 Method for controlling robot to work and robot




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant