CN114253266A - Mobile robot, edgewise moving method thereof and computer storage medium - Google Patents


Info

Publication number
CN114253266A
CN114253266A (application number CN202111564238.3A)
Authority
CN
China
Prior art keywords
edge
boundary
mobile robot
position point
target boundary
Prior art date
Legal status
Withdrawn
Application number
CN202111564238.3A
Other languages
Chinese (zh)
Inventor
于行尧
张展鹏
邓文钧
龙有炼
Current Assignee
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202111564238.3A priority Critical patent/CN114253266A/en
Publication of CN114253266A publication Critical patent/CN114253266A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a mobile robot, an edgewise moving method thereof, and a computer storage medium. The method includes: acquiring edge information in a map scene based on map scene information; selecting a target boundary in the map scene by using the edge information; determining an edgewise starting position point of the mobile robot by using the target boundary; moving the mobile robot along a planned moving path; rotating the mobile robot once it has moved to where the linear distance between the mobile robot and the target boundary reaches a first preset distance; stopping the rotation once the linear distance between a distance sensor on the mobile robot and the target boundary reaches a second preset distance; and controlling the mobile robot to move edgewise along the target boundary from the edgewise starting position point. By screening a suitable target boundary through the edge information, the method can automatically identify the appropriate boundary in the map scene to serve as the reference position for edgewise movement, improving the accuracy of the mobile robot's edgewise movement.

Description

Mobile robot, edgewise moving method thereof and computer storage medium
Technical Field
The present disclosure relates to the field of robot application technologies, and in particular, to a mobile robot, a method for moving the mobile robot along an edge, and a computer storage medium.
Background
In recent years, mobile robots, especially sweeping robots, have gradually entered many households and become common appliances and helpers in daily life. The sweeping-robot industry is developing vigorously, and research on its common enabling technologies is increasingly important.
One of the core tasks of a sweeping robot is full-coverage sweeping of the entire area to be cleaned, that is, moving by itself so that its effective cleaning area covers the whole area. To improve the cleaning effect, the sweeping robot needs different cleaning modes in different parts of a room: for example, a normal mode in open areas, and increased vacuum suction at wall edges and corners where dust easily accumulates.
A traditional sweeping robot using a collision-ring (bumper) sensor keeps moving forward after edgewise behavior is started until it collides with an object and the sensor signal is triggered. After the collision it is considered to have reached the boundary position, and it then uses the measurement of a side distance sensor to control its distance to the boundary while following the edge. When an obstacle such as a box stands in the middle of the room, this scheme can cause the sweeping robot to follow only the edges of the box rather than the wall, reducing the accuracy of edgewise sweeping.
Disclosure of Invention
The application provides a mobile robot, a method for moving the mobile robot along an edge and a computer storage medium.
One technical solution adopted by the present application is to provide an edgewise moving method, including:
acquiring edge information in a map scene based on the map scene information;
selecting a target boundary in the map scene by using the edge information, wherein the target boundary is a closed boundary of the outer edge of the current working area of the mobile robot in the map scene;
determining an edgewise starting position point of the mobile robot by using the target boundary;
moving the mobile robot to the edge starting position point;
controlling the mobile robot to move along the edge from the edge starting position point along the target boundary;
the linear distance between the starting position point of the edge and the target boundary is a first preset distance;
the moving the mobile robot to the edge start position point includes:
acquiring a moving path of the mobile robot by using a preset planning algorithm;
moving the mobile robot according to the moving path;
when the mobile robot has moved to where the linear distance between the mobile robot and the target boundary reaches the first preset distance, rotating the mobile robot and judging whether the linear distance between a distance sensor on the mobile robot and the target boundary reaches a second preset distance;
and stopping rotating the mobile robot under the condition that the linear distance between the distance sensor on the mobile robot and the target boundary reaches a second preset distance.
In this manner, a suitable target boundary is screened from the edge information: the closed boundary of the outer edge of the mobile robot's current working area in the map scene is taken as the target boundary for edgewise movement. The appropriate boundary in the map scene can thus be judged automatically and used as the reference position for edgewise movement, effectively reducing cases where the mobile robot follows the boundary of an obstacle inside the working area and misses the outer edge such as a wall, and improving the accuracy of edgewise movement. In addition, whether the mobile robot has reached the edgewise starting position point is determined through the first preset distance and the second preset distance, so that the mobile robot starts edgewise movement exactly at the edgewise starting position point.
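The start-point positioning procedure above can be sketched as a small state machine; the function name, state names, and tolerance below are illustrative assumptions, not taken from the patent:

```python
# Sketch of the start-point alignment procedure: follow the planned path until the
# robot body is d1 from the target boundary, rotate in place until the side distance
# sensor reads d2, then begin edgewise following.

def alignment_step(state, robot_to_boundary, sensor_to_boundary, d1, d2, tol=0.01):
    """Return the next state given the current distance readings (all hypothetical names)."""
    if state == "MOVE":
        # Keep moving along the planned path until the robot is d1 from the boundary.
        return "ROTATE" if abs(robot_to_boundary - d1) <= tol else "MOVE"
    if state == "ROTATE":
        # Rotate until the distance sensor reports d2 to the boundary, then stop.
        return "EDGE_FOLLOW" if abs(sensor_to_boundary - d2) <= tol else "ROTATE"
    return state  # EDGE_FOLLOW: edgewise movement has started
```

For example, with d1 = 0.30 m and d2 = 0.35 m, the state stays `"MOVE"` until the body distance reaches 0.30 m, then stays `"ROTATE"` until the sensor distance reaches 0.35 m.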
Wherein the determining an edge start position point of the mobile robot using the target boundary comprises:
acquiring a current position point of the mobile robot;
taking the projection of the current position point on the target boundary as a reference position point;
determining an edgewise start position point of the mobile robot based on the current position point and the reference position point.
In this manner, the current position point and the reference position point of the mobile robot are obtained, and a reachable edgewise starting position point is planned, ensuring the practicability of the mobile robot's edgewise moving work.
Wherein the using the projection of the current position point on the target boundary as the reference position point comprises:
acquiring a straight line extraction result of the target boundary, wherein the straight line extraction result comprises a plurality of straight line segments;
selecting the straight line segment with the longest length from the plurality of straight line segments as a reference straight line segment;
and taking the projection of the current position point on the reference straight-line segment as the reference position point.
In this manner, straight-line fitting is performed on the target boundary, so that the scattered boundary points are fitted onto straight-line segments, which improves the accuracy of the reference position point. Selecting the longest straight-line segment as the reference straight-line segment to determine the reference position point also improves the stability of the edgewise-distance control during subsequent edgewise movement, and therefore the efficiency of edgewise movement.
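As an illustration of this projection step, the following sketch selects the longest extracted segment and projects the current position point onto it; the function name and segment representation are assumptions, as the patent does not prescribe an implementation:

```python
import math

def project_onto_longest_segment(point, segments):
    """segments: list of ((x1, y1), (x2, y2)); returns the reference position point."""
    # Reference straight-line segment: the longest of the extracted segments.
    (ax, ay), (bx, by) = max(segments, key=lambda s: math.dist(s[0], s[1]))
    px, py = point
    dx, dy = bx - ax, by - ay
    # Parameter t of the orthogonal projection, clamped to [0, 1] so the
    # reference position point stays on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return (ax + t * dx, ay + t * dy)
```

Clamping matters when the robot stands beyond either endpoint of the reference segment: the reference position point then snaps to the nearer endpoint.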
Wherein, the edge moving method further comprises:
acquiring a first preset distance between the starting position point of the edge and the target boundary;
acquiring structural parameters of the mobile robot and the installation angle of the distance sensor;
and acquiring the second preset distance by using the first preset distance, the structural parameters and the installation angle.
In this manner, the mobile robot is calibrated through the distance information of the distance sensor, improving the accuracy of its position calibration.
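The patent does not state the formula relating the two preset distances. One plausible geometric sketch, assuming the distance sensor sits at a lateral offset from the robot centre (a structural parameter) and its ray leaves at a mounting angle measured from the boundary normal, is:

```python
import math

def second_preset_distance(d1, sensor_offset, mount_angle):
    """Hedged sketch, not the patent's formula: if the robot centre is d1 from the
    boundary and the sensor is sensor_offset closer to it, the perpendicular
    sensor-to-boundary distance is d1 - sensor_offset; a ray tilted mount_angle
    radians from the boundary normal then has length (d1 - offset) / cos(angle)."""
    return (d1 - sensor_offset) / math.cos(mount_angle)
```

For instance, with d1 = 0.30 m, a 0.10 m offset, and a 60-degree mounting angle, the sensor should read about 0.40 m when the robot is correctly aligned.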
Wherein the controlling the mobile robot to perform the edgewise moving work along the target boundary from the edgewise start position point further includes:
when the linear distance between the mobile robot and another boundary in its advancing direction reaches the first preset distance, rotating the mobile robot until the linear distance between the distance sensor and that boundary reaches the second preset distance, and then continuing edgewise movement along that boundary; where the other boundary is another partial boundary of the target boundary that intersects the partial boundary along which the mobile robot currently moves.
In this manner, the boundary-alignment procedure is executed in a loop, so that the mobile robot alternates smoothly between the two states of edgewise movement and corner rotation, improving the stability of edgewise movement.
Wherein the controlling the mobile robot to perform the edgewise moving work along the target boundary from the edgewise start position point further includes:
determining a boundary of an obstacle when an obstacle is detected in the forward direction of the mobile robot;
judging whether the boundary of the obstacle is overlapped with the target boundary;
updating the target boundary with a boundary of the obstacle if the boundary of the obstacle overlaps the target boundary;
and controlling the mobile robot to move along the edge along the updated target boundary.
In this manner, the mobile robot can detect in real time whether an obstacle exists in its forward direction during edgewise movement, and can update the target boundary in time when the obstacle overlaps the target boundary, so that the edgewise moving work proceeds smoothly: the stability of edgewise movement is improved, and the obstacle is bypassed and avoided in time.
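A much-simplified grid-based sketch of this overlap test and boundary update; the cell-set representation and the symmetric-difference merge are illustrative assumptions, not the patent's method:

```python
# Boundaries are modelled as sets of occupied grid cells. If the obstacle boundary
# shares cells with the target boundary, the robot should follow the merged outline;
# the symmetric difference drops the shared wall section and keeps the detour
# around the obstacle.
def update_target_boundary(target_cells, obstacle_cells):
    if target_cells & obstacle_cells:            # overlap detected
        return target_cells ^ obstacle_cells     # merged outline, shared cells removed
    return target_cells                          # free-standing obstacle: no update
```

A free-standing obstacle (no shared cells) leaves the target boundary untouched, matching the rule that the update happens only when the boundaries overlap.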
Wherein the selecting a target boundary in the map scene using the edge information comprises:
extracting a plurality of edge boundaries in the edge information;
setting a first weight for each edge boundary according to the boundary length of each edge boundary, wherein the boundary length and the first weight are in a positive correlation relationship;
setting a second weight for the corresponding edge boundary according to the included angle corresponding to the internal turning point of each edge boundary;
and taking the edge boundary with the largest sum of the first weight and the second weight in the plurality of edge boundaries as the target boundary.
In this manner, by assigning the first weight and the second weight to each edge boundary, the likelihood that an edge boundary is a suitable target boundary is evaluated through the weight sum, so that the mobile robot selects the optimal boundary for edgewise moving work.
The obtaining of the edge information in the map scene based on the map scene information includes:
acquiring a map scene image based on the map scene information;
filtering the map scene image by adopting morphological filtering;
and acquiring the edge information of the filtered map scene image by adopting a preset edge detection algorithm.
In this manner, the boundaries in the map scene can be closed through the morphological filtering operation, improving the feasibility of edgewise moving work.
Another technical solution adopted by the present application is to provide a mobile robot, which includes an edge acquisition module, a boundary acquisition module, and an edgewise movement module; wherein:
the edge acquisition module is used for acquiring edge information in a map scene based on the map scene information;
the boundary acquisition module is used for selecting a target boundary in the map scene by using the edge information, wherein the target boundary is a closed boundary of the outer edge of the current working area of the mobile robot in the map scene;
the edgewise moving module is used for determining an edgewise starting position point of the mobile robot by utilizing the target boundary; moving the mobile robot to the edge starting position point; controlling the mobile robot to move along the edge from the edge starting position point along the target boundary;
the edgewise moving module is further used for acquiring a moving path of the mobile robot by using a preset planning algorithm; moving the mobile robot according to the moving path; when the mobile robot moves to the condition that the linear distance between the mobile robot and the target boundary reaches the first preset distance, rotating the mobile robot, and judging whether the linear distance between a distance sensor on the mobile robot and the target boundary reaches a second preset distance or not; stopping rotating the mobile robot when the linear distance between the distance sensor on the mobile robot and the target boundary reaches a second preset distance; and the linear distance between the starting point of the edge and the target boundary is a first preset distance.
The boundary acquisition module is further used for acquiring the current position point of the mobile robot; taking the projection of the current position point on the target boundary as a reference position point; determining an edgewise start position point of the mobile robot based on the current position point and the reference position point.
The boundary acquisition module is further configured to acquire a straight line extraction result of the target boundary, where the straight line extraction result includes a plurality of straight line segments; selecting the straight line segment with the longest length from the plurality of straight line segments as a reference straight line segment; and taking the projection of the current position point on the reference straight-line segment as the reference position point.
The edge moving module is further used for acquiring a first preset distance between the edge starting position point and the target boundary; acquiring structural parameters of the mobile robot and the installation angle of the distance sensor; and acquiring the second preset distance by using the first preset distance, the structural parameters and the installation angle.
The edgewise moving module is further configured to rotate the mobile robot when the linear distance between the mobile robot and another boundary in its advancing direction reaches the first preset distance, until the linear distance between the distance sensor and that boundary reaches the second preset distance, and then continue edgewise moving work along that boundary; where the other boundary is another partial boundary of the target boundary that intersects the partial boundary along which the mobile robot currently moves.
The edgewise moving module is further used for determining the boundary of the obstacle when the obstacle is detected to exist in the advancing direction of the mobile robot; judging whether the boundary of the obstacle is overlapped with the target boundary; updating the target boundary with a boundary of the obstacle if the boundary of the obstacle overlaps the target boundary; and controlling the mobile robot to move along the edge along the updated target boundary.
The boundary acquisition module is further used for extracting a plurality of edge boundaries in the edge information; setting a first weight for each edge boundary according to the boundary length of each edge boundary, wherein the boundary length and the first weight are in a positive correlation relationship; setting a second weight for the corresponding edge boundary according to the included angle corresponding to the internal turning point of each edge boundary; and taking the edge boundary with the largest sum of the first weight and the second weight in the plurality of edge boundaries as the target boundary.
The edge obtaining module is further used for obtaining a map scene image based on the map scene information; filtering the map scene image by adopting morphological filtering; and acquiring the edge information of the filtered map scene image by adopting a preset edge detection algorithm.
Another technical solution adopted by the present application is to provide another mobile robot, including a memory and a processor coupled to the memory;
wherein the memory is adapted to store program data and the processor is adapted to execute the program data to implement the edgewise moving method as described above.
Another technical solution adopted by the present application is to provide a computer storage medium, where the computer storage medium is used to store program data, and the program data is used to implement the above-mentioned edgewise moving method when being executed by a computer.
The beneficial effect of this application is: the mobile robot acquires edge information in a map scene based on the map scene information; selects a target boundary in the map scene by using the edge information; determines an edgewise starting position point by using the target boundary; moves according to the moving path; rotates once it has moved to where the linear distance between the mobile robot and the target boundary reaches a first preset distance; stops rotating once the linear distance between the distance sensor on the mobile robot and the target boundary reaches a second preset distance; and moves edgewise along the target boundary from the edgewise starting position point. By screening a suitable target boundary through the edge information, namely taking the closed boundary of the outer edge of the mobile robot's current working area in the map scene as the target boundary for edgewise movement, the appropriate boundary in the map scene can be judged automatically and used as the reference position for edgewise movement. This effectively reduces cases where the mobile robot follows the boundary of an internal obstacle in the working area and misses the outer edge such as a wall, improving the accuracy of the mobile robot's edgewise movement.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of an edge moving method provided herein;
FIG. 2 is a schematic diagram of an embodiment of a map scene image provided by the present application;
FIG. 3 is a schematic flow chart of specific sub-steps of step 13 shown in FIG. 1;
fig. 4 is a schematic view of a sweeping robot provided by an embodiment of the present application at a corner rotation position;
FIG. 5 is a schematic diagram of an embodiment of a mobile robot provided herein;
FIG. 6 is a schematic diagram of another embodiment of a mobile robot provided herein;
FIG. 7 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic flow chart of an embodiment of the edgewise moving method provided in the present application. The edgewise moving method according to the embodiment of the present application is applicable to a mobile robot (for example, a cleaning robot such as a sweeping robot, or another type of mobile robot such as a transfer robot), to a processing system mounted on the mobile robot, or to a control system external to the mobile robot. The following description takes a sweeping robot as an example.
As shown in fig. 1, the edge moving method according to the embodiment of the present application may specifically include the following steps:
step S11: and acquiring edge information in the map scene based on the map scene information.
In the embodiment of the application, the sweeping robot acquires map scene information of its environment by using a laser radar and/or an RGBD (Red, Green, Blue, Depth) camera, where the map scene information includes obstacle information, wall information, and the like of the environment. Edge information in the map scene is then extracted from the map scene information and used to select the target boundary in the map scene. In other embodiments, the sweeping robot may also obtain the edge information in the map scene through pre-stored map information or through an external device (e.g., a server or an external camera).
Specifically, the sweeping robot generates a map scene image based on the map scene information; for example, on a grid-map image generated by the sweeping robot, an obstacle model/obstacle boundary is generated from the obstacle information, a wall boundary is generated from the wall information, and so on. The map scene image includes at least the outer-edge boundary of the map scene, obstacles, and the like. Due to occlusion, signal noise, and similar influences, a map scene image generated directly from the map scene information may contain noise points and gaps in parts of the boundaries. Because of these factors, the sweeping robot may have a large error when performing edgewise moving work such as an edgewise sweeping task, or may even be unable to perform edgewise moving work at all because of missing boundaries.
In contrast, the embodiment of the present application proposes to perform filtering processing on a map scene image by using a morphological filtering technology to remove noise points on the map scene image and to seal existing boundary gaps.
Specifically, the sweeping robot binarizes the map scene image, setting grid pixels that represent obstacles and boundaries to 0 and pixels that represent free space to 1. The sweeping robot then performs a morphological closing operation on the binary map scene image, that is, dilation followed by erosion; this closing operation filters out small local obstacle grids on the map scene image. Finally, the sweeping robot performs a morphological opening operation on the map scene image after the closing operation, that is, erosion followed by dilation; the opening operation seals unclosed boundaries on the map scene image to form closed boundaries. Referring to fig. 2, fig. 2 is a schematic view of an embodiment of a map scene image after morphological filtering. As shown in fig. 2, the map scene image after the morphological filtering process forms well-sealed boundaries, such as the boundary B shown in fig. 2; the area A in fig. 2 represents an obstacle, and the area C represents the non-working area of the map scene outside the current working area of the mobile robot.
Further, the sweeping robot can acquire edge information in a map scene by adopting a preset edge detection algorithm. For example, the position of the edge in the map scene image is acquired using a preset edge detection algorithm.
In a possible implementation, the sweeping robot acquires the edge information of the morphologically filtered map scene image with a preset edge detection algorithm. Specifically, after the morphological filtering, the boundaries of the map scene image have been repaired and small obstacle grids have been filtered out, providing good image material for edge extraction. The sweeping robot can use mature edge detection algorithms such as the Sobel operator or the Canny edge detector to acquire the edge information in the map scene image. In the embodiment of the present application, the edge information may be several closed boundaries in the map scene image, such as the boundary B and the boundary formed by the edge of the obstacle A in fig. 2.
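A minimal NumPy sketch of this pipeline (binarised free-space mask, closing then opening with a 3x3 window, then a Sobel gradient magnitude as the edge map). In practice mature implementations such as OpenCV's `morphologyEx` and `Canny` would be used; this pure-NumPy version is only illustrative:

```python
import numpy as np

def _window_op(img, op):
    # Apply max (dilation) or min (erosion) over each 3x3 neighbourhood.
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    stack = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return op(np.stack(stack), axis=0)

def close_then_open(binary):
    # On the free-space mask (free space = 1, obstacles/boundaries = 0, as in the
    # description): closing = dilate then erode, removing small obstacle specks;
    # opening = erode then dilate, sealing gaps in boundaries.
    closed = _window_op(_window_op(binary, np.max), np.min)
    return _window_op(_window_op(closed, np.min), np.max)

def sobel_magnitude(img):
    # Sobel gradient magnitude as a simple edge map.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    h, w = img.shape
    p = np.pad(img.astype(float), 1, mode="edge")
    gx = sum(kx[i, j] * p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    gy = sum(kx.T[i, j] * p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return np.hypot(gx, gy)
```

An isolated obstacle pixel in an otherwise free 5x5 grid is removed by the closing step, and the Sobel map is zero on flat regions and nonzero across a step edge.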
Step S12: object boundaries in a map scene are selected using edge information.
In the embodiment of the application, because obstacles such as large boxes may exist in the environment where the sweeping robot is located (for example, a room), there are generally multiple edge extraction results in the edge information, including closed boundaries of obstacles and the wall boundary at the outer edge of the robot's current working area. The embodiment of the present application therefore provides a rule for extracting, from the multiple edge boundaries, the closed boundary of the outer edge of the robot's current working area in the map scene as the target boundary for edgewise movement. The map scene can include the robot's current working area together with other areas; for example, the map scene may be a map of a whole house, including rooms, a living room, a kitchen, and other areas, with the current working area set to one room or to a room plus the living room. The map scene may also coincide with the robot's current working area, such as a whole house or a single room. For example, the sweeping robot may compare the lengths of the edge boundaries and take the longest one as the target boundary; or it may take the edge boundary enclosing the largest area as the target boundary; or it may compare the distances between boundary points within each edge boundary and take the edge boundary containing the pair of boundary points farthest apart as the target boundary.
In a possible implementation manner, one of the rules provided in the embodiment of the present application is as follows:
the sweeping robot extracts a plurality of edge boundaries from the edge information and sets a first weight WI for each edge boundary according to its boundary length, wherein the boundary length and the first weight WI are positively correlated; that is, the longer the boundary length of an edge boundary, the larger its first weight WI.
In addition, for each edge boundary, the sweeping robot detects boundary points where the image gradient is discontinuous; in the edgewise moving scene, these boundary points correspond to turning points of the boundary (such as wall corners). The sweeping robot sets a second weight WA for each edge boundary according to the included angles at the internal turning points of that edge boundary, assigning a larger weight to edge boundaries whose included angles are close to a preset angle (such as 90 degrees).
And finally, the sweeping robot takes the edge boundary with the largest sum of the first weight WI and the second weight WA in the plurality of edge boundaries as a target boundary.
For example, since there may be obstacles with a large volume, such as boxes, in a room, the plurality of edge boundaries in the edge information extracted by the sweeping robot includes both wall edge boundaries and obstacle edge boundaries. The length of the wall is greater than the length of an obstacle, so the sweeping robot can set the first weight WI according to boundary length; the first weight of a wall edge boundary is then greater than that of an obstacle edge boundary, and obstacle edge boundaries are screened out through the first weight WI.
Further, the sweeping robot can acquire the internal turning points of an edge boundary in a sliding-window manner. For example, the sweeping robot slides a 5×5 window along the edge boundary, collects a plurality of edge subsets, and calculates the intersection angle between adjacent edge subsets; a boundary point shared by adjacent edge subsets with a large intersection angle is an internal turning point. The closer the intersection angle corresponding to an internal turning point is to 90°, the greater the second weight WA assigned.
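The scoring rule above (a first weight from boundary length, a second weight from corner angles near the preset 90° value) can be sketched as follows. The patent does not fix the exact weight formulas, so the perimeter normalization and the per-corner scoring below are illustrative assumptions:

```python
import math

def select_target_boundary(boundaries, preferred_angle=90.0):
    """Score each closed boundary (list of (x, y) vertices) by WI
    (normalized perimeter) plus WA (corner angles near the preferred
    angle) and return the highest-scoring boundary."""

    def perimeter(b):
        return sum(math.dist(b[i], b[(i + 1) % len(b)]) for i in range(len(b)))

    def corner_score(b):
        score, n = 0.0, len(b)
        for i in range(n):
            p, q, r = b[i - 1], b[i], b[(i + 1) % n]
            v1 = (p[0] - q[0], p[1] - q[1])
            v2 = (r[0] - q[0], r[1] - q[1])
            n1, n2 = math.hypot(*v1), math.hypot(*v2)
            if n1 == 0.0 or n2 == 0.0:
                continue  # degenerate corner, skip
            cos_t = max(-1.0, min(1.0, (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)))
            angle = math.degrees(math.acos(cos_t))
            # larger contribution the closer this corner is to the preferred angle
            score += max(0.0, 1.0 - abs(angle - preferred_angle) / 90.0)
        return score

    longest = max(perimeter(b) for b in boundaries)
    return max(boundaries, key=lambda b: perimeter(b) / longest + corner_score(b))
```

With this scoring, a long wall boundary with right-angle corners outranks a small obstacle boundary, matching the screening behavior described in the text.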
Therefore, the optimal edge boundary is selected by setting the first weight and the second weight, and the length information of the edge boundary and the internal turning point included angle information can be considered at the same time, so that the accuracy of target boundary selection is improved, and the selection of a proper edge moving boundary in a complex map scene is facilitated.
Step S13: performing edgewise movement according to the target boundary.
In the embodiment of the application, after the target boundary is determined, the sweeping robot can start edgewise movement according to the target boundary. Specifically, the sweeping robot may select, randomly or according to a preset rule, a position close to the target boundary (for example, at a distance from the target boundary less than a predetermined value) as a starting point, and move in a direction parallel to the target boundary. While moving edgewise, the sweeping robot can be controlled to follow the target boundary by combining the map scene with the position of the sweeping robot. For example, the position of the sweeping robot is located using information collected by its camera and sensors, and the moving direction and speed of the sweeping robot are controlled by a PID (proportional-integral-derivative) algorithm in combination with the target boundary in the map scene, so that the distance between the sweeping robot and the target boundary stays within a preset range.
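The PID distance-keeping loop mentioned above might look like the following one-dimensional sketch, where the controller output is treated as a lateral velocity toward or away from the boundary. The gains, the plant model and all names are illustrative assumptions, not values from the patent:

```python
class PID:
    """Textbook proportional-integral-derivative controller."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_err = None

    def update(self, measured, dt):
        err = self.setpoint - measured
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Illustrative simulation: the lateral distance to the boundary converges
# from 0.4 m to the 0.2 m setpoint (the "preset range" of the text).
pid = PID(kp=1.5, ki=0.0, kd=0.05, setpoint=0.2)
d, dt = 0.4, 0.05
for _ in range(200):
    d += pid.update(d, dt) * dt  # controller output used as lateral velocity
```

A real controller would act on wheel speeds rather than directly on the distance, but the error-to-correction structure is the same.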
Further, since the initial position of the sweeping robot may be some distance from the target boundary, the sweeping robot needs to move to the edgewise starting position point corresponding to the target boundary before the edgewise moving work begins, so as to ensure accurate execution of the edgewise moving work. The edgewise starting position point is the starting point from which the sweeping robot executes the edgewise moving task; the sweeping robot therefore needs to determine the edgewise starting position point first in order to position itself accurately. Specifically, with continued reference to fig. 3, fig. 3 is a schematic flow chart of specific sub-steps of step S13 shown in fig. 1.
As shown in fig. 3, the edge moving method according to the embodiment of the present application may specifically include the following steps:
Step S131: determining an edgewise starting position point of the mobile robot by using the target boundary.
In the embodiment of the application, the sweeping robot acquires its current position point and takes the projection of the current position point on the target boundary as the reference position point. The projection direction is along the perpendicular line connecting the current position point and the target boundary. Generally, the target boundary may include multiple intersecting line segments, each serving as a partial target boundary; the partial target boundary used to locate the reference position point may be the one closest to the current position point of the sweeping robot. Alternatively, the sweeping robot may select the longest partial target boundary as the one used to locate the reference position point.
As shown in fig. 2, for the sweeping robot D in fig. 2, the linear distance from the upper target boundary is smaller than the linear distance from the left target boundary, so the sweeping robot D can use the upper target boundary to locate the reference position point.
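Taking the reference position point as the perpendicular projection of the current position point reduces to a standard point-to-segment projection; the function name and the clamping to the segment endpoints below are illustrative:

```python
def project_onto_segment(p, a, b):
    """Project point p onto segment ab, clamping to the segment's endpoints."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:  # degenerate segment
        return a
    t = ((px - ax) * dx + (py - ay) * dy) / seg_len2
    t = max(0.0, min(1.0, t))  # stay on the segment
    return (ax + t * dx, ay + t * dy)
```

To pick the partial target boundary, one would apply this to each segment of the target boundary and keep the projection with the smallest distance to the current point, matching the "closest partial boundary" rule described above.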
Optionally, the reference position point and the current position point are connected; the sweeping robot scans along this connecting line from the reference position point toward the current position point and, among the reachable position points whose distance from the target boundary is the specified distance, takes the one closest to the current position point as the edgewise starting position point.
Optionally, the reference position point and the current position point are connected; the sweeping robot scans along this connecting line from the reference position point toward the current position point and, among the reachable position points whose distance from the target boundary is the specified distance, takes the one closest to the reference position point as the edgewise starting position point.
Optionally, the reference position point and the current position point are connected; the sweeping robot scans from the reference position point toward the current position point and takes the first position point that lies in the reachable space of the sweeping robot and whose distance from the target boundary is the specified distance, namely the first preset distance, as the edgewise starting position point. The specified distance is set so that the sweeping robot stays as close to the target boundary as possible while keeping a certain distance from it during edgewise movement, reducing the chance that the sweeping robot is blocked by colliding with the target boundary.
The reachable space of the sweeping robot is the area of the map scene in which the sweeping robot can move. For example, as shown in fig. 2, the area enclosed by the closed boundary B is reachable space, except for the position points inside area A and on the boundary of A; because of the obstacle, the sweeping robot cannot reach or traverse the position points of area A.
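One of the optional scans above, taking the first reachable position at the specified distance while walking from the reference point toward the current point, might be sketched as follows. Since the reference point is the perpendicular projection, distance traveled along the connecting line equals distance from the boundary. The step size, the reachability callback and all names are illustrative assumptions:

```python
import math

def edgewise_start_point(ref, cur, reachable, dist, step=0.05):
    """Walk from the reference point (on the boundary) toward the current
    point; return the first reachable point whose distance from the
    boundary reaches `dist` (the first preset distance)."""
    vx, vy = cur[0] - ref[0], cur[1] - ref[1]
    norm = math.hypot(vx, vy)
    if norm == 0.0:
        return None  # robot is already at its projection on the boundary
    vx, vy = vx / norm, vy / norm
    t = dist
    while t <= norm:
        p = (ref[0] + vx * t, ref[1] + vy * t)
        if reachable(p):
            return p
        t += step
    return None
```

Swapping the scan direction or the tie-breaking rule yields the other two optional variants described above.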
Optionally, the sweeping robot may further perform Hough Transform (Hough Transform) on the selected target boundary to obtain a straight line extraction result, where the straight line extraction result includes a plurality of straight line segments. The sweeping robot selects the straight line segment with the longest length from the extracted straight line segments as a reference straight line segment, and then takes the projection of the current position point on the reference straight line segment as a reference position point.
Because the target boundary is usually generated based on map scene information acquired by a laser radar and/or an RGBD camera, the signal points may not be perfectly smooth, and the target boundary is actually formed by fitting a plurality of straight line segments. By adopting the longest straight line segment as the reference straight line segment, the sweeping robot can reduce the influence of straight line segments formed by noise points on locating the reference position point.
Step S132: moving the mobile robot to the edgewise starting position point.
In the embodiment of the application, after the edgewise starting position point is determined, the sweeping robot obtains an effective moving path from the current position point to the edgewise starting position point by using a path planning algorithm such as an A* algorithm, a D* algorithm or a genetic algorithm. An effective moving path is one whose entire length lies within the reachable space of the sweeping robot, that is, a path containing no obstacle position points and no boundary position points. The sweeping robot then moves to the edgewise starting position point along the effective moving path.
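A graph search over the occupancy grid is the core of such a planner. The breadth-first sketch below is a simplified stand-in for the A* or D* planners named in the text (on an unweighted grid it returns the same shortest path as A*); the grid encoding and names are illustrative assumptions:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path over free cells (0 = free, 1 = obstacle),
    or None if the goal is unreachable."""
    h, w = len(grid), len(grid[0])
    prev = {start: None}
    q = deque([start])
    while q:
        cur = q.popleft()
        if cur == goal:
            path = []
            while cur is not None:  # walk the predecessor chain back
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < h and 0 <= nc < w and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cur
                q.append((nr, nc))
    return None
```

Every cell of the returned path lies in the reachable space, which is exactly the "effective moving path" condition stated above.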
After moving to the starting point, the orientation of the sweeping robot may not meet the requirement of edgewise movement; for example, if the orientation is not parallel to the target boundary and the sweeping robot directly continues to move, it will collide with the boundary. Therefore, the sweeping robot usually needs to rotate until its orientation angle meets the requirement of edgewise movement, for example by adjusting its orientation to be parallel to the target boundary.
After the sweeping robot reaches the edgewise starting position point, it can detect whether its own orientation meets the requirement of edgewise movement; if so, no rotation is needed; if not, it rotates until the requirement is met.
Specifically, when the sweeping robot moves until its linear distance to the target boundary reaches a first preset distance, the sweeping robot is rotated so that the linear distance between a distance sensor on the sweeping robot and the target boundary reaches a second preset distance, at which point the orientation of the robot meets the requirement of edgewise movement. For example, please refer to fig. 4, which is a schematic view of the sweeping robot at a rotated angular position according to the embodiment of the present application. The reference numerals in fig. 4 are as follows: sweeping robot E, target boundary G and distance sensor F.
For example, when the sweeping robot is located at the edgewise starting position point, it should satisfy the following: the shortest distance between the sweeping robot and the target boundary is l, the distance between the distance sensor and the target boundary is t, and the relation between t and l is given by formula (1):
t = l + r - r·cos(a)    (1)
wherein l is the first preset distance, t is the second preset distance, r is a structural parameter of the sweeping robot, such as its radius, and a is the mounting angle of the distance sensor.
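Formula (1) follows from the geometry of fig. 4: the sensor sits on the robot's circular body at mounting angle a, so its offset beyond the robot's nearest point to the wall is r - r·cos(a). A minimal sketch, with illustrative variable names and sample values:

```python
import math

def second_preset_distance(l, r, a_deg):
    """Equation (1): t = l + r - r*cos(a), mounting angle given in degrees."""
    return l + r - r * math.cos(math.radians(a_deg))
```

With a sensor aimed straight at the wall (a = 0) the correction vanishes and t = l; with a = 90° the expected sensor reading grows to l + r.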
When the sweeping robot moves until its linear distance to the target boundary reaches the first preset distance, the sweeping robot can be rotated so that the relation between the shortest distance from the sweeping robot to the target boundary and the distance from the distance sensor to the target boundary conforms to formula (1); the orientation angle of the sweeping robot then meets the requirement of edgewise movement, and the edgewise movement can begin.
Step S133: controlling the mobile robot to perform edgewise movement along the target boundary from the edgewise starting position point.
In the embodiment of the application, after the sweeping robot reaches the edgewise starting position point, it can sweep forward while keeping the first preset distance from the target boundary by using a PID control algorithm. When sensors capable of sensing the forward distance, such as a laser radar or an RGBD sensor, detect that the distance from the front end of the sweeping robot to an obstacle or to the boundary has reached the first preset distance, that is, a boundary corner has been encountered, the sweeping robot is controlled to stop moving, and the same boundary-alignment procedure is started to adjust the orientation angle of the sweeping robot until it meets the requirement of edgewise movement. Specifically, the sweeping robot is controlled to rotate until the linear distance between the distance sensor and another boundary reaches the second preset distance, and the edgewise movement continues along that other boundary, where the other boundary is another part of the target boundary intersecting the partial boundary along which the sweeping robot currently moves. In this way, the sweeping robot alternates between the two states of edgewise movement and corner rotation, improving the stability of edgewise movement. When the sweeping robot returns to the edgewise starting position point, the edgewise moving work is complete, and the robot is controlled to stop moving, reducing repeated edgewise passes.
Specifically, as shown in fig. 4, when the sweeping robot reaches the corner, the distance between the sweeping robot and the left boundary (the other boundary) has reached the first preset distance, while the orientation angle of the sweeping robot still satisfies the sweeping requirement along the upper boundary (the partial boundary along which it currently moves). At this time, the sweeping robot needs to stop and rotate until its orientation angle meets the sweeping requirement along the left boundary, that is, until the linear distance between its distance sensor and the left boundary reaches the second preset distance.
In a possible implementation manner, after a full circuit of edgewise moving work is completed, the mobile robot may reselect a new target boundary from the boundaries identified in the map scene and perform edgewise movement along it according to the edgewise moving method provided by the embodiment of the application.
In a possible implementation manner, during the edgewise movement of the mobile robot, an obstacle (such as a pet or garbage) may suddenly appear in the area ahead. The mobile robot may determine, according to the obstacle boundary, whether the obstacle boundary overlaps the target boundary; if it does, the mobile robot updates the target boundary using the obstacle boundary and continues the edgewise moving work along the updated target boundary.
For example, after detecting an obstacle ahead, the sweeping robot may mark the position and boundary of the obstacle in the map scene and update the target boundary with the obstacle boundary: the portion where the obstacle boundary overlaps the target boundary is deleted from the target boundary, and the non-overlapping portion is added to the target boundary. After the target boundary is updated, the mobile robot may continue to move along it.
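If both boundaries are represented as sets of occupied grid cells, the update described, deleting the overlapping portion and adding the obstacle's remaining portion, reduces to a symmetric difference. The set-of-cells representation is an illustrative simplification, not the patent's own data structure:

```python
def update_target_boundary(target, obstacle):
    """Drop cells where the obstacle boundary overlaps the target boundary,
    then add the obstacle's non-overlapping cells to the target."""
    overlap = target & obstacle
    return (target - overlap) | (obstacle - overlap)
```

This is exactly `target ^ obstacle`; a real implementation would also need to re-close and re-order the resulting boundary cells into a traversable loop for edge following.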
In this way, the mobile robot can detect in real time whether an obstacle exists in its advancing direction during edgewise movement, and can update the target boundary in time when the obstacle overlaps the target boundary, so that the edgewise moving work proceeds smoothly. This improves the stability of edgewise movement and allows the robot to bypass and avoid obstacles in time.
In the embodiment of the application, a mobile robot such as a sweeping robot acquires edge information in a map scene based on map scene information; selects a target boundary in the map scene by using the edge information, the target boundary being a closed boundary; determines an edgewise starting position point of the mobile robot by using the target boundary; moves to the edgewise starting position point; and performs edgewise movement along the target boundary from the edgewise starting position point. By screening out a suitable target boundary through the edge information and using it as the boundary for edgewise movement, the method and the device can automatically determine a suitable boundary in the map scene as the reference for the edgewise movement of the mobile robot. This improves the accuracy of edgewise movement, for example the accuracy of edgewise sweeping by the sweeping robot, and reduces the influence of large obstacles in the scene on the edgewise moving work.
The above embodiments are only one of the common cases of the present application and do not limit the technical scope of the present application, so that any minor modifications, equivalent changes or modifications made to the above contents according to the essence of the present application still fall within the technical scope of the present application.
With continued reference to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of a mobile robot provided in the present application. The mobile robot 30 includes an edge acquisition module 31, a boundary acquisition module 32, and an edge movement module 33.
The edge obtaining module 31 is configured to obtain edge information in a map scene based on map scene information.
The boundary obtaining module 32 is configured to select a target boundary in the map scene by using the edge information, where the target boundary is a closed boundary of an outer edge of a current working area of the mobile robot in the map scene.
The edgewise moving module 33 is configured to determine an edgewise starting position point of the mobile robot by using the target boundary; moving the mobile robot to the edge starting position point; and controlling the mobile robot to perform the edgewise movement work along the target boundary from the edgewise initial position point.
The boundary obtaining module 32 is further configured to obtain a current location point of the mobile robot; taking the projection of the current position point on the target boundary as a reference position point; determining an edgewise start position point of the mobile robot based on the current position point and the reference position point.
The boundary obtaining module 32 is further configured to obtain a straight line extraction result of the target boundary, where the straight line extraction result includes a plurality of straight line segments; selecting the straight line segment with the longest length from the plurality of straight line segments as a reference straight line segment; and taking the projection of the current position point on the reference straight-line segment as the reference position point.
The edgewise moving module 33 is further configured to obtain a moving path of the mobile robot by using a preset planning algorithm; move the mobile robot according to the moving path; when the mobile robot moves until the linear distance between the mobile robot and the target boundary reaches the first preset distance, rotate the mobile robot and judge whether the linear distance between a distance sensor on the mobile robot and the target boundary reaches a second preset distance; and stop rotating the mobile robot when the linear distance between the distance sensor on the mobile robot and the target boundary reaches the second preset distance.
The edge moving module 33 is further configured to obtain a first preset distance between the edge starting position point and the target boundary; acquiring structural parameters of the mobile robot and the installation angle of the distance sensor; and acquiring the second preset distance by using the first preset distance, the structural parameters and the installation angle.
The edgewise moving module 33 is further configured to, when the linear distance between the distance sensor and the other boundary in the moving direction of the mobile robot reaches the first preset distance, rotate the mobile robot until the linear distance between the distance sensor and the other boundary reaches the second preset distance, and continue to perform the edgewise moving operation along the other boundary; wherein the another boundary is another partial boundary which intersects with a partial boundary along which the mobile robot currently moves, among the target boundaries.
The edgewise moving module 33 is further configured to determine a boundary of the obstacle when the obstacle is detected to exist in the forward direction of the mobile robot; judging whether the boundary of the obstacle is overlapped with the target boundary; updating the target boundary with a boundary of the obstacle if the boundary of the obstacle overlaps the target boundary; and controlling the mobile robot to move along the edge along the updated target boundary.
The boundary obtaining module 32 is further configured to extract a plurality of edge boundaries in the edge information; setting a first weight for each edge boundary according to the boundary length of each edge boundary, wherein the boundary length and the first weight are in a positive correlation relationship; setting a second weight for the corresponding edge boundary according to the included angle corresponding to the internal turning point of each edge boundary; and taking the edge boundary with the largest sum of the first weight and the second weight in the plurality of edge boundaries as the target boundary.
The edge obtaining module 31 is further configured to obtain a map scene image based on the map scene information; filtering the map scene image by adopting morphological filtering; and acquiring the edge information of the filtered map scene image by adopting a preset edge detection algorithm.
With continuing reference to fig. 6, fig. 6 is a schematic structural diagram of another embodiment of the mobile robot provided in the present application. The mobile robot 500 according to the embodiment of the present application includes a processor 51, a memory 52, an input/output device 53, and a bus 54.
The processor 51, the memory 52, and the input/output device 53 are respectively connected to the bus 54, the memory 52 stores program data, and the processor 51 is configured to execute the program data to implement the edgewise moving method according to any of the above embodiments.
In the embodiment of the present application, the processor 51 may also be referred to as a CPU (Central Processing Unit). The processor 51 may be an integrated circuit chip having signal processing capabilities. The processor 51 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor 51 may be any conventional processor or the like.
Please refer to fig. 7, fig. 7 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application, the computer storage medium 600 stores program data 61, and the program data 61 is used to implement the edgewise moving method according to any of the above embodiments when being executed by a processor.
Embodiments of the present application may be implemented as software functional units and, when sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, which is defined by the claims and the accompanying drawings, and the equivalents and equivalent structures and equivalent processes used in the present application and the accompanying drawings are also directly or indirectly applicable to other related technical fields and are all included in the scope of the present application.

Claims (10)

1. An edge moving method based on a mobile robot, the edge moving method comprising:
acquiring edge information in a map scene based on the map scene information;
selecting a target boundary in the map scene by using the edge information, wherein the target boundary is a closed boundary of the outer edge of the current working area of the mobile robot in the map scene;
determining an edgewise starting position point of the mobile robot by using the target boundary;
moving the mobile robot to the edge starting position point;
controlling the mobile robot to move along the edge from the edge starting position point along the target boundary;
the linear distance between the starting position point of the edge and the target boundary is a first preset distance;
the moving the mobile robot to the edge start position point includes:
acquiring a moving path of the mobile robot by using a preset planning algorithm;
moving the mobile robot according to the moving path;
when the mobile robot moves until the linear distance between the mobile robot and the target boundary reaches the first preset distance, rotating the mobile robot, and judging whether the linear distance between a distance sensor on the mobile robot and the target boundary reaches a second preset distance or not;
and stopping rotating the mobile robot under the condition that the linear distance between the distance sensor on the mobile robot and the target boundary reaches a second preset distance.
2. An edge movement method according to claim 1,
the determining an edge start position point of the mobile robot by using the target boundary includes:
acquiring a current position point of the mobile robot;
taking the projection of the current position point on the target boundary as a reference position point;
determining an edgewise start position point of the mobile robot based on the current position point and the reference position point;
the step of taking the projection of the current position point on the target boundary as a reference position point includes:
acquiring a straight line extraction result of the target boundary, wherein the straight line extraction result comprises a plurality of straight line segments;
selecting the straight line segment with the longest length from the plurality of straight line segments as a reference straight line segment;
and taking the projection of the current position point on the reference straight-line segment as the reference position point.
3. An edge movement method according to claim 1,
the edge moving method further includes:
acquiring a first preset distance between the starting position point of the edge and the target boundary;
acquiring structural parameters of the mobile robot and the installation angle of the distance sensor;
and acquiring the second preset distance by using the first preset distance, the structural parameters and the installation angle.
4. An edge movement method according to claim 1,
the controlling the mobile robot to perform the edgewise moving work along the target boundary from the edgewise start position point further includes:
under the condition that it is detected, in the advancing direction of the mobile robot, that the linear distance between the mobile robot and another boundary reaches the first preset distance, rotating the mobile robot until the linear distance between the distance sensor and the another boundary reaches the second preset distance, and continuing the edgewise moving work along the another boundary; wherein the another boundary is another partial boundary, among the target boundary, which intersects with the partial boundary along which the mobile robot currently moves edgewise.
5. The edgewise moving method according to claim 1, wherein the controlling the mobile robot to perform the edgewise moving work along the target boundary from the edgewise start position point further comprises:
determining a boundary of an obstacle in a case where the moving robot forward direction detects the existence of the obstacle;
judging whether the boundary of the obstacle is overlapped with the target boundary;
updating the target boundary with a boundary of the obstacle if the boundary of the obstacle overlaps the target boundary;
and controlling the mobile robot to move along the edge along the updated target boundary.
6. An edge movement method according to any one of claims 1 to 5,
the selecting a boundary of an object in the map scene using the edge information includes:
extracting a plurality of edge boundaries in the edge information;
setting a first weight for each edge boundary according to the boundary length of each edge boundary, wherein the boundary length and the first weight are in a positive correlation relationship;
setting a second weight for the corresponding edge boundary according to the included angle corresponding to the internal turning point of each edge boundary;
and taking the edge boundary with the largest sum of the first weight and the second weight in the plurality of edge boundaries as the target boundary.
7. An edge movement method according to any one of claims 1 to 5,
the obtaining of the edge information in the map scene based on the map scene information includes:
acquiring a map scene image based on the map scene information;
filtering the map scene image by adopting morphological filtering;
and acquiring the edge information of the filtered map scene image by adopting a preset edge detection algorithm.
8. A mobile robot, characterized in that the mobile robot comprises: an edge acquisition module, a boundary acquisition module and an edgewise moving module; wherein:
the edge acquisition module is used for acquiring edge information in a map scene based on the map scene information;
the boundary acquisition module is used for selecting a target boundary in the map scene by using the edge information, wherein the target boundary is a closed boundary of the outer edge of the current working area of the mobile robot in the map scene;
the edgewise moving module is used for determining an edgewise starting position point of the mobile robot by utilizing the target boundary; moving the mobile robot to the edge starting position point; controlling the mobile robot to move along the edge from the edge starting position point along the target boundary;
the edgewise moving module is further used for acquiring a moving path of the mobile robot by using a preset planning algorithm; moving the mobile robot along the moving path; when the mobile robot has moved until the linear distance between the mobile robot and the target boundary reaches the first preset distance, rotating the mobile robot and judging whether the linear distance between a distance sensor on the mobile robot and the target boundary reaches a second preset distance; and stopping rotating the mobile robot when the linear distance between the distance sensor on the mobile robot and the target boundary reaches the second preset distance; wherein the linear distance between the edge starting position point and the target boundary is the first preset distance.
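The start-up behaviour of the edgewise moving module can be simulated in one dimension. Everything concrete here is an assumption for illustration: the boundary is a straight wall at `x = wall_x`, the robot advances in 5 cm steps, the distance sensor looks 90 degrees to the left of the heading, and its reading is the ray distance to the wall; rotation proceeds in 1-degree steps until the reading matches the second preset distance.

```python
import math

def edge_start_alignment(x0, wall_x, first_preset, second_preset,
                         step_deg=1.0, tol=1e-3):
    """Drive toward the wall, then rotate in place until the side-mounted
    sensor reads second_preset. Returns (final x, final heading in degrees)."""
    # 1. Move straight toward the wall until the perpendicular distance
    #    equals the first preset distance (the edge starting position point).
    x = x0
    while wall_x - x > first_preset + 1e-9:
        x = min(x + 0.05, wall_x - first_preset)   # 5 cm steps, clamped
    # 2. Rotate in place. The sensor looks along `heading + 90 deg`; its
    #    reading is the distance along that ray to the wall x = wall_x.
    heading = 0.0                                  # initially facing the wall
    while True:
        sensor_angle = math.radians(heading + 90.0)
        along = math.cos(sensor_angle)             # ray component toward wall
        reading = (wall_x - x) / along if along > 1e-9 else math.inf
        if abs(reading - second_preset) <= tol:
            break                                  # second preset reached: stop
        heading -= step_deg                        # keep turning clockwise
    return x, heading


x, heading = edge_start_alignment(x0=9.0, wall_x=10.0,
                                  first_preset=0.3, second_preset=0.3)
```

With `second_preset` equal to `first_preset`, the rotation stops when the sensor points (nearly) perpendicular to the wall, i.e. the robot ends up roughly parallel to the boundary, ready to start edgewise motion.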
9. A mobile robot, comprising a memory and a processor coupled to the memory;
wherein the memory is configured to store program data, and the processor is configured to execute the program data to implement the edgewise moving method according to any one of claims 1 to 7.
10. A computer storage medium storing program data which, when executed by a computer, implements the edgewise moving method according to any one of claims 1 to 7.
CN202111564238.3A 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium Withdrawn CN114253266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111564238.3A CN114253266A (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111564238.3A CN114253266A (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium
CN202111125877.XA CN113568415B (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202111125877.XA Division CN113568415B (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium

Publications (1)

Publication Number Publication Date
CN114253266A true CN114253266A (en) 2022-03-29

Family

ID=78174495

Family Applications (4)

Application Number Title Priority Date Filing Date
CN202111566186.3A Withdrawn CN114265405A (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium
CN202111564238.3A Withdrawn CN114253266A (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium
CN202111566191.4A Withdrawn CN114253267A (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium
CN202111125877.XA Active CN113568415B (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202111566186.3A Withdrawn CN114265405A (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202111566191.4A Withdrawn CN114253267A (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium
CN202111125877.XA Active CN113568415B (en) 2021-09-26 2021-09-26 Mobile robot, edgewise moving method thereof and computer storage medium

Country Status (1)

Country Link
CN (4) CN114265405A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115016498A (en) * 2022-07-05 2022-09-06 未岚大陆(北京)科技有限公司 Method and device for constructing picture of mower, storage medium and mower

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
CN114617484A (en) * 2021-11-30 2022-06-14 追觅创新科技(苏州)有限公司 Cleaning method of cleaning device, and storage medium
CN114427310A (en) * 2022-02-18 2022-05-03 智橙动力(北京)科技有限公司 Swimming pool edge cleaning method and device, electronic equipment and computer storage medium
CN114442639B (en) * 2022-02-18 2022-09-13 智橙动力(北京)科技有限公司 Swimming pool cleaning robot side control method and device and electronic equipment
CN114663316B (en) * 2022-05-17 2022-11-04 深圳市普渡科技有限公司 Method for determining edgewise path, mobile device and computer storage medium
CN117193278A (en) * 2022-05-31 2023-12-08 深圳市普渡科技有限公司 Method, apparatus, computer device and storage medium for dynamic edge path generation

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
CN104914870B (en) * 2015-07-08 2017-06-16 中南大学 Transfinited the outdoor robot local paths planning method of learning machine based on ridge regression
CN106647765B (en) * 2017-01-13 2021-08-06 深圳拓邦股份有限公司 Planning platform based on mowing robot
CN108143364B (en) * 2017-12-28 2021-02-19 湖南格兰博智能科技有限责任公司 Method for dividing map cleaning area by self-moving cleaning robot
CN108829095B (en) * 2018-05-11 2022-02-08 云鲸智能科技(东莞)有限公司 Geo-fence setting method and method for limiting robot movement
CN109062225A (en) * 2018-09-10 2018-12-21 扬州方棱机械有限公司 The method of grass-removing robot and its generation virtual boundary based on numerical map
CN111240322B (en) * 2020-01-09 2020-12-29 珠海市一微半导体有限公司 Method for determining working starting point of robot movement limiting frame and motion control method
CN111427360B (en) * 2020-04-20 2023-05-05 珠海一微半导体股份有限公司 Map construction method based on landmark positioning, robot and robot navigation system
CN112401758B (en) * 2020-11-16 2021-10-01 上海交通大学 Deformable sweeping robot with corner cleaning mode and control method thereof
CN112476433B (en) * 2020-11-23 2023-08-04 深圳怪虫机器人有限公司 Mobile robot positioning method based on identification array boundary
CN112484718B (en) * 2020-11-30 2023-07-28 海之韵(苏州)科技有限公司 Edge navigation device and method based on environment map correction

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN115016498A (en) * 2022-07-05 2022-09-06 未岚大陆(北京)科技有限公司 Method and device for constructing picture of mower, storage medium and mower
CN115016498B (en) * 2022-07-05 2023-07-11 未岚大陆(北京)科技有限公司 Mower, and image building method and device thereof as well as storage medium
US11917938B2 (en) 2022-07-05 2024-03-05 Willand (Beijing) Technology Co., Ltd. Method for constructing map for mower, storage medium, mower, and mobile terminal

Also Published As

Publication number Publication date
CN114265405A (en) 2022-04-01
CN114253267A (en) 2022-03-29
CN113568415B (en) 2022-01-18
CN113568415A (en) 2021-10-29

Similar Documents

Publication Publication Date Title
CN113568415B (en) Mobile robot, edgewise moving method thereof and computer storage medium
CN108553041B (en) Method for judging trapped robot
CN111104933B (en) Map processing method, mobile robot, and computer-readable storage medium
EP4043988A1 (en) Robot edge treading areal sweep planning method, chip, and robot
US20130118528A1 (en) Robot cleaner and control method thereof
CN112180931B (en) Cleaning path planning method and device of sweeper and readable storage medium
WO2021047348A1 (en) Method and apparatus for establishing passable area map, method and apparatus for processing passable area map, and mobile device
CN111609852A (en) Semantic map construction method, sweeping robot and electronic equipment
CN110680253A (en) Robot edge cleaning method and robot
CN114365974B (en) Indoor cleaning and partitioning method and device and floor sweeping robot
CN111609853A (en) Three-dimensional map construction method, sweeping robot and electronic equipment
CN114431771B (en) Sweeping method of sweeping robot and related device
CN113876246A (en) Control method for visual obstacle avoidance of mechanical arm of intelligent cleaning robot
CN111240322B (en) Method for determining working starting point of robot movement limiting frame and motion control method
CN114967698A (en) Cleaning method, cleaning device, electronic apparatus, and storage medium
Berrio et al. Updating the visibility of a feature-based map for long-term maintenance
CN112045654B (en) Detection method and device for unmanned closed space and robot
CN115444328B (en) Obstacle detection method, cleaning robot and storage medium
WO2023197839A1 (en) Cleaning device control method and apparatus, and computer-readable storage medium
CN115316887B (en) Robot control method, robot, and computer-readable storage medium
CN114617477B (en) Cleaning control method and device for cleaning robot
WO2024022452A1 (en) Method for exploring ground material, cleaning robot, and storage medium
CN117630890A (en) Robot edge control method and method for judging whether TOF sensor is shielded or not
CN116465404A (en) Optimal collision point searching method based on preset detection distance range
CN117330069A (en) Obstacle detection method, path planning method and self-mobile device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220329