CN109645892B - Obstacle identification method and cleaning robot - Google Patents


Info

Publication number: CN109645892B
Application number: CN201811517738.XA
Authority: CN (China)
Other versions: CN109645892A (Chinese-language publication)
Prior art keywords: obstacle, area, sensor, marking, cleaning robot
Inventors: 王晓佳, 谌鎏, 郭盖华
Original and current assignee: Shenzhen LD Robot Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Legal events: application filed by Shenzhen LD Robot Co Ltd; publication of CN109645892A; application granted; publication of CN109645892B

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4008 Arrangements of switches, indicators or the like
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Abstract

The invention belongs to the technical field of computer applications and provides an obstacle identification method and a cleaning robot. The obstacle identification method comprises the following steps: if a sensor detects an obstacle, the pose of the cleaning robot is acquired, the obstacle is marked according to the pose and the sensor that detected it, and a marked area is generated; the confidence corresponding to the marked area is acquired, a detour area is determined according to the confidence, and the detour area is marked on a pre-stored map, the detour area being the area where the obstacle is located. Because the area corresponding to the obstacle is determined automatically from sensor detections and their confidence rather than by manual survey, obstacle identification accuracy is high and labor cost is reduced.

Description

Obstacle identification method and cleaning robot
Technical Field
The invention belongs to the technical field of computer application, and particularly relates to an obstacle identification method and a cleaning robot.
Background
With the development of robotics, more and more intelligent mobile robots are emerging. During operation, a mobile robot needs to avoid obstacles while travelling to a designated area to work.
In the prior art, related personnel typically survey the positions of obstacles in a robot's moving area in advance and store this position information in the mobile robot so that it can avoid them. During such a manual survey, obstacles may be missed, so obstacle identification accuracy is low; moreover, when the moving area of the mobile robot is large, several people are needed to search for obstacles, so labor cost is high.
Disclosure of Invention
In view of this, embodiments of the present invention provide an obstacle identification method and a cleaning robot to solve the prior-art problems of low obstacle identification accuracy and high labor cost caused by manually determining obstacle positions.
A first aspect of the embodiments of the present invention provides an obstacle identification method applied to a cleaning robot that includes at least one sensor and a moving mechanism, the obstacle identification method comprising:
if a sensor detects an obstacle, acquiring the pose of the cleaning robot, and marking the obstacle according to the pose and the sensor that detected the obstacle to generate a marked area;
acquiring the confidence corresponding to the marked area, determining a detour area according to the confidence, and marking the detour area on a pre-stored map;
wherein the detour area is the area where the obstacle is located.
A second aspect of an embodiment of the present invention provides a cleaning robot including at least one sensor and a control module connected to each sensor, respectively;
the control module includes:
the marking area generation submodule is used for acquiring the pose of the cleaning robot if the sensor detects the obstacle, marking the obstacle according to the pose and the sensor which detects the obstacle, and generating a marking area;
the detour area determining submodule is used for acquiring the confidence corresponding to the marked area, determining a detour area according to the confidence, and marking the detour area on a pre-stored map;
wherein the detour area is the area where the obstacle is located.
A third aspect of embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method as described above.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: during the movement of the cleaning robot, obstacles are detected by the sensors; if a sensor detects an obstacle, the pose of the cleaning robot is acquired, the obstacle is marked according to the pose and the detecting sensor, and a marked area is generated; the confidence corresponding to the marked area is acquired, and the detour area, i.e. the area corresponding to the obstacle, is determined according to the confidence and marked automatically on the pre-stored map. Obstacle areas therefore do not need to be determined manually one by one, obstacle identification accuracy is higher, and labor cost is reduced.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of an implementation of a method for identifying an obstacle according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a specific implementation of step S101 in fig. 1 according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a control module according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a mark region generation submodule according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Example 1:
Fig. 1 shows the implementation flow of the obstacle identification method provided by an embodiment of the present invention. The method is applied to a cleaning robot that includes at least one sensor, and the execution subject of the flow may be a control module. The process of the embodiment is detailed as follows:
in step S101, if the sensor detects an obstacle, the pose of the cleaning robot is acquired, and the obstacle is marked according to the pose and the sensor that detected the obstacle, thereby generating a marked area.
In one embodiment of the invention, the sensors include at least one collision sensor, and the cleaning robot further includes a collision bar (bumper).
In one embodiment of the present invention, step S101 includes:
1) If a collision sensor detects an obstacle, the position of the collision sensor that detected the obstacle is acquired, together with the first mark shape corresponding to the collision bar.
2) A first mark area is drawn on the pre-stored map according to the position.
The shape of the first mark area is the first mark shape, and the distance between the first mark area and the position is within a first preset range.
In this embodiment, the cleaning robot includes a collision bar, and the first mark shape is the shape of the collision bar, which may be square or circular: if the collision bar is circular, the first mark shape is circular; if it is square, the first mark shape is square.
In this embodiment, during the movement of the cleaning robot, if a collision sensor detects an obstacle, the pose (i.e. the position coordinates and/or heading angle) of the collision sensor that detected the obstacle and the shape of the collision bar are acquired to obtain the corresponding first mark shape, and a first mark area is drawn near that position on the pre-stored map; that is, the distance between every point in the first mark area and the position of the collision sensor is within the first preset range. The shape of the first mark area is the first mark shape.
In this embodiment, if all collision sensors of the cleaning robot detect an obstacle, the position of the midpoint between all the collision sensors is acquired, and the first mark area is drawn near that midpoint.
Taking a specific application scenario as an example: the collision bar is circular, so the first mark shape is circular, and the cleaning robot includes two collision sensors, a left collision sensor and a right collision sensor. If the left collision sensor detects a collision, i.e. an obstacle, the position of the left collision sensor is acquired and a circular first mark area is drawn near that position on the pre-stored map. If the left and right collision sensors detect a collision simultaneously, the position of the midpoint between the two sensors is acquired and the first mark area, again circular, is drawn near it; that is, the distance between every point in the first mark area and that position is within the first preset range.
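As a hedged illustration (not the patent's actual implementation), the bumper-marking logic above can be sketched on a numpy count grid; the grid representation, the cell coordinates, and the `radius_cells` parameter are assumptions introduced for the example:

```python
import numpy as np

def draw_mark_area(grid, center, radius_cells):
    """Overlay one circular mark area: increment every grid cell within
    radius_cells of center (row, col). Overlapping mark areas accumulate
    counts, which later serve as the per-cell confidence."""
    rows, cols = grid.shape
    r0, c0 = center
    for r in range(max(0, r0 - radius_cells), min(rows, r0 + radius_cells + 1)):
        for c in range(max(0, c0 - radius_cells), min(cols, c0 + radius_cells + 1)):
            if (r - r0) ** 2 + (c - c0) ** 2 <= radius_cells ** 2:
                grid[r, c] += 1

def mark_collision(grid, left_hit, right_hit, left_cell, right_cell, radius_cells=2):
    """Choose the mark centre from whichever bumper sensors fired:
    the triggered sensor's cell, or the midpoint when both fire."""
    if left_hit and right_hit:
        center = ((left_cell[0] + right_cell[0]) // 2,
                  (left_cell[1] + right_cell[1]) // 2)
    elif left_hit:
        center = left_cell
    elif right_hit:
        center = right_cell
    else:
        return
    draw_mark_area(grid, center, radius_cells)
```

For instance, `mark_collision(grid, True, False, (5, 5), (5, 7))` draws a circular first mark area centred on the left sensor's cell, while passing `True, True` centres it on the midpoint cell.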
In one embodiment of the invention, the sensor comprises an optical sensor.
In one embodiment, step S101 includes:
1) If the optical sensor detects an obstacle, the position of the optical sensor that detected the obstacle and the corresponding fourth mark shape are acquired.
2) A fourth mark area is drawn on the pre-stored map according to the position and the fourth mark shape.
The distance between the fourth mark area and the position is within a fourth preset range.
In one embodiment of the present invention, the cleaning robot includes a driving wheel, and the sensor includes a wheel abnormality detection sensor.
In this embodiment, the control module is connected with the driving wheel.
In this embodiment, the statement that the distance between the fourth mark area and the position is within a fourth preset range means that the distance between every point in the fourth mark area and the position is within the fourth preset range.
In this embodiment, when the optical sensor detects an obstacle, the position of the optical sensor that detected the obstacle is acquired, together with the fourth mark shape corresponding to the optical sensor; for example, the fourth mark shape is a circle.
In this embodiment, a fourth mark area is drawn near the position on the pre-stored map: the distance between every point in the fourth mark area and the position is within the fourth preset range, and the shape of the fourth mark area is the fourth mark shape.
In this embodiment, a cliff may also be detected by the optical sensor, and a fifth mark area may be determined in the same way as the mark area for an obstacle detected by the optical sensor.
In one embodiment of the present invention, the cleaning robot further includes a dust brush, and the sensor further includes a dust brush abnormality detection sensor.
In one embodiment of the present invention, step S101 includes:
1) If the dust brush abnormality detection sensor detects that a dust brush is abnormal, the position and the third mark shape corresponding to the abnormal dust brush are acquired;
2) A third mark area is drawn on the pre-stored map according to the position and the third mark shape.
The distance between the third mark area and the position is within a third preset range.
In this embodiment, the cleaning robot includes at least one dust brush that sweeps up dust as it rotates, and the dust brush abnormality detection sensor detects whether a dust brush is abnormal (e.g. stalled). When a dust brush sweeps against an obstacle, if the obstacle is larger than what the brush can sweep or the brush becomes entangled with a cable, the brush may become abnormal (e.g. stall and stop rotating); the dust brush abnormality detection sensor then detects the abnormality accordingly (e.g. the brush's rotation speed falls below its normal operating range).
In this embodiment, if the dust brush abnormality detection sensor detects that a dust brush is abnormal, the position (i.e. position coordinates) of the abnormal dust brush and the third mark shape (e.g. a circle) corresponding to the sensor are acquired. A third mark area is drawn near the position of the abnormal dust brush on the map (i.e. its distance from that position is within the third preset range), and the shape of the third mark area is the third mark shape.
In this embodiment, when all dust brushes are detected to be abnormal, the obstacle is presumed to be large, and the position range corresponding to the whole body of the cleaning robot is marked as the third mark area.
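A minimal sketch of the whole-body marking step, under the same assumed grid-map representation as above (the `body_radius_cells` parameter and the cell coordinates are illustrative, not details from the patent):

```python
import numpy as np

def mark_robot_footprint(grid, robot_cell, body_radius_cells):
    """Mark the position range covered by the whole robot body as one
    mark area: increment every grid cell whose distance from the robot's
    current cell is within the body radius."""
    rows, cols = grid.shape
    r0, c0 = robot_cell
    for r in range(max(0, r0 - body_radius_cells), min(rows, r0 + body_radius_cells + 1)):
        for c in range(max(0, c0 - body_radius_cells), min(cols, c0 + body_radius_cells + 1)):
            if (r - r0) ** 2 + (c - c0) ** 2 <= body_radius_cells ** 2:
                grid[r, c] += 1
```

The same footprint-marking step applies when all driving wheels report an abnormality (second mark area) as when all dust brushes do (third mark area).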
In step S102, the confidence corresponding to the marked area is acquired, a detour area is determined according to the confidence, and the detour area is marked on the pre-stored map; the detour area is the area where the obstacle is located.
In one embodiment of the invention, the marked area consists of several sub-areas.
in one embodiment of the present invention, step S102 includes:
1) The confidence corresponding to each sub-area contained in the marked area is acquired.
2) If a sub-area's confidence is greater than a preset threshold, that sub-area is determined to be a detour area; the confidence is the number of times the sub-area has been overlapped.
In this embodiment, the marked areas include the first mark area, the second mark area, the third mark area, and so on, and each marked area consists of several sub-areas.
In this embodiment, the pre-stored map is a grid map, and each sub-area is one grid cell of the grid map.
In this embodiment, while the cleaning robot operates, before any detour area has been marked, multiple collisions may occur near a real obstacle, leaving collision marks at different positions; that is, multiple mark areas are generated near the obstacle, and these mark areas overlap (intersect). The confidence of a sub-area is the number of intersecting mark areas covering it, i.e. the number of times it has been overlapped. For example, if mark area A, mark area B, and mark area C intersect and their common intersection contains sub-area a, then sub-area a has been overlapped 3 times.
In this embodiment, when the confidence of a sub-area is greater than the preset threshold, the sub-area has a high probability of belonging to the real obstacle; it is marked as a detour area on the pre-stored map, and in subsequent operation the cleaning robot goes around it to avoid colliding again.
In this embodiment, the cleaning robot determines the confidence of a sub-area through repeated collision learning rather than a single collision; only when the confidence exceeds a certain value is the sub-area determined to be an obstacle area, i.e. a detour area, which improves the accuracy of obstacle identification.
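The confidence test above amounts to thresholding per-cell overlap counts. A minimal sketch, assuming the marked areas have already been accumulated into a numpy count grid (the grid contents and threshold value are illustrative assumptions):

```python
import numpy as np

def detour_cells(overlap_grid, threshold):
    """Return the grid cells (sub-areas) whose overlap count, i.e.
    confidence, is greater than the preset threshold; these cells are
    the ones marked as detour areas on the map."""
    rows, cols = np.nonzero(overlap_grid > threshold)
    return set(zip(rows.tolist(), cols.tolist()))
```

With a threshold of 3, a cell must have been covered by at least four intersecting mark areas before it is treated as a detour area, so a single spurious detection never blocks off a cell.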
In one embodiment, if it is detected that the cleaning robot is about to pass through the detour area, the moving mechanism of the cleaning robot is controlled to go around it.
In this embodiment, when the cleaning robot is moving again, if it is detected that the cleaning robot is about to pass through the detour area, i.e. the position coordinates of the cleaning robot reach the edge of the position range corresponding to the detour area, the moving mechanism of the cleaning robot is controlled to go around, so that the robot avoids the obstacle corresponding to the detour area and a collision is prevented.
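A sketch of the edge check that triggers the detour, assuming the detour area is stored as a set of grid cells and the robot pose has been discretised to a cell; the `margin` parameter is an assumption standing in for "at the edge of the position range":

```python
def should_detour(robot_cell, detour_region, margin=1):
    """True when the robot's current grid cell lies on or within `margin`
    cells of any detour cell, i.e. the robot has reached the edge of the
    position range corresponding to the detour area."""
    r, c = robot_cell
    return any(abs(r - dr) <= margin and abs(c - dc) <= margin
               for dr, dc in detour_region)
```

The control module would poll this check during motion and, when it returns `True`, command the moving mechanism to steer around the detour area instead of entering it.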
In this embodiment, the cleaning robot marks obstacles using the characteristics of different sensors to generate mark areas and determines detour areas from the confidence of the mark areas, recording and avoiding obstacles as accurately as possible instead of relying on a single sensor or a single detection; this improves the obstacle avoidance capability of the mobile cleaning robot.
In this embodiment, during the movement of the cleaning robot, if a sensor detects an obstacle, the pose of the cleaning robot is acquired, the obstacle is marked according to the pose and the detecting sensor, and a marked area is generated; the confidence corresponding to the marked area is acquired, and the detour area, i.e. the area corresponding to the obstacle, is determined from it. Once the detour area is determined, if the cleaning robot is detected about to pass through it, the robot goes around, avoiding the detour area and hence the obstacle. Because detour areas are marked automatically, obstacle areas do not need to be determined manually one by one, obstacle identification accuracy is high, and labor cost is reduced.
Fig. 2 shows a specific implementation flow of step S101 in fig. 1 according to an embodiment of the present invention, and the process is detailed as follows:
In step S201, if the wheel abnormality detection sensor detects an abnormality of a driving wheel, the position and the second mark shape corresponding to the abnormal driving wheel are acquired.
In this embodiment, driving wheel abnormalities include slipping and stalling (locked rotor).
In the present embodiment, the wheel abnormality detection sensor includes a wheel speed sensor or a current sensor.
In the present embodiment, the driving wheels include left and right wheels of the cleaning robot.
In the present embodiment, if the wheel abnormality detection sensor detects an abnormality of the drive wheel, the position (i.e., the position coordinates) of the abnormal drive wheel and the second mark shape (e.g., a circle) corresponding to the wheel abnormality detection sensor are acquired.
In step S202, a second mark area is drawn on the pre-stored map according to the position and the second mark shape. The distance between the second mark area and the position is within a second preset range.
In this embodiment, the shape of the second mark area is the second mark shape.
In this embodiment, a second mark area is drawn near the position of the abnormal driving wheel (i.e. the distance between the second mark area and the position is within the second preset range); here the second mark area is circular.
Taking a specific application scenario as an example: the second mark shape is circular. If the left wheel of the cleaning robot is detected to slip, the position of the left wheel is acquired, and a circular second mark area is drawn near that position, the distance between the second mark area and the position being within the second preset range.
In this embodiment, if all driving wheels are detected to be abnormal, the position range corresponding to the whole body of the cleaning robot is marked as the second mark area.
In this embodiment, when the wheel abnormality detection sensor detects a driving wheel abnormality, the control module marks the obstacle according to the attributes and position of the wheel abnormality detection sensor, which improves the accuracy of the obstacle marking.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example 2:
fig. 3 shows a control module 100 according to an embodiment of the invention for carrying out the method steps in the embodiment corresponding to fig. 1, which is applied to a cleaning robot comprising at least one sensor, the control module being connected to each sensor. The control module 100 includes:
and the marking area generation sub-module 110 is configured to, if the sensor detects an obstacle, acquire a pose of the cleaning robot, and mark the obstacle according to the pose and the sensor that detected the obstacle, so as to generate a marking area.
The detour area determining submodule 120 is configured to acquire the confidence corresponding to the marked area, determine a detour area according to the confidence, and mark the detour area on a pre-stored map.
The detour area is the area where the obstacle is located.
In one embodiment of the invention, the sensors include at least one collision sensor, and the cleaning robot further includes a collision bar (bumper).
In one embodiment of the present invention, the marking area generation sub-module 110 includes:
a first processing unit, configured to acquire, if a collision sensor detects an obstacle, the position of the collision sensor that detected the obstacle and the first mark shape corresponding to the collision bar; and
a first drawing unit, configured to draw a first mark area on the pre-stored map according to the position.
The shape of the first mark area is the first mark shape, and the distance between the first mark area and the position is within a first preset range.
In one embodiment of the present invention, the cleaning robot includes a driving wheel, and the sensor includes a wheel abnormality detection sensor.
In one embodiment of the invention, the marker region is comprised of several sub-regions.
In one embodiment of the present invention, the detour area determination submodule 120 includes:
a confidence acquisition unit, configured to acquire the confidence corresponding to each sub-area contained in the marked area; and
a detour area determining unit, configured to determine a sub-area whose confidence is greater than a preset threshold to be a detour area; the confidence is the number of times the sub-area has been overlapped.
In one embodiment of the present invention, the cleaning robot further includes a dust brush, and the sensor further includes a dust brush abnormality detection sensor.
In one embodiment of the present invention, the mark region generation sub-module 110 includes:
a third processing unit, configured to acquire, if the dust brush abnormality detection sensor detects that a dust brush is abnormal, the position and the third mark shape corresponding to the abnormal dust brush; and
a third drawing unit, configured to draw a third mark area on the pre-stored map according to the position and the third mark shape;
the distance between the third mark area and the position is within a third preset range.
As shown in fig. 4, in an embodiment, the marking area generation sub-module 110 specifically includes:
and a second processing unit 111, configured to, if the wheel abnormality detection sensor detects that the driving wheel is abnormal, obtain a position and a second mark shape corresponding to the abnormal driving wheel.
And a second drawing unit 112 for drawing a second marker region on the pre-stored map according to the position and the second marker shape. And the distance between the second marking area and the position is within a second preset range.
In one embodiment, the control module 100 further includes other functional modules/units for implementing the method steps in the embodiments of embodiment 1.
Example 3:
an embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps in the embodiments described in embodiment 1, for example, step S101 to step S102 shown in fig. 1. Alternatively, the computer program, when executed by a processor, implements the functions of the respective modules/units in the respective system embodiments as described in embodiment 2, for example, the functions of the modules 110 to 120 shown in fig. 3.
The computer program may be stored in a computer-readable storage medium, and when executed by a processor it can implement the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source-code form, object-code form, an executable file, some intermediate form, or the like. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the contents of the computer-readable medium may be increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
The steps in the method of the embodiments of the invention can be reordered, combined, or deleted according to actual needs.
The modules or units in the system of the embodiments of the invention can be combined, divided, or deleted according to actual needs.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described apparatus/terminal device embodiments are merely illustrative: the division into modules or units is only a logical division, and there may be other divisions in actual implementation; a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention and are intended to fall within the scope of the present invention.

Claims (10)

1. An obstacle identification method applied to a cleaning robot including at least one sensor, the obstacle identification method comprising:
if a sensor detects an obstacle, acquiring the pose of the cleaning robot, and marking the obstacle according to the pose and the sensor that detected the obstacle to generate a marked area;
obtaining a confidence corresponding to the marked area, determining a detour area according to the confidence, and marking the detour area on a pre-stored map, wherein the confidence is determined by the overlap count of the marked area, the overlap count being obtained by marking the same obstacle with different sensors;
wherein the detour area is the area where the obstacle is located.
2. The obstacle identification method according to claim 1, wherein the sensor includes at least one collision sensor, and the cleaning robot further includes a collision bar;
said acquiring the pose of the cleaning robot if the sensor detects an obstacle, marking the obstacle according to the pose and the sensor that detected the obstacle, and generating a marked area comprises:
if the collision sensor detects an obstacle, acquiring the position of the collision sensor that detected the obstacle, and acquiring a first mark shape corresponding to the collision bar;
drawing a first mark area on the pre-stored map according to the position;
wherein the shape of the first mark area is the first mark shape, and the distance between the first mark area and the position is within a first preset range.
3. The obstacle identification method according to claim 1, wherein the cleaning robot includes a driving wheel, and the sensor includes a wheel abnormality detection sensor;
said acquiring the pose of the cleaning robot if the sensor detects an obstacle, marking the obstacle according to the pose and the sensor that detected the obstacle, and generating a marked area comprises:
if the wheel abnormality detection sensor detects that the driving wheel is abnormal, acquiring a position and a second mark shape corresponding to the abnormal driving wheel;
drawing a second mark area on the pre-stored map according to the position and the second mark shape;
wherein the distance between the second mark area and the position is within a second preset range.
4. The obstacle identification method according to claim 1, wherein the marked area is composed of a plurality of sub-areas;
said obtaining the confidence corresponding to the marked area and determining the detour area according to the confidence comprises:
obtaining the confidence corresponding to each sub-area contained in the marked area;
if the confidence of a sub-area is greater than a preset threshold, determining that sub-area as a detour area;
wherein the confidence is the overlap count corresponding to the sub-area.
5. The obstacle identification method according to claim 1, wherein the cleaning robot further includes a dust brush, and the sensor further includes a dust brush abnormality detection sensor;
said acquiring the pose of the cleaning robot if the sensor detects an obstacle, marking the obstacle according to the pose and the sensor that detected the obstacle, and generating a marked area comprises:
if the dust brush abnormality detection sensor detects that the dust brush is abnormal, acquiring a position and a third mark shape corresponding to the abnormal dust brush;
drawing a third mark area on the pre-stored map according to the position and the third mark shape;
wherein the distance between the third mark area and the position is within a third preset range.
6. A cleaning robot, characterized in that it comprises at least one sensor and a control module, said control module being connected to each sensor respectively;
the control module includes:
the marked area generation submodule is used for acquiring the pose of the cleaning robot if a sensor detects an obstacle, marking the obstacle according to the pose and the sensor that detected the obstacle, and generating a marked area;
the detour area determining submodule is used for acquiring a confidence coefficient corresponding to the marked area, determining the detour area according to the confidence coefficient, and marking the detour area on a pre-stored map, wherein the confidence coefficient is determined by the overlapping times of the marked area, and the overlapping times are obtained by marking the same obstacle through different sensors;
wherein the detour area is the area where the obstacle is located.
7. The cleaning robot of claim 6, wherein the sensor comprises at least one collision sensor, the cleaning robot further comprising a collision bar;
the marked area generation submodule comprises:
the first processing unit is used for acquiring the position of a collision sensor which detects an obstacle and acquiring a first mark shape corresponding to the collision bar if the collision sensor detects the obstacle;
the first drawing unit is used for drawing a first mark area on the pre-stored map according to the position;
the shape corresponding to the first marking area is the first marking shape, and the distance between the first marking area and the position is within a first preset range.
8. The cleaning robot according to claim 6, wherein the cleaning robot includes a drive wheel, and the sensor includes a wheel abnormality detection sensor;
the marked area generation submodule further comprises:
the second processing unit is used for acquiring a position corresponding to the abnormal driving wheel and a second mark shape if the wheel abnormality detection sensor detects that the driving wheel is abnormal;
the second drawing unit is used for drawing a second mark area on the pre-stored map according to the position and the second mark shape;
wherein the distance between the second marking area and the position is within a second preset range.
9. The cleaning robot according to claim 6, wherein the cleaning robot further includes a dust brush, and the sensor further includes a dust brush abnormality detection sensor;
the marked area generation submodule further comprises:
the third processing unit is used for acquiring a position and a third mark shape corresponding to the abnormal dust brush if the dust brush abnormality detection sensor detects that the dust brush is abnormal;
the third drawing unit is used for drawing a third mark area on the pre-stored map according to the position and the third mark shape;
wherein a distance between the third marking area and the position is within a third preset range.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, carries out the steps of the obstacle identification method according to any one of claims 1 to 5.
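As an illustration only (not part of the patent disclosure), the scheme of claims 1 and 4 — each detecting sensor stamps its mark shape onto a grid map at the robot's pose, a cell's confidence is the number of different sensors whose marks overlap it, and cells whose confidence exceeds a preset threshold become detour areas — can be sketched as follows. All names, the mark shapes, and the threshold value are assumptions chosen for the example:

```python
from collections import defaultdict

class ObstacleMap:
    """Minimal grid-map sketch of sensor-overlap confidence."""

    def __init__(self, threshold=1):
        self.threshold = threshold        # preset confidence threshold
        self.marks = defaultdict(set)     # grid cell -> set of sensor ids that marked it

    def mark(self, sensor_id, pose, shape_offsets):
        """Stamp a sensor-specific mark shape onto the map at the robot's pose."""
        x, y = pose
        for dx, dy in shape_offsets:
            self.marks[(x + dx, y + dy)].add(sensor_id)

    def confidence(self, cell):
        # Overlap count: how many different sensors have marked this cell.
        return len(self.marks[cell])

    def detour_areas(self):
        # Cells whose confidence exceeds the threshold are treated as
        # the area where the obstacle is located.
        return {c for c, s in self.marks.items() if len(s) > self.threshold}

m = ObstacleMap(threshold=1)
bumper_shape = [(1, 0), (1, 1)]   # hypothetical first mark shape (collision bar)
wheel_shape = [(1, 0)]            # hypothetical second mark shape (wheel sensor)
m.mark("bumper", (0, 0), bumper_shape)
m.mark("wheel", (0, 0), wheel_shape)
print(sorted(m.detour_areas()))   # only the cell both sensors marked: [(1, 0)]
```

A cell marked by the bumper alone stays below the threshold; only the cell marked by two different sensors is promoted to a detour area, which mirrors how the claims filter out spurious single-sensor detections.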
CN201811517738.XA 2018-12-12 2018-12-12 Obstacle identification method and cleaning robot Active CN109645892B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811517738.XA CN109645892B (en) 2018-12-12 2018-12-12 Obstacle identification method and cleaning robot


Publications (2)

Publication Number Publication Date
CN109645892A CN109645892A (en) 2019-04-19
CN109645892B true CN109645892B (en) 2021-05-28

Family

ID=66113891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811517738.XA Active CN109645892B (en) 2018-12-12 2018-12-12 Obstacle identification method and cleaning robot

Country Status (1)

Country Link
CN (1) CN109645892B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112306049B (en) * 2019-07-15 2024-02-23 苏州宝时得电动工具有限公司 Autonomous robot, obstacle avoidance method and device thereof, and storage medium
CN110989031B (en) * 2019-10-18 2022-09-27 深圳绿米联创科技有限公司 Human body detection method, device, system, electronic equipment and storage medium
CN111427357A (en) * 2020-04-14 2020-07-17 北京石头世纪科技股份有限公司 Robot obstacle avoidance method and device and storage medium
CN111990929B (en) * 2020-08-26 2022-03-22 北京石头世纪科技股份有限公司 Obstacle detection method and device, self-walking robot and storage medium
CN112380942A (en) * 2020-11-06 2021-02-19 北京石头世纪科技股份有限公司 Method, device, medium and electronic equipment for identifying obstacle
CN112783158A (en) * 2020-12-28 2021-05-11 广州辰创科技发展有限公司 Method, equipment and storage medium for fusing multiple wireless sensing identification technologies
CN114794972A (en) * 2021-01-29 2022-07-29 尚科宁家(中国)科技有限公司 Cleaning robot and control method thereof
CN113855835B (en) * 2021-09-27 2023-04-07 丰疆智能(深圳)有限公司 Disinfection method and device, storage medium and disinfection robot
CN114652217B (en) * 2022-03-02 2023-10-27 美智纵横科技有限责任公司 Control method, cleaning robot, and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1945351B (en) * 2006-10-21 2010-06-02 中国科学院合肥物质科学研究院 Robot navigation positioning system and navigation positioning method
CN103941737A (en) * 2014-05-09 2014-07-23 济南大学 Motion planning and controlling method for tractor-trailer mobile robot in complex environment
US9754490B2 (en) * 2015-11-04 2017-09-05 Zoox, Inc. Software application to request and control an autonomous vehicle service
CN105700525B (en) * 2015-12-07 2018-09-07 沈阳工业大学 Method is built based on Kinect sensor depth map robot working environment uncertainty map
CN206840120U (en) * 2017-02-21 2018-01-05 深圳乐动机器人有限公司 Robot control system and robot
CN108803588A (en) * 2017-04-28 2018-11-13 深圳乐动机器人有限公司 The control system of robot
CN107456163A (en) * 2017-09-06 2017-12-12 太仓埃特奥数据科技有限公司 A kind of unmanned supermarket's health automation clean method
CN108458715B (en) * 2018-01-18 2020-05-15 亿嘉和科技股份有限公司 Robot positioning initialization method based on laser map
CN108594828B (en) * 2018-06-05 2021-05-07 北京智行者科技有限公司 Method for generating cleaning operation path

Also Published As

Publication number Publication date
CN109645892A (en) 2019-04-19

Similar Documents

Publication Publication Date Title
CN109645892B (en) Obstacle identification method and cleaning robot
EP3566821A1 (en) Robot movement control method, and robot
CN110874101B (en) Method and device for generating cleaning path of robot
CN110587597B (en) SLAM closed loop detection method and detection system based on laser radar
CN110850859B (en) Robot and obstacle avoidance method and obstacle avoidance system thereof
CN112161624A (en) Marking method, marking device, intelligent robot and readable storage medium
CN107609510B (en) Positioning method and device for lower set of quayside container crane
CN110705385B (en) Method, device, equipment and medium for detecting angle of obstacle
CN112051841B (en) Obstacle boundary generation method and device
CN113907663A (en) Obstacle map construction method, cleaning robot and storage medium
CN112558611B (en) Path planning method and device, computer equipment and storage medium
CN110989623A (en) Ground unmanned operation equipment, method and device for controlling movement of ground unmanned operation equipment, and storage medium
CN111157012B (en) Robot navigation method and device, readable storage medium and robot
CN112180914A (en) Map processing method, map processing device, storage medium and robot
KR102509852B1 (en) Control device and method
CN113446971B (en) Space recognition method, electronic device and non-transitory computer readable storage medium
CN115437384A (en) Obstacle avoidance method, equipment and medium for mobile robot
CN112180343A (en) Laser point cloud data processing method, device and equipment and unmanned system
Premachandra et al. High performance embedding environment for reacting suddenly appeared road obstacles
CN116382308B (en) Intelligent mobile machinery autonomous path finding and obstacle avoiding method, device, equipment and medium
CN110763232B (en) Robot and navigation positioning method and device thereof
CN117330069A (en) Obstacle detection method, path planning method and self-mobile device
CN111552286A (en) Robot and movement control method and device thereof
CN117770713A (en) Control method and device of cleaning robot and cleaning robot
CN112927278A (en) Control method, control device, robot and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 room 1601, building 2, Vanke Cloud City phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province (16th floor, block a, building 6, Shenzhen International Innovation Valley)

Patentee after: Shenzhen Ledong robot Co.,Ltd.

Address before: 518000 16th floor, building B1, Nanshan wisdom garden, 1001 Xueyuan Avenue, Taoyuan Street, Nanshan District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN LD ROBOT Co.,Ltd.