CN115474863A - Obstacle detection method, obstacle detection device, medium, and cleaning robot - Google Patents

Publication number
CN115474863A
CN115474863A (application No. CN202211082989.6A)
Authority
CN
China
Prior art keywords
obstacle
cleaning robot
information
state
detection unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211082989.6A
Other languages
Chinese (zh)
Inventor
刘帅
谢振生
夏志远
Current Assignee
Anker Innovations Co Ltd
Original Assignee
Anker Innovations Co Ltd
Priority date
Filing date
Publication date
Application filed by Anker Innovations Co Ltd filed Critical Anker Innovations Co Ltd
Priority to CN202211082989.6A
Publication of CN115474863A

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The embodiments of the present disclosure provide an obstacle detection method and device, a medium, and a cleaning robot, and relate to the technical field of robot control. The method comprises the following steps: determining position change information of the cleaning robot in response to contact of the cleaning robot with an obstacle; acquiring state information of a first collision detection unit and a second collision detection unit, the state information comprising a trigger state and/or a state maintaining time; and determining movable information of the obstacle based on the state information and the position change information. With this scheme, the movable information of the obstacle can be judged accurately, so that a corresponding cleaning flow can be carried out according to the movable information of the obstacle, improving the cleaning coverage and the cleaning effect of the area to be cleaned.

Description

Obstacle detection method, obstacle detection device, medium, and cleaning robot
Technical Field
The disclosure relates to the technical field of robot control, and in particular relates to an obstacle detection method, an obstacle detection device, an obstacle detection medium and a cleaning robot.
Background
With the improvement of people's living standards, cleaning robots such as sweeping or mopping machines are increasingly common. Cleaning robots free people's hands from the traditional work of sweeping and mopping floors.
Currently, a cleaning robot mainly detects whether an obstacle exists through sensors such as a laser radar and an RGB (Red-Green-Blue) camera. When the cleaning robot detects an obstacle, it marks the obstacle's position and sets the marked position as an inaccessible area, so that the cleaning robot avoids colliding with the obstacle. Since the obstacle occupies space, it may prevent the cleaning robot from cleaning the area it covers, resulting in reduced cleaning coverage.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an obstacle detection method, an obstacle detection device, a medium, and a cleaning robot. The present disclosure can at least improve the cleaning coverage of the area to be cleaned.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an obstacle detection method including: determining position change information of the cleaning robot in response to contact of the cleaning robot with an obstacle; acquiring state information of the first collision detection unit and the second collision detection unit, wherein the state information comprises a trigger state and/or state maintaining time; and determining movable information of the obstacle based on the state information and the position change information.
According to a second aspect of the present disclosure, there is provided an obstacle detection device, comprising: a first determination module, configured to determine position change information of the cleaning robot in response to contact of the cleaning robot with an obstacle; an acquisition module, configured to acquire state information of the first collision detection unit and the second collision detection unit, the state information comprising a trigger state and/or a state maintaining time; and a second determination module, configured to determine the movable information of the obstacle based on the state information and the position change information.
According to a third aspect of the present disclosure, there is provided a cleaning robot comprising a first collision detecting unit, a second collision detecting unit, a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the obstacle detecting method as in the above embodiments when executing the computer program.
According to a fourth aspect of the present disclosure, there is provided a readable storage medium having stored thereon a computer program which, when executed by a processor, implements the obstacle detecting method as in the above-described embodiments.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in some embodiments of the present disclosure, position change information of the cleaning robot is determined in response to contact of the cleaning robot with an obstacle; state information of the first collision detection unit and the second collision detection unit is acquired, the state information comprising a trigger state and/or a state maintaining time; and the movable information of the obstacle is determined based on the state information and the position change information. Through these steps, on the one hand, a two-stage/multi-stage collision detection unit can be used to detect the movable information of an obstacle, overcoming the difficulty that non-contact sensors and a conventional single-stage collision detection unit have in judging whether an obstacle is movable, and enhancing the sensing capability of the cleaning robot; on the other hand, by actively contacting the obstacle, it can be judged whether the obstacle can be pushed away for cleaning, which improves the cleaning coverage of the area to be cleaned and the degree of cleaning intelligence.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It should be apparent that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived by those of ordinary skill in the art without inventive effort.
Fig. 1 schematically shows a schematic view of an application scenario to which the obstacle detection method of an embodiment of the present disclosure may be applied.
Fig. 2 schematically illustrates a flow diagram of an obstacle detection method in an exemplary embodiment according to the present disclosure.
Fig. 3 schematically illustrates a flow diagram for determining an obstacle status in an exemplary embodiment according to the present disclosure.
Fig. 4 schematically illustrates a flow diagram for performing a cleaning task in an exemplary embodiment according to the present disclosure.
Fig. 5 schematically illustrates a flow chart for determining an obstacle status in another exemplary embodiment according to the present disclosure.
FIG. 6 schematically illustrates a flow diagram for performing a cleaning task in accordance with another exemplary embodiment of the present disclosure.
Fig. 7 schematically shows a flowchart for determining a triggering state of a collision detection unit according to an exemplary embodiment of the present disclosure.
Fig. 8 schematically shows a structure of an obstacle detecting device in an exemplary embodiment according to the present disclosure.
Fig. 9 schematically illustrates a structure view of a cleaning robot according to an exemplary embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the present disclosure more apparent, embodiments of the present disclosure will be described in further detail below with reference to the accompanying drawings.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the disclosure, as detailed in the claims that follow.
In the description of the present disclosure, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of the above terms in the present disclosure can be understood in a specific case to those of ordinary skill in the art. In addition, in the description of the present disclosure, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
In addition to sensors such as laser radar and RGB cameras, existing cleaning robots are usually provided with a collision detection switch to detect obstacles that non-contact sensors such as vision and radar cannot sense. However, a conventional cleaning robot cannot determine whether an obstacle can be moved, and therefore cannot clean the area covered by the obstacle, which reduces the cleaning coverage and the cleaning effect. To solve this problem, the present disclosure provides an obstacle detection method, described in detail below:
referring to fig. 1, a schematic diagram schematically illustrates an application scenario to which the obstacle detection method according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the application scenario includes a cleaning robot 110, a collision detection module 120, and an obstacle 130.
The collision detection module 120 includes a first collision detection unit and a second collision detection unit, and the obstacle 130 includes but is not limited to: paper rolls, shoes, and small toys. The first collision detecting unit may be a contact sensor, a temperature sensor, or a distance sensor; the second collision detecting unit may be a strain type pressure sensor, a piezoresistive type pressure sensor, or a capacitive type pressure sensor.
For example, the first collision detection unit and the second collision detection unit may be integrated into one collision detection module 120, or may be respectively disposed in two independent collision detection modules 120, which is not limited in this disclosure.
For example, fig. 2 schematically illustrates a flowchart of an obstacle detection method in an exemplary embodiment according to the present disclosure.
Specifically, referring to fig. 2, the obstacle detection method shown in the figure includes:
and S210, determining position change information of the cleaning robot in response to the contact of the cleaning robot with the obstacle.
In an exemplary embodiment, the position change information of the cleaning robot 110 is determined in response to the contact of the cleaning robot 110 with the obstacle 130. The position change information of the cleaning robot 110 includes, but is not limited to: the coordinates, speed magnitude, and speed direction of the cleaning robot 110 with respect to surrounding objects.
For example, the above-mentioned manner of determining whether the cleaning robot 110 is in contact with the obstacle 130 includes, but is not limited to: whether the cleaning robot 110 is in contact with the obstacle 130 is judged by the contact sensor, or whether the cleaning robot 110 is in contact with the obstacle 130 is judged by the distance meter.
S220, state information of the first collision detection unit and the second collision detection unit is obtained, and the state information comprises a trigger state and/or a state maintaining time.
In an exemplary embodiment, the cleaning robot 110 acquires state information of the first collision detection unit and the second collision detection unit, the state information including a trigger state and/or a state maintaining time. A collision detection unit may be a microswitch-based sensor and is generally used to detect an obstacle when non-contact sensors of the cleaning robot, such as a vision sensor or a radar, cannot sense it. The unit of the state maintaining time may be seconds, minutes, hours, or the like.
Optionally, the first collision detection unit is equipped with a contact sensor, a temperature sensor, or a distance sensor for detecting whether the cleaning robot 110 is in contact with the obstacle 130. The second collision detection unit is equipped with a strain-type, piezoresistive, or capacitive pressure sensor for detecting the pressure between the cleaning robot 110 and the obstacle 130, and is triggered when the pressure is greater than a preset threshold (which may be 0). The first collision detection unit and the second collision detection unit may be integrated in the same collision detection module 120, or may be disposed in two separate collision detection modules 120; the present disclosure is not limited in this respect.
Illustratively, the first collision detection unit and the second collision detection unit are included in the collision detection module 120, and the trigger state includes a triggered state and an untriggered state. When the cleaning robot 110 contacts the obstacle 130, the first collision detection unit is in the triggered state; when the cleaning robot 110 does not contact the obstacle 130, the first collision detection unit is in the untriggered state. When the cleaning robot 110 can push the obstacle 130, the second collision detection unit is in the untriggered state; when the cleaning robot 110 cannot push the obstacle 130, the second collision detection unit is in the triggered state.
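As an illustrative reading of the four cases above (not code from the patent; the class and field names are invented), the two-stage trigger behavior can be sketched as a small state model:

```python
from dataclasses import dataclass


@dataclass
class CollisionState:
    """Hypothetical model of the two collision detection units."""
    in_contact: bool   # whether the robot currently touches the obstacle
    pushable: bool     # whether the robot can currently push the obstacle

    @property
    def first_unit_triggered(self) -> bool:
        # The first unit triggers whenever the robot contacts the obstacle.
        return self.in_contact

    @property
    def second_unit_triggered(self) -> bool:
        # The second unit triggers when the robot presses against the obstacle
        # but cannot push it (contact pressure above the preset threshold).
        return self.in_contact and not self.pushable
```

Under this sketch, pushing a movable obstacle leaves only the first unit triggered, while pressing an immovable one triggers both, matching the paragraph above.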
And S230, determining movable information of the obstacle based on the state information and the position change information.
In an exemplary embodiment, the cleaning robot 110 may determine the movable information of the obstacle from its motion state while moving synchronously with the obstacle 130. For example, the distance the cleaning robot has moved is determined from the position change information described above, and the cleaning robot 110 may determine the movable information of the obstacle 130 based on the state information of the first collision detection unit, the state information of the second collision detection unit, and the distance moved. The distance moved refers to the distance the cleaning robot 110 travels while the first collision detection unit is in the triggered state.
In the technical solution provided by the embodiment shown in fig. 2, it is first determined whether the cleaning robot is in contact with an obstacle, and when it contacts a target obstacle, its position change information is determined. State information of the first and second collision detection units of the cleaning robot is then acquired, and finally the movable information of the obstacle is determined based on the state information and the position change information. Through these steps, a two-stage/multi-stage collision detection unit can be used to detect and accurately judge the movable information of an obstacle, so that a corresponding cleaning flow can be carried out according to that information, increasing the cleaned area of the region to be cleaned and improving its cleaning effect.
The following describes in detail the specific implementation of each step of the embodiment shown in fig. 2 with reference to the embodiments shown in fig. 3 to 7:
in an exemplary embodiment, fig. 3 schematically illustrates a flowchart of determining the obstacle state according to an exemplary embodiment of the present disclosure, which may be a specific implementation of S230. Referring to fig. 3, with the second collision detection unit in the triggered state, the method of determining the state of an obstacle includes:
and S310, determining the moving distance of the cleaning robot based on the position change information of the cleaning robot.
In an exemplary embodiment, referring to fig. 1, the cleaning robot 110 determines a distance the cleaning robot moves based on the above-described position change information.
For example, the above manner of acquiring the position change information includes but is not limited to: ultrasonic navigation positioning, a laser global positioning system, GPS positioning, and SLAM (Simultaneous Localization and Mapping) technology.
And S320, determining that the movable information of the obstacle is immovable when the state maintaining time of the triggered state of the second collision detection unit is longer than a first duration and the distance the cleaning robot has moved is less than a first distance.
In an exemplary embodiment, the cleaning robot 110 pushes the obstacle 130 while the second collision detection unit is in the untriggered state. When the cleaning robot 110 can no longer push the obstacle 130, the second collision detection unit changes to the triggered state, and the cleaning robot 110 uses a timer to count how long the unit stays triggered. When the state maintaining time of the triggered state of the second collision detection unit is longer than a first duration and the distance the cleaning robot has moved is less than a first distance, the movable information of the obstacle 130 is determined as immovable. The first duration and the first distance may be set by the user through a mobile terminal connected to the cleaning robot; the ways of connecting the mobile terminal to the cleaning robot 110 include, but are not limited to, a Bluetooth connection and a Wi-Fi (Wireless Fidelity) connection.
S320', determining that the movable information of the obstacle is unable to be homed in the case that the state maintaining time of the triggered state of the second collision detection unit is longer than the first duration and the distance the cleaning robot has moved is greater than the first distance.
In an exemplary embodiment, in the case where the state maintaining time of the triggered state of the second collision detection unit is longer than the first duration and the distance the cleaning robot 110 has moved is greater than the first distance, the movable information of the obstacle 130 is determined as unable to be homed.
For example, assume the first duration is 20 seconds and the first distance is 0.5 meters. While pushing the obstacle 130, the cleaning robot 110 records a moved distance of 0.55 meters, which is greater than the first distance; the second collision detection unit is then triggered, meaning the cleaning robot 110 can no longer push the obstacle 130. If the cleaning robot 110 cannot push the obstacle 130 for more than 20 seconds, the movable information of the obstacle 130 is determined as unable to be homed. In that case, the cleaning robot 110 concludes that part of the obstacle 130 is in contact with an immovable object (e.g., a wall or a cabinet); it then moves along the edge of the obstacle 130 until its traveling direction is perpendicular to the previous one and tries to push the obstacle 130 again. If during this movement the obstacle 130 again contacts an immovable obstacle, the cleaning robot 110 leaves the obstacle 130 and cleans other areas.
For example, the cleaning robot 110 may measure its own pushing force on the obstacle 130 while attempting to push it; if the pushing force exceeds a preset threshold, it abandons the push, preventing fragile obstacles such as vases from being damaged.
It should be noted that while pushing an obstacle, the cleaning robot automatically switches its moving speed to a low-speed mode to prevent a valuable obstacle 130 from being damaged.
In the embodiment shown in fig. 3, the distance the cleaning robot has moved is determined based on its position change information. The cleaning robot then judges whether the triggered time of the second collision detection unit exceeds the first duration and whether its moved distance exceeds the first distance: when the state maintaining time of the triggered state of the second collision detection unit is longer than the first duration and the moved distance is less than the first distance, the movable information of the obstacle is determined as immovable; when the state maintaining time is longer than the first duration and the moved distance is greater than the first distance, the movable information of the obstacle is determined as unable to be homed. Through these steps, a two-stage/multi-stage collision detection unit can be used to detect the movable information of an obstacle, so that immovable obstacles and obstacles that cannot be homed are judged accurately, the user is reminded of areas shielded by obstacles and can manually clean the uncleaned areas, and the cleaning effect and thoroughness are improved.
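The decision logic of S320/S320' can be expressed as a short sketch. This is an illustrative reading, not the patent's implementation; the function name is invented, and the default thresholds (20 s, 0.5 m) are taken from the worked example above:

```python
def classify_blocked_obstacle(trigger_time_s: float, moved_m: float,
                              first_duration_s: float = 20.0,
                              first_distance_m: float = 0.5) -> str:
    """Classify an obstacle once the second collision detection unit is triggered.

    trigger_time_s: state maintaining time of the unit's triggered state
    moved_m: distance the robot moved while in contact with the obstacle
    """
    if trigger_time_s > first_duration_s:
        if moved_m < first_distance_m:
            return "immovable"       # S320: barely moved before getting stuck
        return "unable_to_home"      # S320': moved, then hit something immovable
    return "undetermined"            # trigger too brief to decide either way
```

The "undetermined" branch corresponds to the case handled separately in fig. 5, where the triggered time stays below the first duration.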
Exemplarily, on the basis of the embodiment shown in fig. 3, fig. 4 schematically shows a flow chart for performing a cleaning task in an exemplary embodiment according to the present disclosure.
Referring to fig. 4, in S410, in response to the cleaning robot contacting the obstacle, position information of an initial contact point is acquired.
In an exemplary embodiment, the position information of the initial contact point is acquired in response to the cleaning robot 110 contacting the obstacle 130.
For example, the ways of obtaining the position information of the initial contact point include, but are not limited to: the force direction at the initial contact point is first collected by a pressure sensor arranged on the body of the cleaning robot 110, and the contact point position is then derived by combining the robot's dimensions with its own position information.
In S420, when the movable information of the obstacle is immovable, the position information of the initial contact point is marked on a map pre-stored in the cleaning robot, and the obstacle avoidance cleaning task is executed.
In an exemplary embodiment, in a case where the movable information of the obstacle 130 is immovable, the cleaning robot 110 marks the position information of the initial contact point in a map, which is pre-stored in the cleaning robot 110, and performs an obstacle avoidance cleaning task. Wherein, the obstacle avoidance and cleaning task includes: the cleaning robot 110 moves along the boundary of the obstacle 130 and sweeps the floor surface while the cleaning robot 110 moves.
In S420', if the movable information of the obstacle is unable to be homed, the position information of the initial contact point is cleared, and the obstacle avoidance cleaning task is performed.
In an exemplary embodiment, in the case where the movable information of the obstacle 130 is unable to be homed, the cleaning robot 110 clears the position information of the initial contact point and performs the obstacle avoidance cleaning task.
For example, in the case where the movable information of the obstacle 130 is unable to be homed, the cleaning robot 110 may record the final position of the obstacle 130 so that the user can find the obstacle 130 by that position.
In the embodiment shown in fig. 4, when the cleaning robot contacts an obstacle, the position information of the initial contact point between them is acquired, and the corresponding cleaning task is then executed according to the movable information of the obstacle and the position information of the initial contact point. With this scheme, the initial position corresponding to an immovable obstacle in an area the user cannot easily see (such as under a bed) can be marked on the map, while the initial position corresponding to an obstacle that cannot be homed is cleared from the map before the obstacle avoidance cleaning task is executed. This marks the areas that were not cleaned, reminds the user to clean them manually, and further improves the cleaning effect.
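The map-marking behavior of S420/S420' might look like the following sketch. Representing the pre-stored map as a dict keyed by cell coordinates is an assumption for illustration only, as are the function and label names:

```python
def update_map(cleaning_map: dict, contact_point: tuple, movable_info: str) -> dict:
    """Mark or clear the initial contact point depending on movability."""
    if movable_info == "immovable":
        # S420: keep a mark so the user is reminded of the uncleaned area.
        cleaning_map[contact_point] = "blocked_by_obstacle"
    elif movable_info == "unable_to_home":
        # S420': the obstacle ended up away from its original spot, so the
        # old contact point no longer marks an uncleanable area.
        cleaning_map.pop(contact_point, None)
    return cleaning_map
```

Other movability values (e.g. an obstacle that was pushed and homed normally) leave the map unchanged in this sketch.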
For example, fig. 5 schematically illustrates a flowchart of determining the obstacle state according to another exemplary embodiment of the present disclosure, which may be a specific implementation of S230. Referring to fig. 5, with the second collision detection unit in the untriggered state, or with the state maintaining time of its triggered state shorter than the first duration, the method of determining the state of the obstacle includes:
and S510, determining the moving distance of the cleaning robot based on the position change information of the cleaning robot.
In the exemplary embodiment, the specific implementation of S510 is the same as S310, and is not described herein again.
And S520, determining that the movable information of the obstacle is able to be homed in the case that the distance the cleaning robot has moved is greater than a second distance and the first collision detection unit is in the triggered state.
In an exemplary embodiment, in the case where the distance the cleaning robot 110 has moved is greater than the second distance and the first collision detection unit is in the triggered state, the movable information of the obstacle 130 is determined as able to be homed.
For example, assume the second distance is 2 meters. While the cleaning robot 110 pushes the obstacle 130 forward, the first collision detection unit remains in the triggered state as long as the cleaning robot 110 stays in contact with the obstacle 130. If the obstacle 130 remains pushable, the second collision detection unit is not triggered; when the first collision detection unit is triggered, the second is not, and the cleaning robot 110 has pushed the obstacle 130 more than 2 meters, the movable information of the obstacle 130 is determined as able to be homed, that is, the obstacle can be restored to its position. Likewise, if during pushing the cleaning robot 110 stops for less than the first duration (so the second collision detection unit is triggered for less than the first duration) but can still push the obstacle 130 more than 2 meters, the movable information of the obstacle 130 is also determined as able to be homed.
S520', determining that the movable information of the obstacle is pushed askew in the case where the moving distance of the cleaning robot is less than the second distance and the state maintaining time of the non-triggered state of the first collision detection unit is longer than the second duration.
In an exemplary embodiment, if the distance over which the cleaning robot 110 pushes the obstacle 130 is less than the second distance and the non-triggered state of the first collision detection unit has been maintained for longer than the second duration, it is determined that the cleaning robot 110 is no longer in contact with the obstacle 130, and the movable information of the obstacle 130 is determined as pushed askew.
In the embodiment shown in fig. 5, when the second collision detection unit is in the non-triggered state, or the state maintaining time of the triggered state of the second collision detection unit is shorter than the first duration, the moving distance of the cleaning robot is determined based on the position change information of the cleaning robot. Then, according to the moving distance of the cleaning robot and the state information of the first collision detection unit, the movable information of the obstacle is determined as returnable or pushed askew. Through these steps, the two-stage/multi-stage collision detection unit can be used to detect the movable information of obstacles, so that obstacles can be effectively classified and managed during cleaning, increasing the cleaning area of the cleaning robot and improving the cleaning effect.
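A minimal sketch of the fig. 5 branch between S520 and S520' follows. The function and state names are hypothetical; the patent does not prescribe any implementation, and the sketch assumes the fig. 5 precondition (second unit non-triggered, or triggered for less than the first duration) already holds:

```python
def classify_obstacle(moved_distance, second_distance,
                      first_unit_triggered, first_unit_untriggered_time,
                      second_duration):
    """Classify the obstacle per the fig. 5 branch: S520 vs S520'."""
    if moved_distance > second_distance and first_unit_triggered:
        return "returnable"      # S520: pushed far enough while in contact
    if (moved_distance < second_distance
            and not first_unit_triggered
            and first_unit_untriggered_time > second_duration):
        return "pushed askew"    # S520': contact lost before the threshold
    return "undetermined"        # keep pushing and observing
```

The third return value is an assumption of this sketch: when neither condition holds yet, the robot simply continues pushing.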
Illustratively, based on the embodiment shown in fig. 5, fig. 6 schematically shows a flow chart for performing a cleaning task according to another exemplary embodiment of the present disclosure.
Referring to fig. 6, in S610, position information of an initial contact point is acquired in response to the cleaning robot contacting the obstacle.
In the exemplary embodiment, the specific implementation of S610 is the same as S410, and is not described herein again.
In S620, in the case where the movable information of the obstacle is returnable, the boundary information of the obstacle is acquired.
In an exemplary embodiment, in the case where the movable information of the obstacle 130 is returnable, the cleaning robot 110 acquires the boundary information of the obstacle 130.
In S620', in the case where the movable information of the obstacle is pushed askew, the position information of the initial contact point is cleared, and the cleaning task is executed.
In an exemplary embodiment, in the case where the movable information of the obstacle 130 is pushed askew, the cleaning robot 110 clears the position information of the initial contact point and performs the cleaning task.
For example, when the cleaning robot 110 determines that the obstacle 130 has been pushed askew, the cleaning robot 110 may record the position at which the first collision detection unit changed to the non-triggered state, so that the user can search for the obstacle 130 according to the position information recorded by the cleaning robot 110 when the obstacle 130 was pushed askew.
In S630, the obstacle is returned based on the boundary information and the position information of the initial contact point.
In an exemplary embodiment, the cleaning robot 110 returns the obstacle 130 to its original position based on the boundary information and the position information of the initial contact point.
For example, in the process of returning the obstacle 130, the cleaning robot 110 moves along the obstacle 130 according to the boundary information, continuously adjusting its orientation so that its front surface always faces the surface of the obstacle 130. If the cleaning robot 110 is blocked while traveling along the boundary and cannot continue, it abandons returning the obstacle 130. When the front orientation of the cleaning robot 110 is opposite to its initial orientation, the cleaning robot 110 pushes the obstacle 130, moving it back. When the cleaning robot 110 has pushed the obstacle 130 to the position of the initial contact point, the return of the obstacle 130 is complete.
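One way to test the "opposite orientation" condition in the return maneuver is shown below. This is a sketch under the assumption that yaw is stored in radians; the helper name and tolerance are hypothetical:

```python
import math

def is_opposite_heading(current_yaw, initial_yaw, tol=0.1):
    """True when the robot's heading differs from the stored initial heading
    by roughly pi radians, i.e. it has rounded the obstacle and now faces
    back toward the initial contact point."""
    diff = (current_yaw - initial_yaw) % (2 * math.pi)
    return abs(diff - math.pi) < tol
```

The modulo keeps the comparison valid across the wrap-around at ±pi, so the check works regardless of how the localization module normalizes yaw.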
In the technical solution provided by the embodiment shown in fig. 6, when the cleaning robot contacts an obstacle, the position information of the initial contact point is acquired. When the movable information of the obstacle is returnable, the boundary information of the obstacle is acquired, and the obstacle is returned based on the boundary information and the position information of the initial contact point; when the movable information of the obstacle is pushed askew, the position information of the initial contact point is cleared and the cleaning task is executed. Through this scheme, the obstacle is returned after being pushed away during cleaning, which solves the problem that the cleaning robot changes the position of an article while sweeping and the user cannot find the moved article, increases the area covered by the cleaning robot, and improves the cleaning effect.
For example, fig. 7 schematically shows a flowchart for determining a triggering state of a collision detection unit according to an exemplary embodiment of the present disclosure, which may be taken as a specific implementation manner of any of the above embodiments.
Referring to fig. 7, in S710, it is determined whether the cleaning robot is in contact with an obstacle.
In an exemplary embodiment, the cleaning robot 110 determines whether it is in contact with the obstacle 130 through the contact sensor, and in case the cleaning robot 110 is in contact with the obstacle 130, S720 is performed; if the cleaning robot 110 does not contact the obstacle 130, S720' is performed.
In S720, the first collision detection unit is turned on.
In an exemplary embodiment, the first collision detecting unit is turned on, and the first collision detecting unit is in a triggered state.
When the first collision detecting unit is turned on, the cleaning robot 110 may send a prompt message to let the user know that the cleaning robot 110 contacts the obstacle 130.
For example, the prompt message includes, but is not limited to: voice prompt and light prompt.
In S720', the first collision detecting unit is turned off.
In an exemplary embodiment, the first collision detecting unit is turned off, and the first collision detecting unit is in a non-triggered state.
In S730, it is determined whether the position information of the cleaning robot is changed.
In an exemplary embodiment, the cleaning robot 110 determines whether its position information is changed, and in case the position information is not changed, S740 is performed; in the case where the above-described position information is changed, S740' is performed.
In S740, the second collision detection unit is turned on.
In an exemplary embodiment, the second collision detecting unit is turned on, and the second collision detecting unit is in a triggered state.
In S740', the second collision detecting unit is turned off.
In an exemplary embodiment, the second collision detecting unit is turned off, and the second collision detecting unit is in a non-triggered state.
The embodiment shown in fig. 7 provides a technical solution in which the first collision detection unit is controlled to be turned on or off according to whether the cleaning robot is in contact with the obstacle, and the second collision detection unit is controlled to be turned on or off according to whether the position of the cleaning robot changes while it pushes the obstacle. Through this two-stage obstacle collision detection, whether the obstacle can be moved is judged, and the cleaning area is maximized.
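The fig. 7 flow can be sketched as a small update function. This is a hypothetical illustration only; the patent fixes no API, and the function name and boolean inputs are assumptions:

```python
def update_collision_units(in_contact, position_changed):
    """Return (first_unit_on, second_unit_on) per the fig. 7 flow.

    S710/S720: the first unit simply follows contact with the obstacle.
    S730/S740: the second unit turns on only when the robot is in contact
    but its position is not changing, i.e. it is pushing without moving.
    """
    first_on = in_contact
    second_on = in_contact and not position_changed
    return first_on, second_on
```

Calling this on every control cycle keeps both units' states consistent with the sensor readings, since the second unit can never be on while the first is off.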
In an exemplary embodiment, as a specific implementation manner of any one of the foregoing embodiments, the method further includes:
Before the cleaning robot 110 contacts the obstacle 130, the characteristics of the obstacle 130 are acquired, and the type of the obstacle is pre-judged according to the characteristics, the type including: movable and immovable. If the type is movable, the cleaning robot contacts the obstacle; if the type is immovable, the cleaning robot avoids the obstacle 130. The manner of obtaining the characteristics of the obstacle includes, but is not limited to: acquiring an image of the obstacle and extracting the features of the obstacle from the image. The features of the obstacle include, but are not limited to: the size of the obstacle (e.g., height, width, diameter) and the type of the obstacle (e.g., wall, wardrobe, flower pot).
Through this scheme, the cleaning robot can pre-judge the type of an obstacle, so that immovable obstacles such as walls are excluded in advance and avoided.
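A hedged sketch of this pre-judgment gate is given below. The feature extraction itself (e.g., from a camera image) is out of scope here, and the category tables are illustrative assumptions rather than the patent's classification:

```python
# Illustrative mapping from recognized obstacle categories to movability.
IMMOVABLE_TYPES = {"wall", "wardrobe"}
MOVABLE_TYPES = {"flower pot", "paper ball", "shoe", "toy"}

def prejudge(obstacle_type):
    """Return 'movable', 'immovable', or 'unknown' for a recognized type.

    Unknown types fall through to the contact-based two-stage detection,
    since the pre-judgment only excludes obstacles it can rule out early.
    """
    if obstacle_type in IMMOVABLE_TYPES:
        return "immovable"
    if obstacle_type in MOVABLE_TYPES:
        return "movable"
    return "unknown"
```

Returning "unknown" rather than guessing keeps the pre-judgment conservative: only confidently immovable obstacles are avoided without contact.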
In an exemplary embodiment, as a specific implementation manner of any one of the foregoing embodiments, the method further includes: after the cleaning robot 110 determines the movable information of the obstacle 130, the movable information of the obstacle may be transmitted via wireless communication to a terminal device connected to the cleaning robot 110. When the cleaning robot 110 performs a cleaning task, the task information is returned to the terminal device, and after the cleaning robot 110 has completed the cleaning task, the cleaning result is returned to the terminal device. The task information includes but is not limited to: cleaning time, cleaning type, and cleaning power. The cleaning result includes one or more of the following: obstacle return succeeded, obstacle return failed, cleaned area size, obstacle size, and the number of obstacles.
Through this scheme, the user's understanding of house cleaning is enhanced, and a more reasonable cleaning plan can be formulated.
In an exemplary embodiment, the present disclosure provides a method for push-away cleaning of an obstacle, specifically comprising:
1. The cleaning robot 110 is equipped with the first collision detection unit and the second collision detection unit described above. During cleaning, the cleaning robot 110 actively contacts obstacles 130 that may be movable and ignores immovable obstacles such as wall surfaces.
2. When the cleaning robot 110 comes into contact with the obstacle 130, the first collision detection unit is triggered to start the push-away cleaning strategy (step 3 to step 8).
3. The position of the collision point is temporarily stored as the position of the obstacle 130 according to the current positioning information (x, y, yaw) of the cleaning robot 110 itself and the model size of the body of the cleaning robot 110, and the current position information of the cleaning robot 110 is temporarily stored.
4. The cleaning robot 110 is adjusted to a low-speed movement mode, performs cleaning while pushing the obstacle 130, and acquires the state of the second collision detection unit in real time.
5. If, during the push-away cleaning process, the second collision detection unit is triggered with a holding time greater than the threshold KEEP_TIME_THRESH and the moving distance of the cleaning robot 110 is less than the threshold PUSH_DISTANCE_THRESH, the obstacle 130 is considered immovable; at this time, the push-away cleaning process is exited, the temporarily stored position of the obstacle 130 is marked on the map, cleaning failure information is returned, and the obstacle avoidance strategy is executed.
6. If, during the push-away cleaning process, the trigger-holding time of the second collision detection unit is greater than the threshold KEEP_TIME_THRESH and the moving distance of the cleaning robot 110 is greater than the threshold PUSH_DISTANCE_THRESH, the obstacle 130 is considered to have been pushed to a position from which it cannot be returned, but may still be pushable from a certain side (for example, the side of the obstacle); at this time, the push-away cleaning process is exited, cleaning failure information is returned, the obstacle avoidance strategy is executed, and the position of the obstacle 130 is not marked on the map.
7. If, during the push-away cleaning process, the second collision detection unit is not triggered or its trigger-holding time is less than the threshold KEEP_TIME_THRESH, but before the push distance reaches the set value the first collision detection unit enters the non-triggered state and maintains it for longer than the threshold KEEP_TIME_THRESH, the obstacle is considered pushed askew; the push-away cleaning process is exited, cleaning failure information is returned, and the position of the obstacle 130 is not marked on the map.
8. If, by the time the push distance reaches the set value, the second collision detection unit has not been triggered or its trigger-holding time is less than the threshold KEEP_TIME_THRESH, the push-away cleaning is considered successful, cleaning success information is returned, and the obstacle return process is started (step 9).
9. Using the wall-following sensor, the cleaning robot 110 travels along the edge of the obstacle 130, going around it.
10. If, while traveling, the cleaning robot 110 finds that it cannot continue along the obstacle 130, it exits the return process and abandons returning the obstacle.
11. If the cleaning robot 110 travels smoothly until its own orientation is opposite to the temporarily stored initial orientation, the push-away cleaning strategy is executed again; after it completes, the cleaning robot exits the push-away cleaning process.
Through this scheme, the movable information of an obstacle can be detected with a two-stage/multi-stage collision detection unit and judged accurately, overcoming the difficulty of judging whether an obstacle is movable with a non-contact sensor or an ordinary single-stage collision detection unit, and enhancing the sensing capability of the cleaning robot. The cleaning strategy can then be combined with the movable information of the obstacle to execute the corresponding cleaning flow, which increases the cleaning coverage, improves the cleaning effect in the region to be cleaned, and allows lighter obstacles such as paper balls, shoes, and toys to be pushed away, raising the intelligence of cleaning.
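Steps 5 to 8 above amount to a threshold classifier over the two units' observations; a hypothetical sketch follows. The constant names mirror the patent's KEEP_TIME_THRESH and PUSH_DISTANCE_THRESH, but their values, the function signature, and the "in progress" outcome are assumptions of this sketch:

```python
KEEP_TIME_THRESH = 1.5      # seconds; assumed value, not from the patent
PUSH_DISTANCE_THRESH = 2.0  # meters; assumed value, not from the patent

def push_away_outcome(second_hold_time, moved_distance, reached_set_value,
                      first_untriggered_time=0.0):
    """Map push-away observations to the outcome of steps 5-8.

    second_hold_time: longest trigger-holding time of the second unit (s);
    moved_distance: distance the robot moved while pushing (m);
    reached_set_value: whether the push distance reached the set value;
    first_untriggered_time: how long the first unit has been non-triggered (s).
    """
    if second_hold_time > KEEP_TIME_THRESH:
        if moved_distance < PUSH_DISTANCE_THRESH:
            return "immovable"            # step 5: mark position, avoid
        return "cannot be returned"       # step 6: maybe pushable from a side
    if reached_set_value:
        return "push-away success"        # step 8: start the return flow
    if first_untriggered_time > KEEP_TIME_THRESH:
        return "pushed askew"             # step 7: contact lost early
    return "in progress"                  # keep pushing and observing
```

Only the "immovable" outcome marks the obstacle's temporarily stored position on the map; the other failure outcomes leave the map unchanged, matching steps 6 and 7.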
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 8 schematically shows a block diagram of an apparatus for obstacle detection according to an exemplary embodiment of the present disclosure. Referring to fig. 8, the apparatus 800 for detecting an obstacle is shown as applied to a cleaning robot configured with a first collision detecting unit and a second collision detecting unit, the apparatus 800 for detecting an obstacle includes: a first determination module 810, an acquisition module 820, and a second determination module 830.
Specifically, the first determining module 810 is configured to determine position change information of the cleaning robot in response to contact of the cleaning robot with an obstacle.
The obtaining module 820 is configured to obtain status information of the first collision detection unit and the second collision detection unit, where the status information includes a trigger status and/or a status maintenance time.
The second determining module 830 is configured to determine movable information of the obstacle based on the state information and the position change information.
In an exemplary embodiment, based on the foregoing, the second collision detecting unit is in a triggered state, and the second determining module 830 is further configured to: determining a distance that the cleaning robot moves based on the position change information;
and determining that the movable information of the obstacle is immovable when the state maintaining time of the triggered state of the second collision detecting unit is longer than a first time period and the distance over which the cleaning robot moves is shorter than a first distance.
In an exemplary embodiment, based on the foregoing scheme, the second determining module 830 is further configured to: and determining that the movable information of the obstacle cannot be returned when the state maintaining time of the triggered state of the second collision detecting unit is longer than the first time period and the distance traveled by the cleaning robot is longer than the first distance.
In an exemplary embodiment, based on the foregoing scheme, the second determining module 830 is further configured to: acquiring position information of an initial contact point in response to the cleaning robot contacting the obstacle; marking the position information of the initial contact point in a map and executing an obstacle avoidance cleaning task in the case where the movable information of the obstacle is immovable, wherein the map is prestored in the cleaning robot; and in the case where the movable information of the obstacle cannot be returned, clearing the position information of the initial contact point and executing the obstacle avoidance cleaning task.
In an exemplary embodiment, based on the foregoing solution, the second collision detection unit is in a non-triggered state, or the state maintaining time of the triggered state of the second collision detection unit is less than a first time period, and the second determining module 830 is further configured to: determining a distance that the cleaning robot moves based on the position change information; and determining that the movable information of the obstacle is returnable when the distance of the movement of the cleaning robot is greater than a second distance and the first collision detection unit is in a triggered state.
In an exemplary embodiment, based on the foregoing solution, the second determining module 830 is further configured to: determining that the movable information of the obstacle is pushed askew if the distance over which the cleaning robot moves is less than the second distance and the state maintaining time of the non-triggered state of the first collision detection unit is longer than a second time period.
In an exemplary embodiment, based on the foregoing solution, the second determining module 830 is further configured to: acquiring position information of an initial contact point in response to the cleaning robot contacting the obstacle; acquiring boundary information of the obstacle in the case where the movable information of the obstacle is returnable; returning the obstacle based on the boundary information and the position information of the initial contact point; and in the case where the movable information of the obstacle is pushed askew, clearing the position information of the initial contact point and executing a cleaning task.
In an exemplary embodiment, based on the foregoing solution, the apparatus further includes a control module, where the control module is configured to: turn on the first collision detection unit in response to the cleaning robot contacting the obstacle; turn off the first collision detection unit in response to the cleaning robot separating from the obstacle; turn off the second collision detection unit when the first collision detection unit is turned on and the position information of the cleaning robot changes; and turn on the second collision detection unit when the first collision detection unit is turned on and the position information of the cleaning robot does not change.
In an exemplary embodiment, based on the foregoing scheme, the apparatus further includes a pre-determining module, where the pre-determining module is configured to: acquiring the characteristics of the obstacles; according to the above feature, a type of the obstacle is pre-determined, the type including: movable and immovable.
It should be noted that, when the obstacle detection apparatus provided in the foregoing embodiment executes the obstacle detection method, only the division of the above functional modules is taken as an example, and in practical applications, the above functions may be distributed to different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the above described functions. In addition, the obstacle detection apparatus and the obstacle detection method provided in the embodiments described above belong to the same concept, and therefore, for details that are not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the obstacle detection method described above in the present disclosure, which will not be described herein again.
The above-mentioned serial numbers of the embodiments of the present disclosure are merely for description, and do not represent the advantages or disadvantages of the embodiments.
The disclosed embodiments also provide a readable storage medium on which a computer program is stored; when executed by a processor, the program implements the steps of the method of any of the preceding embodiments. The readable storage medium may include, but is not limited to, any type of disk, including floppy disks, optical disks, DVDs, CD-ROMs, microdrives, and magneto-optical disks, as well as ROM (Read-Only Memory), RAM (Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), DRAM (Dynamic Random Access Memory), VRAM (Video Random Access Memory), flash memory devices, magnetic or optical cards, nanosystems, or any other type of medium or device suitable for storing instructions and/or data.
The embodiment of the disclosure further provides a cleaning robot, which includes a first collision detection unit, a second collision detection unit, a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the program, the steps of any of the above-mentioned embodiments are implemented.
Fig. 9 schematically illustrates a block diagram of a cleaning robot in an exemplary embodiment according to the present disclosure. Referring to fig. 9, the cleaning robot 900 includes: a first collision detection unit 902, a second collision detection unit 904, a processor 910 and a memory 920. The first collision detection unit 902 may be a contact sensor, a temperature sensor, or a distance sensor; the second collision detection unit 904 may be a strain gauge pressure sensor, a piezoresistive pressure sensor, or a capacitive pressure sensor.
In the embodiment of the present disclosure, the processor 910 is a control center of a computer system, and may be a processor of a physical machine or a processor of a virtual machine. The processor 910 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 910 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 910 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state.
In an embodiment of the present disclosure, the processor 910 is specifically configured to: determining position change information of the cleaning robot in response to contact of the cleaning robot with an obstacle; acquiring state information of the first collision detection unit and the second collision detection unit, wherein the state information comprises a trigger state and/or a state maintaining time; and determining the movable information of the obstacle based on the state information and the position change information.
Further, when the second collision detection unit is in a triggered state, determining the movable information of the obstacle based on the state information and the position change information includes: determining a distance that the cleaning robot moves based on the position change information; and determining that the movable information of the obstacle is immovable when the state maintaining time of the triggered state of the second collision detection unit is longer than a first time period and the distance over which the cleaning robot moves is shorter than a first distance.
Further, the method further comprises: and determining that the movable information of the obstacle cannot be returned when the state maintaining time of the triggered state of the second collision detecting unit is longer than the first time period and the distance traveled by the cleaning robot is longer than the first distance.
Further, the method further comprises: acquiring position information of an initial contact point in response to the cleaning robot contacting the obstacle; marking the position information of the initial contact point in a map and executing an obstacle avoidance cleaning task in the case where the movable information of the obstacle is immovable, wherein the map is prestored in the cleaning robot; and in the case where the movable information of the obstacle cannot be returned, clearing the position information of the initial contact point and executing the obstacle avoidance cleaning task.
Further, when the second collision detection unit is in a non-triggered state, or the state maintaining time of the triggered state of the second collision detection unit is shorter than a first time period, determining the movable information of the obstacle based on the state information and the position change information includes: determining a distance that the cleaning robot moves based on the position change information; and determining that the movable information of the obstacle is returnable when the distance of the movement of the cleaning robot is greater than a second distance and the first collision detection unit is in a triggered state.
Further, the method further comprises: determining that the movable information of the obstacle is pushed askew if the distance over which the cleaning robot moves is less than the second distance and the state maintaining time of the non-triggered state of the first collision detection unit is longer than a second time period.
Further, the method further comprises: acquiring position information of an initial contact point in response to the cleaning robot contacting the obstacle; acquiring boundary information of the obstacle in the case where the movable information of the obstacle is returnable; returning the obstacle based on the boundary information and the position information of the initial contact point; and in the case where the movable information of the obstacle is pushed askew, clearing the position information of the initial contact point and executing a cleaning task.
Further, the method further comprises: turning on the first collision detection unit in response to the cleaning robot contacting the obstacle; turning off the first collision detection unit in response to the cleaning robot separating from the obstacle; turning off the second collision detection unit when the first collision detection unit is turned on and the position information of the cleaning robot changes; and turning on the second collision detection unit when the first collision detection unit is turned on and the position information of the cleaning robot does not change.
Further, the method further comprises: acquiring the characteristics of the obstacles; according to the above feature, a type of the obstacle is pre-determined, the type including: movable and immovable.
Memory 920 may include one or more readable storage media, which may be non-transitory. Memory 920 can also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments of the present disclosure, a non-transitory readable storage medium in the memory 920 is used to store at least one instruction for execution by the processor 910 to implement a method in embodiments of the present disclosure.
In some embodiments, terminal 900 further includes: a peripheral interface 930 and at least one peripheral. The processor 910, memory 920 and peripheral interface 930 may be connected by bus or signal lines. Various peripheral devices may be connected to peripheral interface 930 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a display 940, a camera 950, and an audio circuit 960.
Peripheral interface 930 may be used to connect at least one peripheral associated with an I/O (Input/Output) to processor 910 and memory 920. In some embodiments of the present disclosure, processor 910, memory 920, and peripheral interface 930 are integrated on the same chip or circuit board; in some other embodiments of the present disclosure, any one or both of the processor 910, the memory 920, and the peripheral device interface 930 may be implemented on separate chips or circuit boards. The embodiments of the present disclosure are not particularly limited in this regard.
The display screen 940 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 940 is a touch display screen, the display screen 940 also has the ability to capture touch signals on or over the surface of the display screen 940. The touch signal may be input to the processor 910 as a control signal for processing. At this point, the display 940 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments of the present disclosure, the display screen 940 may be one, which is provided as a front panel of the terminal 900; in other embodiments of the present disclosure, the number of the display screens 940 may be at least two, and the display screens may be respectively disposed on different surfaces of the terminal 900 or in a folding design; in still other embodiments of the present disclosure, the display 940 may be a flexible display, disposed on a curved surface or a folded surface of the terminal 900. Even more, the display 940 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 940 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera 950 is used to capture images or video. Optionally, cameras 950 include front cameras and rear cameras. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments of the present disclosure, camera 950 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
Audio circuitry 960 may include a microphone and a speaker. The microphone collects sound waves from the user and the environment, converts them into electrical signals, and inputs the signals to the processor 910 for processing. For stereo acquisition or noise reduction, there may be multiple microphones disposed at different locations of the terminal 900. The microphone may also be an array microphone or an omnidirectional pickup microphone.
Power supply 970 is used to provide power to the various components in terminal 900. Power source 970 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When power source 970 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery can also be used to support fast charge technology.
The block diagram of the terminal structure shown in the embodiments of the present disclosure does not constitute a limitation of the terminal 900; the terminal 900 may include more or fewer components than those shown, combine some components, or adopt a different arrangement of components.
In the present disclosure, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or order; the term "plurality" means two or more unless explicitly defined otherwise. The terms "mounted," "connected," "fixed," and the like are to be construed broadly, and for example, "connected" may be a fixed connection, a removable connection, or an integral connection; "coupled" may be direct or indirect through an intermediary. The specific meaning of the above terms in the present disclosure can be understood by those of ordinary skill in the art as appropriate.
In the description of the present disclosure, it is to be understood that the terms "upper", "lower", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the present disclosure and simplifying the description, but do not indicate or imply that the referred device or unit must have a specific direction, be configured and operated in a specific orientation, and thus, should not be construed as limiting the present disclosure.
The above description covers only specific embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto. Any changes or substitutions that a person skilled in the art can readily conceive within the technical scope of the present disclosure shall be covered by the scope of the present disclosure. Accordingly, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. An obstacle detection method, applied to a cleaning robot provided with a first collision detection unit and a second collision detection unit, the method comprising:
determining position change information of the cleaning robot in response to contact of the cleaning robot with an obstacle;
acquiring state information of the first collision detection unit and the second collision detection unit, wherein the state information comprises a trigger state and/or a state maintaining time;
determining movable information of the obstacle based on the state information and the position change information.
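Outside the claim language, the "state information" and "position change information" of claim 1 can be sketched as minimal data types. All names, units, and the pose representation below are illustrative assumptions for exposition, not part of the claim:

```python
import math
from dataclasses import dataclass

@dataclass
class UnitState:
    """State information of one collision detection unit:
    its trigger state and how long that state has been held."""
    triggered: bool       # trigger state
    hold_time_s: float    # state maintaining time, in seconds

def moved_distance(pose_at_contact, pose_now):
    """Distance the robot has moved since first touching the obstacle,
    derived from the position change information.
    Poses are illustrative (x, y) tuples in metres."""
    dx = pose_now[0] - pose_at_contact[0]
    dy = pose_now[1] - pose_at_contact[1]
    return math.hypot(dx, dy)
```

Claims 2 and 4 then combine these two quantities to classify the obstacle.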
2. The obstacle detection method according to claim 1, wherein, in a case where the second collision detection unit is in a trigger state, the determining movable information of the obstacle based on the state information and the position change information comprises:
determining a distance that the cleaning robot moves based on the position change information;
determining that the movable information of the obstacle is immovable, in a case where the state maintaining time of the trigger state of the second collision detection unit is greater than a first duration and the distance that the cleaning robot moves is less than a first distance;
determining that the movable information of the obstacle is not restorable, in a case where the state maintaining time of the trigger state of the second collision detection unit is greater than the first duration and the distance that the cleaning robot moves is greater than the first distance.
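The two branches of claim 2 amount to a simple threshold test. A hedged sketch follows; the first duration and first distance are given illustrative values here, since the claim leaves their magnitudes open:

```python
def classify_second_unit_triggered(hold_time_s: float, moved_m: float,
                                   first_duration_s: float = 0.5,
                                   first_distance_m: float = 0.05) -> str:
    """Claim-2 branch: the second collision detection unit has held its
    trigger state for hold_time_s while the robot moved moved_m.
    Threshold values are illustrative assumptions only."""
    if hold_time_s > first_duration_s and moved_m < first_distance_m:
        # Robot pressed against the obstacle but barely moved: blocked.
        return "immovable"
    if hold_time_s > first_duration_s and moved_m > first_distance_m:
        # Robot kept moving while pressing: obstacle was pushed away.
        return "not restorable"
    return "undetermined"  # remaining cases fall under claim 4
```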
3. The obstacle detection method according to claim 2, characterized in that the method further comprises:
acquiring position information of an initial contact point in response to the cleaning robot contacting the obstacle;
in a case where the movable information of the obstacle is immovable, marking the position information of the initial contact point in a map and executing an obstacle avoidance and cleaning task, wherein the map is prestored in the cleaning robot;
and in a case where the movable information of the obstacle is not restorable, clearing the position information of the initial contact point and executing the obstacle avoidance and cleaning task.
4. The obstacle detection method according to claim 1, wherein, in a case where the second collision detection unit is in an untriggered state, or in a case where the state maintaining time of the trigger state of the second collision detection unit is less than a first duration, the determining the movable information of the obstacle based on the state information and the position change information comprises:
determining a distance that the cleaning robot moves based on the position change information;
determining that the movable information of the obstacle is to be restored, in a case where the distance that the cleaning robot moves is greater than a second distance and the first collision detection unit is in a trigger state;
and determining that the movable information of the obstacle is pushed askew, in a case where the distance that the cleaning robot moves is less than the second distance and the state maintaining time of the untriggered state of the first collision detection unit is greater than a second duration.
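Claim 4 covers the complementary case, where the second unit is untriggered (or only briefly triggered). A sketch with illustrative thresholds; the second distance and second duration are not fixed by the claim:

```python
def classify_second_unit_untriggered(moved_m: float,
                                     first_triggered: bool,
                                     first_untriggered_hold_s: float,
                                     second_distance_m: float = 0.10,
                                     second_duration_s: float = 0.5) -> str:
    """Claim-4 branch. All parameter names and threshold values
    are illustrative assumptions, not taken from the claim."""
    if moved_m > second_distance_m and first_triggered:
        # Robot keeps contact while moving: it is dragging a light
        # obstacle that should be moved back to its place.
        return "to be restored"
    if moved_m < second_distance_m and first_untriggered_hold_s > second_duration_s:
        # Contact was lost soon after a short push: the obstacle
        # was merely pushed askew.
        return "pushed askew"
    return "undetermined"
```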
5. The obstacle detection method according to claim 4, characterized in that the method further comprises:
acquiring position information of an initial contact point in response to the cleaning robot contacting the obstacle;
acquiring boundary information of the obstacle under the condition that the movable information of the obstacle is to be restored;
homing the obstacle based on the boundary information and the position information of the initial contact point;
and in a case where the movable information of the obstacle is pushed askew, clearing the position information of the initial contact point and executing a cleaning task.
6. The obstacle detection method according to any one of claims 1 to 5, further comprising:
the first collision detection unit is turned on in response to the cleaning robot coming into contact with the obstacle;
the first collision detecting unit is turned off in response to the cleaning robot being separated from the obstacle;
the second collision detecting unit is turned off in a case where the first collision detecting unit is turned on and the position information of the cleaning robot is changed;
the second collision detecting unit is turned on in a case where the first collision detecting unit is turned on and the position information of the cleaning robot is not changed.
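The on/off logic of claim 6 behaves like a small state machine: the first unit mirrors physical contact, while the second unit is on only while the robot is in contact and its position information is not changing. A sketch follows; the behavior when the first unit is off is an assumption, as the claim does not state it:

```python
class CollisionUnits:
    """Illustrative state machine for the two units of claim 6."""

    def __init__(self) -> None:
        self.first_on = False
        self.second_on = False

    def update(self, in_contact: bool, position_changed: bool) -> None:
        # First unit: turned on when the robot contacts the obstacle,
        # turned off when it separates from the obstacle.
        self.first_on = in_contact
        if self.first_on and not position_changed:
            # Contact while the position information is unchanged:
            # the robot is stuck against the obstacle.
            self.second_on = True
        else:
            # Position changed while in contact, or no contact at all
            # (the no-contact case is an assumption).
            self.second_on = False
```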
7. The obstacle detection method according to any one of claims 1 to 5, characterized by further comprising:
acquiring features of the obstacle;
pre-determining, according to the features, a type of the obstacle, the type including: movable and immovable.
8. An obstacle detection device, applied to a cleaning robot provided with a first collision detection unit and a second collision detection unit, the device comprising:
a first determination module, configured to determine position change information of the cleaning robot in response to contact of the cleaning robot with an obstacle;
an acquisition module, configured to acquire state information of the first collision detection unit and the second collision detection unit, the state information comprising a trigger state and/or a state maintaining time;
and a second determination module, configured to determine movable information of the obstacle based on the state information and the position change information.
9. A cleaning robot comprising a first collision detection unit, a second collision detection unit, a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the obstacle detection method according to any one of claims 1 to 7 when executing the computer program.
10. A readable storage medium on which a computer program is stored, which computer program, when being executed by a processor, carries out the obstacle detection method according to any one of claims 1 to 7.
CN202211082989.6A 2022-09-06 2022-09-06 Obstacle detection method, obstacle detection device, medium, and cleaning robot Pending CN115474863A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211082989.6A CN115474863A (en) 2022-09-06 2022-09-06 Obstacle detection method, obstacle detection device, medium, and cleaning robot


Publications (1)

Publication Number Publication Date
CN115474863A 2022-12-16

Family

ID=84393000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211082989.6A Pending CN115474863A (en) 2022-09-06 2022-09-06 Obstacle detection method, obstacle detection device, medium, and cleaning robot

Country Status (1)

Country Link
CN (1) CN115474863A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060108849A (en) * 2005-04-14 2006-10-18 엘지전자 주식회사 Method for controlling a cleaning robot according to detecting obstacle, the cleaning robot using the method
KR20090019479A (en) * 2007-08-21 2009-02-25 에이스로봇 주식회사 Method for sensing an obstacle of robot cleaning apparatus and robot cleaning apparatus adapted to the same
CN112237399A (en) * 2019-07-17 2021-01-19 尚科宁家(中国)科技有限公司 Obstacle avoiding method of cleaning robot and cleaning robot
US20210216070A1 (en) * 2020-01-10 2021-07-15 Bissell Inc. Autonomous floor cleaner and method for autonomous floor cleaning
WO2021157799A1 (en) * 2020-02-06 2021-08-12 엘지전자 주식회사 Robot cleaner and control method thereof
WO2022062470A1 (en) * 2020-09-28 2022-03-31 珠海一微半导体股份有限公司 Cleaning control method based on dense obstacles
CN114601399A (en) * 2021-12-08 2022-06-10 北京石头创新科技有限公司 Control method and device of cleaning equipment, cleaning equipment and storage medium


Similar Documents

Publication Publication Date Title
CN110522359B (en) Cleaning robot and control method of cleaning robot
US20220167820A1 (en) Method and Apparatus for Constructing Map of Working Region for Robot, Robot, and Medium
CN109730590B (en) Cleaning robot and method for automatically returning and charging same
KR101887055B1 (en) Robot cleaner and control method for thereof
WO2019144541A1 (en) Cleaning robot
WO2020207390A1 (en) Detection method and apparatus, and mobile robot and storage medium
EP3585571B1 (en) Moving robot and control method thereof
CN103845003B (en) Robot cleaner
CN110086905B (en) Video recording method and electronic equipment
CN114521836A (en) Automatic cleaning equipment
CN109920424A (en) Robot voice control method and device, robot and medium
US11801602B2 (en) Mobile robot and driving method thereof
CA2969202A1 (en) Vacuum cleaner
CN211022482U (en) Cleaning robot
WO2018233493A1 (en) Autonomous robot and control method, apparatus and system therefor, and computer readable storage medium
CN109254580A (en) The operation method of service equipment for self-traveling
JP2018190391A (en) Portable mobile robot and operation method thereof
CN110506415A (en) A kind of kinescope method and electronic equipment
JP2010231359A (en) Remote control device
WO2022017341A1 (en) Automatic recharging method and apparatus, storage medium, charging base, and system
TW201841586A (en) Method for operating a self-traveling vehicle
CN110505549A (en) The control method and device of earphone
US20220280007A1 (en) Mobile robot and method of controlling the same
CN109227600A (en) Device, robot, method and program
US20180329424A1 (en) Portable mobile robot and operation thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination