CN111158378A - Sweeping method of sweeping robot and sweeping robot

Sweeping method of sweeping robot and sweeping robot

Info

Publication number
CN111158378A
Authority
CN
China
Prior art keywords
sweeping
sweeping robot
obstacle
information
scene
Prior art date
Legal status
Pending
Application number
CN202010048000.4A
Other languages
Chinese (zh)
Inventor
马鑫磊
陈彦宇
谭泽汉
马雅奇
刘欢
谭龙田
林晟杰
周慧子
叶盛世
李茹
刘金龙
郭少峰
邓剑锋
汪立富
曾安福
黎小坚
孙波
杜洋
刘郑宇
张磊
邝英兰
许荣雪
刘晓龙
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai and Zhuhai Lianyun Technology Co Ltd
Priority to CN202010048000.4A
Publication of CN111158378A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0242: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0257: Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The invention discloses a sweeping method of a sweeping robot and a sweeping robot. In the method, the sweeping robot acquires contour information and height information of obstacles in the home scene where it is currently located; performs deep learning training based on the contour information and height information of the obstacles and determines that the current home scene is a first home scene; determines, based on the first home scene, a first cleaning mode corresponding to the first home scene; and cleans the ground in the first home scene according to the first cleaning mode. In this way, the sweeping robot can automatically select different cleaning modes for different home scenes, the user no longer needs to set a cleaning mode manually for each scene, and the user experience is improved.

Description

Sweeping method of sweeping robot and sweeping robot
Technical Field
The invention relates to the technical field of sweeping robots, in particular to a sweeping method of a sweeping robot and the sweeping robot.
Background
As living standards rise and the pace of life quickens, sweeping robots are used more and more widely. An existing sweeping robot requires its cleaning mode to be set manually before cleaning. For example, the sweeping robot is manually set to clean different home scenes with the same cleaning mode, or its cleaning mode is manually adjusted for each application scene.
However, having to set the cleaning mode manually is inconvenient for the user and degrades the user experience.
Disclosure of Invention
Embodiments of the invention provide a sweeping method of a sweeping robot and a sweeping robot, which solve the prior-art problem that the cleaning mode of a sweeping robot must be set manually.
In a first aspect, the embodiment of the invention provides a sweeping method of a sweeping robot, which includes:
acquiring contour information and height information of obstacles in a current household scene of the sweeping robot;
performing deep learning training based on the contour information and the height information of the obstacle, and determining that the current home scene of the sweeping robot is a first home scene; the household scenes at least comprise bedroom household scenes, kitchen household scenes and living room household scenes;
determining a first cleaning mode corresponding to the first home scene based on the first home scene; wherein the cleaning modes at least comprise a silent mode, a cyclic cleaning mode and a powerful cleaning mode; the volume of the sound generated by the sweeping robot when working in the silent mode is lower than a preset volume threshold, the sweeping robot cleans repeatedly when working in the cyclic cleaning mode, and the oil-stain cleaning strength applied when working in the powerful cleaning mode is higher than a preset oil-stain strength threshold;
sweeping the ground in the first home scenario according to the first sweeping mode.
In one possible design, acquiring contour information and height information of obstacles in a home scene where the sweeping robot is currently located includes:
horizontally scanning the obstacle through a laser radar to acquire the outline information of the obstacle; the laser radar is arranged in a robot body of the sweeping robot;
controlling the robot body to tilt relative to a first horizontal plane so that an inclination angle is formed between the emission direction of a detection signal of the laser radar and the first horizontal plane, and acquiring, through the laser radar, height information of the obstacle in the positive direction or the negative direction when the laser radar scans the obstacle at the inclination angle; if the emission direction of the detection signal is above the first horizontal plane and the inclination angle between the emission direction and the first horizontal plane is a first angle, acquiring the height information of the obstacle in the positive direction through the laser radar; if the emission direction of the detection signal is below the first horizontal plane and the inclination angle between the emission direction and the first horizontal plane is a second angle, acquiring the height information of the obstacle in the negative direction through the laser radar; the first horizontal plane is the horizontal plane where the laser radar is located when the robot body is in a horizontal state; the height information in the negative direction is the height information of the obstacle located below the first horizontal plane; and the height information in the positive direction is the height information of the obstacle located above the first horizontal plane.
In one possible design, performing deep learning training based on the contour information and the height information of the obstacle, and determining that the home scene where the sweeping robot is currently located is a first home scene includes:
performing deep learning training based on the contour information and the height information, and determining first size information of the obstacle; the first size information includes length information, width information, and height information of the obstacle;
acquiring a first corresponding relation between pre-stored size information and the type of an obstacle;
determining a type of the obstacle based on the first correspondence and the first size information;
and determining that the household scene where the sweeping robot is located currently is the first household scene based on the type of the obstacle.
In one possible design, sweeping the ground in the first home scenario according to the first sweeping mode includes:
constructing a map of the first home scene based on the contour information and the height information of the obstacle and the first home scene;
determining a sweeping path of the sweeping robot in the first household scene based on the map;
and cleaning the ground corresponding to the cleaning path according to the first cleaning mode.
In one possible design, determining, based on the first home scenario, a first sweeping mode corresponding to the first home scenario includes:
acquiring a second corresponding relation between a pre-stored household scene and a cleaning mode;
and determining a first cleaning mode corresponding to the first household scene based on the second corresponding relation and the first household scene.
In a second aspect, an embodiment of the present invention provides a sweeping robot, including:
the acquisition unit is used for acquiring the contour information and the height information of the obstacles in the current household scene of the sweeping robot;
the processing unit is used for carrying out deep learning training based on the contour information and the height information of the obstacle and determining that the current home scene of the sweeping robot is a first home scene; the household scenes at least comprise bedroom household scenes, kitchen household scenes and living room household scenes;
the processing unit is further configured to determine, based on the first home scene, a first cleaning mode corresponding to the first home scene; wherein the cleaning modes at least comprise a silent mode, a cyclic cleaning mode and a powerful cleaning mode; the volume of the sound generated by the sweeping robot when working in the silent mode is lower than a preset volume threshold, the sweeping robot cleans repeatedly when working in the cyclic cleaning mode, and the oil-stain cleaning strength applied when working in the powerful cleaning mode is higher than a preset oil-stain strength threshold;
a cleaning unit for cleaning the ground in the first home scenario according to the first cleaning mode.
In one possible design, the obtaining unit is specifically configured to:
horizontally scanning the obstacle through a laser radar to acquire the outline information of the obstacle; the laser radar is arranged in a robot body of the sweeping robot;
the processing unit is specifically configured to:
controlling the robot body to incline relative to a first horizontal plane so as to generate an inclination angle between the emission direction of the detection signal of the laser radar and the first horizontal plane; the first horizontal plane is a horizontal plane where the laser radar is located when the robot body is in a horizontal state;
the acquisition unit is further configured to acquire height information of the obstacle in a positive direction or a negative direction when the laser radar scans the obstacle at the inclination angle; if the transmitting direction of the detection signal is located above the first horizontal plane and the inclination angle between the transmitting direction of the detection signal and the first horizontal plane is a first angle, height information of the positive direction of the obstacle is obtained through the laser radar; if the transmitting direction of the detection signal is located below the first horizontal plane and the inclination angle between the transmitting direction of the detection signal and the first horizontal plane is a second angle, height information of the negative direction of the obstacle is obtained through the laser radar; the height information in the negative direction is height information which corresponds to the obstacle and is positioned below the first horizontal plane; the height information of the positive direction is the height information which is located above the first horizontal plane and corresponds to the barrier.
In one possible design, the obtaining unit is specifically configured to:
acquiring a first corresponding relation between pre-stored size information and the type of an obstacle;
the processing unit is specifically configured to:
performing deep learning training based on the contour information and the height information, and determining first size information of the obstacle; the first size information includes length information, width information, and height information of the obstacle;
determining a type of the obstacle based on the first correspondence and the first size information;
and determining that the household scene where the sweeping robot is located currently is the first household scene based on the type of the obstacle.
In one possible design, the sweeping unit is specifically configured to:
constructing a map of the first home scene based on the contour information and the height information of the obstacle and the first home scene;
determining a sweeping path of the sweeping robot in the first household scene based on the map;
and cleaning the ground corresponding to the cleaning path according to the first cleaning mode.
In one possible design, the obtaining unit is specifically configured to:
acquiring a second corresponding relation between a pre-stored household scene and a cleaning mode;
the processing unit is specifically configured to:
and determining a first cleaning mode corresponding to the first household scene based on the second corresponding relation and the first household scene.
In a third aspect, an embodiment of the present invention provides a sweeping robot, including: at least one processor and a memory; wherein the memory is configured to store one or more computer programs; and when the one or more computer programs stored in the memory are executed by the at least one processor, the sweeping robot is enabled to perform the method of the first aspect or any one of the possible designs of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, which stores computer instructions that, when executed on a computer, enable the computer to perform the method of the first aspect or any one of the possible designs of the first aspect.
The invention has the following beneficial effects:
in the embodiment of the invention, the sweeping robot acquires the contour information and the height information of the obstacles in the current home scene; performing deep learning training based on the contour information and the height information of the obstacle, and determining that the current home scene of the sweeping robot is a first home scene; determining a first cleaning mode corresponding to the first household scene based on the first household scene; sweeping the ground in the first home scenario according to a first sweeping mode. Through the mode, the sweeping robot can automatically determine different sweeping modes according to different household scenes, a user does not need to manually set different sweeping modes according to different household scenes, and the user experience can be improved.
Drawings
Fig. 1 is a schematic flow chart of a cleaning method of a cleaning robot according to an embodiment of the present invention;
fig. 2 is a schematic view illustrating a robot body of a sweeping robot provided in an embodiment of the present invention tilting upward relative to a first horizontal plane;
fig. 3 is a schematic view illustrating a robot body of a sweeping robot provided in an embodiment of the present invention being inclined downward with respect to a first horizontal plane;
fig. 4 is a schematic structural diagram of a sweeping robot according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a sweeping robot according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The shapes and sizes of the various elements in the drawings are not to scale and are merely intended to illustrate the invention.
In the embodiments of the present invention, "first" to "third" are used to distinguish different objects, not to describe a specific order. Furthermore, the term "comprises" and any variation thereof is intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may include other steps or elements that are not listed or that are inherent to such a process, method, article, or apparatus.
In this embodiment of the present invention, "and/or" is only one kind of association relation describing an associated object, and indicates that three kinds of relations may exist, for example, a and/or B may indicate: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the "/" character in the specification and claims of the present invention and the above drawings generally indicates that the former and latter related objects are in an "or" relationship.
In the embodiment of the present invention, the laser radar may be a 2D single-line ultraviolet laser radar, a 2D single-line infrared laser radar, a 2D single-line visible laser radar, or the like, and of course, other 2D single-line laser radars may also be used, which is not limited in the embodiment of the present invention.
In the embodiment of the invention, the household scenes at least comprise bedroom household scenes, living room household scenes and kitchen household scenes. For example, the home scene may include a washroom home scene, a balcony home scene, a study home scene, and the like, in addition to a bedroom home scene, a living room home scene, and a kitchen home scene. The home scene can be set according to actual requirements, and the embodiment of the invention is not limited.
In the embodiment of the invention, the cleaning modes of the sweeping robot at least comprise a silent mode, a cyclic cleaning mode and a powerful cleaning mode. When the sweeping robot works in the silent mode, the volume of the sound it generates is lower than a preset volume threshold (the threshold can be set according to actual requirements); when it works in the cyclic cleaning mode, it sweeps the same area repeatedly; and when it works in the powerful cleaning mode, the corresponding oil-stain cleaning strength (including the cleaning speed, the amount of degreasing fluid, the cleaning time, the friction against the ground, and so on) is higher than a preset oil-stain strength threshold (which can also be set according to actual requirements). Of course, besides these three cleaning modes, the sweeping robot may provide other modes, such as a reservation mode in which the robot automatically starts cleaning at a designated time. The reservation mode can be combined with the three cleaning modes above; for example, when the sweeping robot works in the silent mode combined with the reservation mode, it starts cleaning at the designated time and the volume of the sound generated during cleaning stays below the preset volume threshold. The cleaning modes of the sweeping robot can be set according to actual requirements, and the embodiment of the invention is not limited in this respect.
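The following sketch illustrates one way the three cleaning modes described above could be represented as parameter presets. The class, field names and numeric values are illustrative assumptions made for this description, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class CleaningMode:
    """Illustrative parameter set for one cleaning mode (names and values are assumptions)."""
    name: str
    max_volume_db: float   # generated sound must stay below this preset threshold
    repeat_passes: int     # how many times each area is swept
    degrease_level: int    # relative oil-stain cleaning strength (speed, fluid, friction)

# Hypothetical presets for the three modes named in the text.
SILENT   = CleaningMode("silent",   max_volume_db=45.0, repeat_passes=1, degrease_level=1)
CYCLIC   = CleaningMode("cyclic",   max_volume_db=60.0, repeat_passes=3, degrease_level=1)
POWERFUL = CleaningMode("powerful", max_volume_db=65.0, repeat_passes=1, degrease_level=3)
```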
As can be seen from the foregoing, the sweeping robot in the prior art has a problem that the cleaning mode needs to be set manually. In order to solve the problem, the embodiment of the invention provides a sweeping method of a sweeping robot.
For example, please refer to fig. 1, which is a schematic diagram of a cleaning method of a cleaning robot according to an embodiment of the present invention. As shown in fig. 1, the method includes:
s101, obtaining contour information and height information of obstacles in a current home scene where the sweeping robot is located.
Optionally, when the sweeping robot is started, it may horizontally scan the obstacles through a laser radar to obtain their contour information. The laser radar is arranged in the robot body of the sweeping robot. Specifically, the sweeping robot can scan through 360 degrees while the robot body is in the horizontal state, so as to obtain the contour information of the obstacles in the home scene where it is currently located.
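As a rough illustration of how a 360-degree horizontal sweep of a 2D single-line laser radar yields planar contour points, the following sketch converts per-beam ranges into Cartesian coordinates. The sensor interface (a list of ranges plus a fixed angular increment) is an assumption; the patent does not specify the data format.

```python
import math

def scan_to_contour(ranges_m, angle_increment_rad, min_range=0.05, max_range=8.0):
    """Convert one 360-degree 2D lidar sweep (one range per beam) into planar contour points.

    `ranges_m` is a list of distances, one per beam; out-of-range returns are filtered out.
    """
    points = []
    for i, r in enumerate(ranges_m):
        if not (min_range <= r <= max_range):
            continue  # drop dropped or out-of-range beams
        theta = i * angle_increment_rad
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```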
An existing sweeping robot cannot acquire height information of an obstacle in the negative direction through its single-line laser radar, so it cannot recognize an object that has height information in both the positive and negative directions (such as a staircase), which easily causes the robot to fall. For example, when the sweeping robot stands on the top step of a staircase, it cannot obtain the negative-direction height information of the staircase and therefore cannot recognize the steps below, so it is liable to fall. In addition, an existing sweeping robot also has difficulty acquiring positive-direction height information through the single-line laser radar, so it cannot recognize a space whose height is close to its own and is easily trapped in such a space. For example, when the robot meets a low table or chair whose underside clearance is close to the robot's height, it cannot obtain the positive-direction height information and therefore cannot judge the exact clearance, so it is easily stuck after driving underneath. In short, the existing sweeping robot cannot recognize objects with positive- and negative-direction height information and is prone to falling, and it cannot recognize spaces of approximately its own height and is prone to getting stuck in them. To solve these problems, the embodiment of the invention provides a method in which the sweeping robot recognizes the positive- and negative-direction height information of an obstacle through the laser radar.
The following specifically describes a process of identifying height information of the obstacle in the positive and negative directions by the sweeping robot according to the embodiment of the invention.
Optionally, the sweeping robot may control the robot body to tilt relative to the first horizontal plane, so that an inclination angle is generated between the emission direction of the detection signal of the laser radar and the first horizontal plane. The first horizontal plane is a horizontal plane where the laser radar is located when the robot body is in a horizontal state. Then, when the laser radar scans the obstacle based on the inclination angle, the height information of the obstacle in the positive direction or the negative direction can be obtained by scanning. The sweeping robot can acquire height information of the positive and negative directions of the obstacle, which is obtained by scanning of the laser radar, by adjusting an inclination angle formed between the emission direction of the detection signal of the laser radar and the first horizontal plane.
For example, referring to fig. 2, when the sweeping robot controls the robot body 200 to tilt upward (looking up) relative to the first horizontal plane 203 by, for example, 60 degrees, that is, when the emission direction of the detection signal of the laser radar 201 (the arrowed ray in fig. 2) lies above the first horizontal plane 203 and the included angle α between that emission direction and the first horizontal plane 203 is a first angle, for example 60 degrees, the laser radar 201 can acquire the height information of the obstacle 204 in the positive direction. The emission direction of the detection signal of the laser radar 201 is parallel to the plane 202 in which the robot body 200 lies, the positive-direction height information is the height information of the obstacle 204 located above the first horizontal plane 203, and the horizontal plane 205 is parallel to the first horizontal plane 203.
For example, referring to fig. 3, when the sweeping robot controls the robot body 200 to tilt downward (looking down) relative to the first horizontal plane 203 by, for example, 50 degrees, that is, when the emission direction of the detection signal of the laser radar 201 (the arrowed ray in fig. 3) lies below the first horizontal plane 203 and the included angle α between that emission direction and the first horizontal plane 203 is a second angle, for example 130 degrees, the laser radar 201 can acquire the height information of the obstacle 204 in the negative direction, the negative-direction height information being the height information of the obstacle 204 located below the first horizontal plane 203.
In the embodiment of the invention, the sweeping robot can tilt the robot body upward or downward relative to the first horizontal plane, that is, it can perform a looking-up or looking-down operation. In this way, the sweeping robot can acquire the height information of an obstacle in both the positive and negative directions through the laser radar, which prevents it from falling because it cannot recognize an object with negative-direction height information, and/or prevents it from getting stuck because it cannot recognize a space whose height is close to its own.
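The patent only states that positive- or negative-direction height information is obtained from the tilted scans; a standard trigonometric reconstruction, assuming the measured slant range and the tilt angle are known, would look roughly like this.

```python
import math

def height_from_tilted_beam(slant_range_m, tilt_deg, looking_up=True):
    """Height of the hit point relative to the first horizontal plane (the lidar plane).

    Assumes the beam leaves the lidar at `tilt_deg` above (looking_up=True) or below
    (looking_up=False) that plane and returns a slant range to the obstacle surface.
    """
    h = slant_range_m * math.sin(math.radians(tilt_deg))
    return h if looking_up else -h

# Example: a 2.0 m slant return with the body tilted up 60 degrees (as in fig. 2)
# corresponds to roughly +1.73 m of positive-direction height above the lidar plane.
print(round(height_from_tilted_beam(2.0, 60, looking_up=True), 2))   # 1.73
print(round(height_from_tilted_beam(1.5, 50, looking_up=False), 2))  # -1.15
```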
In a specific implementation, the sweeping robot can drive the robot body to tilt relative to the first horizontal plane through two driving wheels and a universal auxiliary wheel. For example, the sweeping robot may be provided with two driving wheels and one universal auxiliary wheel arranged under the robot body, the auxiliary wheel corresponding to the position of the laser radar, so that the three wheels form a triangular wheel layout. The sweeping robot can raise and lower the two driving wheels and the auxiliary wheel in order to tilt the robot body relative to the first horizontal plane. For example, to tilt the robot body upward by 60 degrees relative to the first horizontal plane, the sweeping robot can keep the heights of the two driving wheels unchanged and lift the universal auxiliary wheel, or keep the height of the universal auxiliary wheel unchanged and lower the two driving wheels, until the inclination angle between the emission direction of the detection signal of the laser radar and the first horizontal plane is 60 degrees. To tilt the robot body downward by 50 degrees relative to the first horizontal plane, the sweeping robot can keep the heights of the two driving wheels unchanged and lower the universal auxiliary wheel, or keep the height of the universal auxiliary wheel unchanged and lift the two driving wheels, until the inclination angle between the emission direction of the detection signal of the laser radar and the first horizontal plane is 130 degrees.
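As a back-of-the-envelope sketch of the wheel-lifting mechanism, the vertical offset between the auxiliary wheel and the driving wheels needed for a given body tilt can be estimated from the horizontal wheel spacing; the spacing value and function below are assumed for illustration, since the patent gives no dimensions.

```python
import math

def wheel_offset_for_tilt(tilt_deg, wheel_spacing_m=0.18):
    """Vertical offset between the universal auxiliary wheel and the driving wheels needed
    to tilt a rigid body by `tilt_deg`, given the horizontal distance between the wheel
    contact lines (the 0.18 m spacing is an illustrative value, not from the patent)."""
    return wheel_spacing_m * math.tan(math.radians(tilt_deg))

print(round(wheel_offset_for_tilt(60), 3))  # 0.312 m offset for a 60-degree tilt
```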
It should be noted that the manner in which the sweeping robot controls the driving wheels and the universal auxiliary wheels to perform the lifting operation may be mechanical driving, or air flotation, magnetic levitation, or the like, and the embodiment of the present invention is not limited.
In the embodiment of the invention, the two driving wheels and the universal auxiliary wheel of the sweeping robot have the lifting function, so that the robot body of the sweeping robot can be inclined relative to a first horizontal plane conveniently, the sweeping robot can acquire the height information of the obstacle through the laser radar, and the problem that the existing 2D single-line laser radar can only identify the outline information of the obstacle is solved.
S102, performing deep learning training based on the contour information and the height information of the obstacle, and determining that the current home scene of the sweeping robot is the first home scene.
Optionally, after the sweeping robot acquires the contour information and the height information of the obstacle, it may perform deep learning training based on that information to determine the first size information of the obstacle, where the first size information includes the length, width and height of the obstacle. In a specific implementation, before or after leaving the factory, the sweeping robot may collect target data (e.g., size information) of different pieces of furniture, that is, obstacles (such as sofas, chairs, cabinets, tables and beds), label the collected target data with feature quantities (e.g., the height, width and length of each obstacle), construct a convolutional neural network from the labelled target data, and perform deep learning training to generate a model corresponding to the obstacles. The sweeping robot then inputs the acquired contour information and height information of the obstacle into the trained model to obtain the first size information of the obstacle.
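A minimal sketch of such a convolutional model is shown below, written in PyTorch. The patent only says that a convolutional neural network is trained on labelled target data; the input encoding (two channels of 360 samples per scan), the layer sizes and the training step here are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class ObstacleSizeNet(nn.Module):
    """Toy convolutional regressor: contour and height samples in, (length, width, height) out."""
    def __init__(self, n_samples=360):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
        )
        self.head = nn.Linear(32 * 8, 3)  # predicted length, width, height in metres

    def forward(self, x):  # x: (batch, 2, n_samples) = [contour range, tilted-scan height]
        return self.head(self.features(x).flatten(1))

# One training step on dummy labelled data, for illustration only.
model = ObstacleSizeNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
scan = torch.randn(4, 2, 360)    # 4 scans, 2 channels, 360 samples each
sizes = torch.rand(4, 3) * 2.0   # labelled (length, width, height)
loss = nn.functional.mse_loss(model(scan), sizes)
loss.backward()
opt.step()
```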
In the embodiment of the invention, the sweeping robot determines the first size information of the obstacle by performing deep learning training on the acquired contour information and the acquired height information. Through the mode, the sweeping robot can identify the first size information of the obstacle, the problem that the existing sweeping robot can only identify the outline information of the obstacle through a 2D single-line laser radar is solved, and the accuracy of the sweeping robot in identifying the type of the obstacle is further improved.
Alternatively, the sweeping robot may acquire a pre-stored first correspondence between size information and obstacle type, and after determining the first size information of the obstacle, determine the type of the obstacle based on the first correspondence and the first size information. For example, suppose the pre-stored correspondence maps size information a (for example, a length of 1.2-2.4 m, a width of 1.2-2 m and a height of 0.5-1 m) to the obstacle type "bed". If the first size information is a length of 2 m, a width of 1.8 m and a height of 1 m, the sweeping robot can determine from the first correspondence that the obstacle is a bed.
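A minimal sketch of the first correspondence as a range-based lookup is given below; the "bed" size ranges follow the example in the preceding paragraph, while the other entries and the function name are illustrative assumptions.

```python
# Hypothetical first correspondence between size ranges (metres) and obstacle types.
FIRST_CORRESPONDENCE = [
    ("bed",   (1.2, 2.4), (1.2, 2.0), (0.5, 1.0)),   # (type, length, width, height ranges)
    ("sofa",  (1.5, 3.0), (0.7, 1.1), (0.7, 1.1)),
    ("chair", (0.4, 0.7), (0.4, 0.7), (0.7, 1.2)),
]

def obstacle_type(length, width, height):
    """Return the first obstacle type whose stored size ranges contain the measured size."""
    for kind, (lo_l, hi_l), (lo_w, hi_w), (lo_h, hi_h) in FIRST_CORRESPONDENCE:
        if lo_l <= length <= hi_l and lo_w <= width <= hi_w and lo_h <= height <= hi_h:
            return kind
    return "unknown"

print(obstacle_type(2.0, 1.8, 1.0))  # "bed", matching the worked example above
```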
In the embodiment of the invention, the sweeping robot can determine the type of the obstacle according to the first correspondence and the first size information, which makes it easier for the robot to identify obstacles and further improves the accuracy with which it identifies the home scene.
It should be noted that the sweeping robot may also determine the type of the obstacle in other ways, and the embodiment of the present invention is not limited in this respect. For example, when labelling the feature quantities of the target data of different obstacles, the sweeping robot may additionally label the type of each obstacle; after deep learning training on the labelled target data has produced a model corresponding to the obstacles, the sweeping robot can input the acquired contour information and height information into that model and obtain the obstacle type directly, without first determining the first size information and then deriving the type from it, which makes obstacle identification more convenient.
Optionally, after the sweeping robot determines the type of the obstacle, it may determine, based on the type of the obstacle, that the home scene where the sweeping robot is currently located is the first home scene. For example, a third correspondence between the type of the obstacle and a home scene may be prestored in the sweeping robot, for example, the home scene corresponding to the bed is a bedroom scene, the home scene corresponding to the sofa is a living room home scene, and the home scene corresponding to the cupboard is a kitchen home scene. After the sweeping robot determines the type of the obstacle, it can be determined that the home scene where the obstacle is located is the first home scene according to the third corresponding relationship, that is, it is determined that the home scene where the sweeping robot is currently located is the first home scene.
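The third correspondence can be pictured as a simple mapping from obstacle type to home scene, as in the sketch below; the dictionary entries follow the examples just given, and the voting rule for multiple recognised obstacles is an assumption the patent does not spell out.

```python
# Hypothetical third correspondence between obstacle type and home scene
# (bed -> bedroom, sofa -> living room, cupboard -> kitchen, as in the text).
THIRD_CORRESPONDENCE = {
    "bed": "bedroom",
    "sofa": "living room",
    "cupboard": "kitchen",
}

def scene_for_obstacles(obstacle_types):
    """Pick the scene implied by the recognised obstacles; the tie-breaking policy here
    (majority vote) is an assumption."""
    votes = [THIRD_CORRESPONDENCE[t] for t in obstacle_types if t in THIRD_CORRESPONDENCE]
    return max(set(votes), key=votes.count) if votes else "unknown"

print(scene_for_obstacles(["bed", "chair"]))  # "bedroom"
```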
In the embodiment of the invention, the sweeping robot can determine the home scene where it is located according to the third correspondence and the type of the obstacle, which improves the accuracy with which the sweeping robot identifies the home scene.
S103, determining a first cleaning mode corresponding to the first household scene based on the first household scene.
Optionally, the sweeping robot may store a second correspondence between the home scene and the sweeping mode in advance. When the sweeping robot determines that the current home scene is the first home scene, the second corresponding relationship can be obtained, and the first sweeping mode corresponding to the first home scene can be determined based on the second corresponding relationship and the first home scene. For example, please refer to table 1, which shows a second corresponding relationship between a home scene and a cleaning mode according to an embodiment of the present invention.
TABLE 1
Home scene               Cleaning mode
Bedroom home scene       Silent cleaning mode
Living room home scene   Cyclic cleaning mode
Kitchen home scene       Powerful cleaning mode
As shown in table 1, if the first home scene is a bedroom home scene, the sweeping robot determines that the corresponding first cleaning mode is the silent cleaning mode; if the first home scene is a living room home scene, the first cleaning mode is the cyclic cleaning mode; and if the first home scene is a kitchen home scene, the first cleaning mode is the powerful cleaning mode.
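A sketch of the second correspondence as a lookup table is shown below; the keys mirror Table 1, and the fallback value for scenes that are not listed is an assumption, not something the patent specifies.

```python
# Second correspondence from Table 1 (home scene -> cleaning mode).
SECOND_CORRESPONDENCE = {
    "bedroom": "silent",
    "living room": "cyclic",
    "kitchen": "powerful",
}

def cleaning_mode_for(scene, default="cyclic"):
    """Look up the first cleaning mode for the recognised first home scene."""
    return SECOND_CORRESPONDENCE.get(scene, default)

print(cleaning_mode_for("kitchen"))  # "powerful"
```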
It should be noted that table 1 is only an example in which the home scenes include a bedroom, a living room and a kitchen home scene and the cleaning modes include the silent, cyclic and powerful cleaning modes; when the sweeping robot supports other home scenes and other cleaning modes, table 1 may further include the correspondence between those other home scenes and other cleaning modes, which the embodiment of the present invention does not limit.
In the embodiment of the invention, after the sweeping robot determines that the current home scene is the first home scene, the first sweeping mode corresponding to the first home scene can be determined according to the second corresponding relation, so that the sweeping robot can automatically determine different sweeping modes according to different home scenes, a user does not need to manually set different sweeping modes according to different home scenes, and the user experience can be further improved.
And S104, cleaning the ground in the first household scene according to the first cleaning mode.
Optionally, when the sweeping robot cleans the ground of the first home scene according to the first cleaning mode, it may construct a map of the first home scene based on the contour information and the height information of the obstacles and the first home scene. For example, the sweeping robot may synthesize a map of the home scene that carries positive- and negative-direction height information (i.e., a three-dimensional map) from the acquired contour information and height information of the obstacles by using the Iterative Closest Point (ICP) algorithm and a General Polygon Clipping (GPC) algorithm. The sweeping robot can then determine, based on this map, its cleaning path in the first home scene and clean the ground along that path according to the first cleaning mode.
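The ICP and GPC steps named above are not reproduced here; the following simplified sketch only shows how a cleaning path might be derived once a map is available, using a plain occupancy grid and a back-and-forth (boustrophedon) sweep as a stand-in for the patent's three-dimensional map.

```python
def boustrophedon_path(occupancy, cell_size_m=0.2):
    """Very simplified coverage-path sketch over a 2D occupancy grid (True = obstacle).

    Sweeps rows back and forth and skips occupied cells; real planning on the patent's
    3D map would also respect the positive/negative height information.
    """
    path = []
    for r, row in enumerate(occupancy):
        cols = range(len(row)) if r % 2 == 0 else range(len(row) - 1, -1, -1)
        path.extend((c * cell_size_m, r * cell_size_m) for c in cols if not row[c])
    return path

grid = [
    [False, False, False, True],
    [False, True,  False, False],
    [False, False, False, False],
]
print(len(boustrophedon_path(grid)))  # 10 reachable cells to clean
```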
In the embodiment of the invention, the sweeping robot can construct a map of the first home scene that contains positive- and negative-direction height information from the contour information and the height information of the obstacles and the first home scene. This prevents the robot from falling while cleaning because it cannot recognize objects with positive- and negative-direction height information, and/or prevents it from getting stuck in a space whose height is close to its own because it cannot recognize such a space, thereby improving the robot's ability to avoid obstacles.
As can be seen from the above description, in the embodiment of the present invention, the sweeping robot acquires the contour information and the height information of the obstacle in the current home scene; performing deep learning training based on the contour information and the height information of the obstacle, and determining that the current home scene of the sweeping robot is a first home scene; determining a first cleaning mode corresponding to the first household scene based on the first household scene; sweeping the ground in the first home scenario according to a first sweeping mode. Through the mode, the sweeping robot can automatically determine different sweeping modes according to different household scenes, a user does not need to manually set different sweeping modes according to different household scenes, and the user experience can be improved.
Based on the same inventive concept, an embodiment of the invention provides a sweeping robot. Fig. 4 is a schematic structural diagram of a sweeping robot according to an embodiment of the present invention.
As shown in fig. 4, the sweeping robot 400 includes:
the acquiring unit 401 is configured to acquire contour information and height information of an obstacle in a home scene where the sweeping robot 400 is currently located;
the processing unit 402 is configured to perform deep learning training based on the contour information and the height information of the obstacle, and determine that the current home scene of the sweeping robot 400 is a first home scene; the household scenes at least comprise bedroom household scenes, kitchen household scenes and living room household scenes;
the processing unit 402 is further configured to determine, based on the first home scene, a first cleaning mode corresponding to the first home scene; wherein the cleaning modes at least comprise a silent mode, a cyclic cleaning mode and a powerful cleaning mode; the volume of the sound generated when the sweeping robot 400 works in the silent mode is lower than a preset volume threshold, the sweeping robot 400 cleans repeatedly when working in the cyclic cleaning mode, and the oil-stain cleaning strength applied when it works in the powerful cleaning mode is higher than a preset oil-stain strength threshold;
a cleaning unit 403 for cleaning the ground in the first home scene according to the first cleaning mode.
In one possible design, the obtaining unit 401 is specifically configured to:
horizontally scanning the obstacle through a laser radar to obtain the outline information of the obstacle; the laser radar is arranged in the robot body of the sweeping robot 400;
the processing unit 402 is specifically configured to:
the robot body is controlled to incline relative to a first horizontal plane, so that an inclination angle is formed between the emission direction of a detection signal of the laser radar and the first horizontal plane; the first horizontal plane is a horizontal plane where the laser radar is located when the robot body is in a horizontal state;
the obtaining unit 401 is further configured to obtain height information of the obstacle in the positive direction or the negative direction when the laser radar scans the obstacle at the oblique angle; if the transmitting direction of the detection signal is located above a first horizontal plane and the inclination angle between the transmitting direction of the detection signal and the first horizontal plane is a first angle, height information of the positive direction of the obstacle is obtained through a laser radar; if the transmitting direction of the detection signal is located below the first horizontal plane and the inclination angle between the transmitting direction of the detection signal and the first horizontal plane is a second angle, height information of the obstacle in the negative direction is obtained through a laser radar; the height information in the negative direction is height information which corresponds to the barrier and is positioned below the first horizontal plane; the height information in the positive direction is height information which corresponds to the obstacle and is located above the first horizontal plane.
In one possible design, the obtaining unit 401 is specifically configured to:
acquiring a first corresponding relation between pre-stored size information and the type of an obstacle;
the processing unit 402 is specifically configured to:
performing deep learning training based on the contour information and the height information, and determining first size information of the obstacle; the first size information includes length information, width information, and height information of the obstacle;
determining the type of the obstacle based on the first corresponding relation and the first size information;
based on the type of the obstacle, it is determined that the home scene where the sweeping robot 400 is currently located is the first home scene.
In one possible design, the sweeping unit 403 is specifically configured to:
constructing a map of a first household scene based on the contour information and the height information of the barrier and the first household scene;
determining a sweeping path of the sweeping robot 400 in a first home scene based on the map;
and according to the first cleaning mode, cleaning the ground corresponding to the cleaning path.
In one possible design, the obtaining unit 401 is specifically configured to:
acquiring a second corresponding relation between a pre-stored household scene and a cleaning mode;
the processing unit 402 is specifically configured to:
and determining a first cleaning mode corresponding to the first household scene based on the second corresponding relation and the first household scene.
The sweeping robot 400 in the embodiment of the present invention and the sweeping method of the sweeping robot shown in fig. 1 are based on the same concept; from the foregoing detailed description of the sweeping method, a person skilled in the art can clearly understand the implementation process of the sweeping robot 400 in this embodiment, so for brevity it is not repeated here.
Based on the same inventive concept, an embodiment of the invention provides a sweeping robot. Fig. 5 is a schematic structural diagram of a sweeping robot according to an embodiment of the present invention.
As shown in fig. 5, the sweeping robot 500 includes:
a memory 501 for storing one or more computer instructions;
at least one processor 502 configured to read computer instructions from the memory 501 to enable the sweeping robot 500 to implement all or some of the steps in the embodiment shown in fig. 1.
Optionally, the memory 501 may include a high-speed random access memory, and may further include a nonvolatile memory, such as a magnetic disk storage device, a flash memory device, or other nonvolatile solid state storage devices, and the like, which is not limited in the embodiments of the present invention.
Alternatively, the processor 502 may be a general purpose processor (CPU), or an ASIC, or an FPGA, or may be one or more integrated circuits for controlling program execution.
In some embodiments, the memory 501 and the processor 502 may be implemented on the same chip, and in other embodiments, they may be implemented on separate chips, which is not limited by the embodiments of the present invention.
Based on the same inventive concept, embodiments of the present invention provide a computer-readable storage medium, which stores computer instructions, and when the computer instructions are executed by a computer, the computer instructions cause the computer to perform the steps of the cleaning method of the sweeping robot.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (12)

1. A sweeping method of a sweeping robot is characterized by comprising the following steps:
acquiring contour information and height information of obstacles in a current household scene of the sweeping robot;
performing deep learning training based on the contour information and the height information of the obstacle, and determining that the current home scene of the sweeping robot is a first home scene; the household scenes at least comprise bedroom household scenes, kitchen household scenes and living room household scenes;
determining a first sweeping mode corresponding to the first household scene based on the first household scene; wherein, the cleaning mode at least comprises a mute mode, a circulation cleaning mode and a powerful cleaning mode; the volume of the sound generated by the sweeping robot when the sweeping robot works in the silent mode is lower than a preset volume threshold value, the sweeping robot repeatedly cleans for multiple times when the sweeping robot works in the circulating sweeping mode, and the corresponding oil stain cleaning strength is higher than the preset oil stain strength threshold value when the sweeping robot works in the powerful sweeping mode;
sweeping the ground in the first home scenario according to the first sweeping mode.
2. The method of claim 1, wherein obtaining contour information and height information of obstacles in a home scene in which the sweeping robot is currently located comprises:
horizontally scanning the obstacle through a laser radar to acquire the outline information of the obstacle; the laser radar is arranged in a robot body of the sweeping robot;
controlling the robot body to incline relative to a first horizontal plane so as to enable an inclination angle to be generated between the emission direction of a detection signal of the laser radar and the first horizontal plane, and enabling the laser radar to acquire height information of the obstacle in the positive direction or the negative direction when the laser radar scans the obstacle at the inclination angle; if the transmitting direction of the detection signal is located above the first horizontal plane and the inclination angle between the transmitting direction of the detection signal and the first horizontal plane is a first angle, height information of the positive direction of the obstacle is obtained through the laser radar; if the transmitting direction of the detection signal is located below the first horizontal plane and the inclination angle between the transmitting direction of the detection signal and the first horizontal plane is a second angle, height information of the negative direction of the obstacle is obtained through the laser radar; the first horizontal plane is a horizontal plane where the laser radar is located when the robot body is in a horizontal state; the height information in the negative direction is height information which corresponds to the obstacle and is positioned below the first horizontal plane; the height information of the positive direction is the height information which is located above the first horizontal plane and corresponds to the barrier.
3. The method of claim 1, wherein performing deep learning training based on the contour information and the height information of the obstacle, and determining that the household scene where the sweeping robot is currently located is the first household scene comprises:
performing deep learning training based on the contour information and the height information, and determining first size information of the obstacle, wherein the first size information comprises length information, width information and height information of the obstacle;
acquiring a pre-stored first correspondence between size information and obstacle types;
determining the type of the obstacle based on the first correspondence and the first size information;
and determining, based on the type of the obstacle, that the household scene where the sweeping robot is currently located is the first household scene.
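
For illustration, the "first correspondence" of claim 3 can be thought of as a lookup from size ranges to obstacle types, followed by a mapping from obstacle types to household scenes. All ranges, names and the fallback scene in the sketch below are hypothetical; the patent does not disclose concrete values.

    # Hypothetical first correspondence between size ranges (metres) and obstacle types,
    # plus an obstacle-type-to-scene mapping; values are illustrative only.
    SIZE_TO_TYPE = [
        # (min_len, max_len, min_wid, max_wid, min_hgt, max_hgt, obstacle_type)
        (1.8, 2.2, 1.2, 2.0, 0.3, 0.7, "bed"),
        (1.5, 3.0, 0.7, 1.1, 0.6, 1.0, "sofa"),
        (0.5, 0.9, 0.5, 0.8, 1.4, 2.0, "refrigerator"),
    ]
    TYPE_TO_SCENE = {"bed": "bedroom", "sofa": "living_room", "refrigerator": "kitchen"}

    def classify_obstacle(length, width, height):
        for lo_l, hi_l, lo_w, hi_w, lo_h, hi_h, kind in SIZE_TO_TYPE:
            if lo_l <= length <= hi_l and lo_w <= width <= hi_w and lo_h <= height <= hi_h:
                return kind
        return "unknown"

    def infer_scene(first_size_infos):
        """first_size_infos: list of (length, width, height) tuples for detected obstacles."""
        votes = [TYPE_TO_SCENE.get(classify_obstacle(*size)) for size in first_size_infos]
        votes = [v for v in votes if v is not None]
        return max(set(votes), key=votes.count) if votes else "living_room"
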
4. The method of any one of claims 1-3, wherein sweeping the ground in the first household scene according to the first sweeping mode comprises:
constructing a map of the first household scene based on the contour information and the height information of the obstacle and the first household scene;
determining a sweeping path of the sweeping robot in the first household scene based on the map;
and sweeping the ground corresponding to the sweeping path according to the first sweeping mode.
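
For illustration, the map-then-path step of claim 4 could be realized with an occupancy grid built from obstacle footprints and a simple boustrophedon (S-shaped) coverage path over the free cells. The sketch below uses NumPy and hypothetical parameters (room size, grid resolution); it is one possible realization, not the patented one.

    import numpy as np

    def build_scene_map(obstacle_boxes, room_size_m=6.0, resolution_m=0.1):
        """Occupancy grid of the first household scene.

        obstacle_boxes -- (x0, y0, x1, y1) footprints in metres, derived from the
                          contour and height information of each obstacle.
        """
        n = int(room_size_m / resolution_m)
        grid = np.zeros((n, n), dtype=bool)
        for x0, y0, x1, y1 in obstacle_boxes:
            grid[int(y0 / resolution_m):int(y1 / resolution_m),
                 int(x0 / resolution_m):int(x1 / resolution_m)] = True
        return grid

    def boustrophedon_path(grid):
        """Row-by-row (S-shaped) sweeping path over the free cells of the grid."""
        path = []
        for r in range(grid.shape[0]):
            cols = range(grid.shape[1]) if r % 2 == 0 else range(grid.shape[1] - 1, -1, -1)
            path.extend((r, c) for c in cols if not grid[r, c])
        return path
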
5. The method of claim 4, wherein determining, based on the first household scene, the first sweeping mode corresponding to the first household scene comprises:
acquiring a pre-stored second correspondence between household scenes and sweeping modes;
and determining the first sweeping mode corresponding to the first household scene based on the second correspondence and the first household scene.
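
For illustration, the pre-stored "second correspondence" of claim 5 could be kept in any persistent form; a JSON file is one plausible choice. The file name, keys and default mode below are assumptions, complementing the hard-coded table sketched after claim 1.

    import json

    # Example content of the (hypothetical) file scene_modes.json:
    #   {"bedroom": "silent", "living_room": "cyclic", "kitchen": "powerful"}

    def load_second_correspondence(path="scene_modes.json"):
        """Load the pre-stored household-scene-to-sweeping-mode correspondence."""
        with open(path, encoding="utf-8") as f:
            return json.load(f)

    def first_sweeping_mode(first_scene, correspondence, default="cyclic"):
        """Look up the first sweeping mode for the recognized first household scene."""
        return correspondence.get(first_scene, default)
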
6. A sweeping robot, characterized by comprising:
an acquiring unit, configured to acquire contour information and height information of an obstacle in a household scene where the sweeping robot is currently located;
a processing unit, configured to perform deep learning training based on the contour information and the height information of the obstacle and to determine that the household scene where the sweeping robot is currently located is a first household scene, wherein the household scenes at least comprise a bedroom household scene, a kitchen household scene and a living room household scene;
the processing unit is further configured to determine, based on the first household scene, a first sweeping mode corresponding to the first household scene, wherein the sweeping modes at least comprise a silent mode, a cyclic sweeping mode and a powerful sweeping mode; in the silent mode, the volume of the sound generated by the sweeping robot during operation is lower than a preset volume threshold; in the cyclic sweeping mode, the sweeping robot sweeps the same area repeatedly; and in the powerful sweeping mode, the oil-stain cleaning strength of the sweeping robot is higher than a preset oil-stain strength threshold;
a sweeping unit, configured to sweep the ground in the first household scene according to the first sweeping mode.
7. The sweeping robot of claim 6, wherein the acquiring unit is specifically configured to:
horizontally scan the obstacle by a laser radar to acquire the contour information of the obstacle, wherein the laser radar is arranged in a robot body of the sweeping robot;
the processing unit is specifically configured to:
control the robot body to tilt relative to a first horizontal plane so that an inclination angle is formed between the emission direction of a detection signal of the laser radar and the first horizontal plane, wherein the first horizontal plane is the horizontal plane where the laser radar is located when the robot body is in a horizontal state;
the acquiring unit is further configured to acquire height information of the obstacle in a positive direction or in a negative direction when the laser radar scans the obstacle at the inclination angle; wherein, if the emission direction of the detection signal is located above the first horizontal plane and the inclination angle between the emission direction and the first horizontal plane is a first angle, the height information of the obstacle in the positive direction is acquired by the laser radar; if the emission direction of the detection signal is located below the first horizontal plane and the inclination angle between the emission direction and the first horizontal plane is a second angle, the height information of the obstacle in the negative direction is acquired by the laser radar; the height information in the negative direction is the height information of the obstacle located below the first horizontal plane; and the height information in the positive direction is the height information of the obstacle located above the first horizontal plane.
8. The sweeping robot of claim 6, wherein the acquiring unit is specifically configured to:
acquire a pre-stored first correspondence between size information and obstacle types;
the processing unit is specifically configured to:
perform deep learning training based on the contour information and the height information, and determine first size information of the obstacle, wherein the first size information comprises length information, width information and height information of the obstacle;
determine the type of the obstacle based on the first correspondence and the first size information;
and determine, based on the type of the obstacle, that the household scene where the sweeping robot is currently located is the first household scene.
9. The sweeping robot of any one of claims 6-8, wherein the sweeping unit is specifically configured to:
construct a map of the first household scene based on the contour information and the height information of the obstacle and the first household scene;
determine a sweeping path of the sweeping robot in the first household scene based on the map;
and sweep the ground corresponding to the sweeping path according to the first sweeping mode.
10. The sweeping robot of claim 9, wherein the acquiring unit is specifically configured to:
acquire a pre-stored second correspondence between household scenes and sweeping modes;
the processing unit is specifically configured to:
determine the first sweeping mode corresponding to the first household scene based on the second correspondence and the first household scene.
11. A sweeping robot, characterized by comprising: at least one processor and a memory;
the memory is configured to store one or more computer programs;
wherein the one or more computer programs stored in the memory, when executed by the at least one processor, cause the sweeping robot to perform the method of any one of claims 1-5.
12. A computer-readable storage medium having stored thereon computer instructions which, when executed on a computer, cause the computer to perform the method of any one of claims 1-5.
CN202010048000.4A 2020-01-16 2020-01-16 Sweeping method of sweeping robot and sweeping robot Pending CN111158378A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010048000.4A CN111158378A (en) 2020-01-16 2020-01-16 Sweeping method of sweeping robot and sweeping robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010048000.4A CN111158378A (en) 2020-01-16 2020-01-16 Sweeping method of sweeping robot and sweeping robot

Publications (1)

Publication Number Publication Date
CN111158378A true CN111158378A (en) 2020-05-15

Family

ID=70563520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010048000.4A Pending CN111158378A (en) 2020-01-16 2020-01-16 Sweeping method of sweeping robot and sweeping robot

Country Status (1)

Country Link
CN (1) CN111158378A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN207148647U (en) * 2017-04-26 2018-03-27 北京饮冰科技有限公司 Positioning and navigation mapping board for a sweeping robot
CN107092254A (en) * 2017-04-27 2017-08-25 北京航空航天大学 Design method for a household sweeping robot based on deep reinforcement learning
CN110390237A (en) * 2018-04-23 2019-10-29 北京京东尚科信息技术有限公司 Point cloud data processing method and system
CN110623601A (en) * 2018-06-21 2019-12-31 科沃斯机器人股份有限公司 Ground material identification method and device, sweeping robot and storage medium
CN108742360A (en) * 2018-09-03 2018-11-06 信利光电股份有限公司 Cleaning method, apparatus, device and storage medium for a sweeping robot
CN109717796A (en) * 2018-11-21 2019-05-07 北京石头世纪科技股份有限公司 Intelligent cleaning equipment
CN109700383A (en) * 2019-01-17 2019-05-03 深圳乐动机器人有限公司 Robot cleaning method, robot and terminal device
CN209915877U (en) * 2019-01-29 2020-01-10 国科光芯(海宁)科技股份有限公司 Sweeping robot with multi-line laser radar
CN110622085A (en) * 2019-08-14 2019-12-27 珊口(深圳)智能科技有限公司 Mobile robot and control method and control system thereof
CN110353583A (en) * 2019-08-21 2019-10-22 追创科技(苏州)有限公司 Automatic control method of a sweeping robot, and sweeping robot
CN110393482A (en) * 2019-09-03 2019-11-01 深圳飞科机器人有限公司 Map processing method and cleaning robot

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111631650A (en) * 2020-06-05 2020-09-08 上海黑眸智能科技有限责任公司 Indoor plan generating method, system and terminal based on obstacle height detection and sweeping robot
CN111700553A (en) * 2020-06-05 2020-09-25 科沃斯机器人股份有限公司 Obstacle avoidance method, device, robot and storage medium
CN111631650B (en) * 2020-06-05 2021-12-03 上海黑眸智能科技有限责任公司 Indoor plan generating method, system and terminal based on obstacle height detection and sweeping robot
CN111700553B (en) * 2020-06-05 2021-12-24 深圳瑞科时尚电子有限公司 Obstacle avoidance method, device, robot and storage medium
CN113892859A (en) * 2020-06-22 2022-01-07 珠海格力电器股份有限公司 Control method of sweeping robot and sweeping robot
CN111830532A (en) * 2020-07-22 2020-10-27 厦门市和奕华光电科技有限公司 Multi-module multiplexing laser radar and sweeping robot
CN112690710A (en) * 2020-12-29 2021-04-23 深圳市云视机器人有限公司 Obstacle trafficability judging method, obstacle trafficability judging device, computer device, and storage medium
CN112690710B (en) * 2020-12-29 2021-10-26 深圳市云视机器人有限公司 Obstacle trafficability judging method, obstacle trafficability judging device, computer device, and storage medium
CN112741562A (en) * 2020-12-30 2021-05-04 苏州三六零机器人科技有限公司 Sweeper control method, sweeper control device, sweeper control equipment and computer readable storage medium
WO2023078323A1 (en) * 2021-11-08 2023-05-11 追觅创新科技(苏州)有限公司 Self-moving device, obstacle detection method of self-moving device, and storage medium

Similar Documents

Publication Publication Date Title
CN111158378A (en) Sweeping method of sweeping robot and sweeping robot
CN105793790B (en) Prioritizing cleaning zones
CN108885453A (en) The division of map for robot navigation
JP6750921B2 (en) Robot vacuum cleaner
US20190332121A1 (en) Moving robot and control method thereof
EP3629869B1 (en) Method of detecting a difference in level of a surface in front of a robotic cleaning device
US20190045992A1 (en) Method for operating an autonomously traveling floor treatment device
CN106940560A (en) Surveying and mapping with region division
CN109582015B (en) Indoor cleaning planning method and device and robot
CN108836195A (en) Method for getting a sweeping robot out of a stuck state, and sweeping robot
CN111096714A (en) Control system and method of sweeping robot and sweeping robot
CN112315379B (en) Mobile robot, control method and device thereof, and computer readable medium
CN111374603A (en) Control method and chip for partitioned cleaning of vision robot and intelligent sweeping robot
CN113841098A (en) Detecting objects using line arrays
CN111631650B (en) Indoor plan generating method, system and terminal based on obstacle height detection and sweeping robot
KR102397035B1 (en) Use of augmented reality to exchange spatial information with robotic vacuums
CN113786125A (en) Operation method, self-moving device and storage medium
CN112711250B (en) Self-walking equipment movement control method and self-walking equipment
KR101291149B1 (en) Apparatus and method for estimating ceiling height of mobile robot
CN114569003A (en) Control method and device of removable device, removable device and storage medium
CN113749585B (en) Semantic-based self-adaptive sweeping method for sweeping robot
CN111419115A (en) Control method of intelligent sweeping robot and intelligent sweeping robot
CN114489058A (en) Sweeping robot, path planning method and device thereof and storage medium
CN116300844A (en) Intelligent control method and device for cleaning equipment
CN114587210A (en) Cleaning robot control method and control device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200515