CN110936370A - Cleaning robot control method and device - Google Patents

Cleaning robot control method and device

Info

Publication number
CN110936370A
CN110936370A
Authority
CN
China
Prior art keywords
obstacle
image information
cleaning robot
attribute
threshold
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811115975.3A
Other languages
Chinese (zh)
Inventor
吴少波
连园园
陈浩广
冯德兵
谌进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Gree Wuhan Electric Appliances Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Gree Wuhan Electric Appliances Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Gree Wuhan Electric Appliances Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN201811115975.3A
Publication of CN110936370A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24Floor-sweeping machines, motor-driven
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks
    • B25J11/0085Cleaning

Abstract

The invention discloses a cleaning robot control method and device. The method comprises: acquiring image information of an obstacle; inputting the image information into a recognition model, which outputs the obstacle type and obstacle attribute corresponding to the image information, wherein the recognition model is obtained through machine learning training with multiple groups of training data, each group comprising image information and the obstacle type and obstacle attribute corresponding to that image information, and the obstacle attribute includes the size of the obstacle; and controlling the cleaning robot according to the obstacle type and the obstacle attribute. The invention solves the technical problem that cleaning robots in the related art have weak detection capability for target obstacles.

Description

Cleaning robot control method and device
Technical Field
The invention relates to the technical field of cleaning robot control, in particular to a cleaning robot control method and device.
Background
In modern society, with the continuous development and progress of science and technology, robots are applied more and more widely. The floor-sweeping robot, for example, is one of the most common cleaning robots in daily life and, as a substitute for human labor, plays an irreplaceable role. In terms of market penetration, the cleaning robot market in the United States has reached a penetration rate of 10% to 11%, while in China it is still below 5%, so the development prospects of cleaning robots in China are very broad. However, the cleaning robots now widely available on the market have the following problems: route planning is unclear, and the ability to detour around obstacles is poor. The root cause of these problems is that cleaning robots in the prior art have a weak detection capability for target obstacles.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a cleaning robot control method and device, which at least solve the technical problem that the cleaning robot is weak in detection capability on a target obstacle in the related art.
According to an aspect of an embodiment of the present invention, there is provided a cleaning robot control method including: acquiring image information of an obstacle; inputting the image information into a recognition model, and outputting the obstacle type and the obstacle attribute corresponding to the image information by the recognition model, wherein the recognition model is obtained by using multiple sets of training data through machine learning training, and each set of data in the multiple sets of training data comprises: the image information and the obstacle type and the obstacle attribute corresponding to the image information, wherein the obstacle attribute comprises the size of an obstacle; and controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle.
Optionally, acquiring the image information of the obstacle includes: shooting a picture through an image acquisition device arranged on the cleaning robot; and determining image information according to the photo.
Optionally, the recognition model is established through an online hard example mining algorithm and a positive and negative sample balance optimization training method.
Optionally, controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle includes: determining an action for the obstacle according to the type of the obstacle, wherein the action comprises at least one of the following: detouring, going straight, and crossing over; and judging whether the action is feasible according to the attribute of the obstacle, and executing the action if it is feasible.
Optionally, determining the action for the obstacle according to the type of the obstacle comprises: selecting a detour action when the obstacle is a boundary obstacle; and/or, when the obstacle is a solid obstacle whose hardness exceeds a hardness threshold, judging whether going straight is possible, and if so, selecting to go straight; and/or, when the obstacle is a soft obstacle whose hardness does not exceed the hardness threshold, judging whether crossing over is possible, and if so, selecting to cross over.
Optionally, determining whether going straight is possible comprises: judging whether the robot can go straight according to the weight and the size of the solid obstacle; and determining that going straight is possible when the weight and the size of the solid obstacle do not exceed a first threshold, wherein the first threshold comprises a first weight threshold and a first size threshold.
Optionally, determining whether crossing over is possible comprises: judging whether the robot can cross over the soft obstacle according to the weight and the size of the soft obstacle; and determining that crossing over is possible when the weight and the size of the soft obstacle do not exceed a second threshold, wherein the second threshold comprises a second weight threshold and a second size threshold.
According to another aspect of the embodiments of the present invention, there is also provided a cleaning robot control apparatus including: the acquisition module is used for acquiring the image information of the obstacle; the recognition module is used for inputting the image information into a recognition model, and outputting the obstacle type and the obstacle attribute corresponding to the image information by the recognition model, wherein the recognition model is obtained by using multiple groups of training data through machine learning training, and each group of data in the multiple groups of training data comprises: the image information and the obstacle type and the obstacle attribute corresponding to the image information, wherein the obstacle attribute comprises the size of an obstacle; and the control module is used for controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium storing program instructions, wherein when the program instructions are executed, the apparatus on which the storage medium is located is controlled to execute any one of the above methods.
According to another aspect of the embodiments of the present invention, there is also provided a processor configured to run a program, wherein the program, when running, performs the method described in any one of the above.
In the embodiment of the invention, image information of an obstacle is acquired; the image information is input into a recognition model, and the recognition model outputs the obstacle type and the obstacle attribute corresponding to the image information, wherein the recognition model is obtained through machine learning training with multiple sets of training data, each set comprising image information and the obstacle type and obstacle attribute corresponding to that image information, and the obstacle attribute includes the size of the obstacle; the cleaning robot is then controlled according to the obstacle type and the obstacle attribute. This achieves the purpose of accurately identifying the type and attribute of the obstacle in the image information through the recognition model, thereby improving the detection efficiency and precision for the target obstacle, achieving the technical effect of controlling the cleaning robot more effectively, and solving the technical problem that cleaning robots in the related art have weak detection capability for target obstacles.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a flowchart of a cleaning robot control method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a cleaning robot control device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, there is provided an embodiment of a cleaning robot control method, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
Fig. 1 is a flowchart of a cleaning robot control method according to an embodiment of the present invention, as shown in fig. 1, the method including the steps of:
step S102, obtaining image information of an obstacle;
step S104, inputting the image information into a recognition model, and outputting the obstacle type and the obstacle attribute corresponding to the image information by the recognition model, wherein the recognition model is obtained by using a plurality of groups of training data through machine learning training, and each group of data in the plurality of groups of training data comprises: the image information and the obstacle type and the obstacle attribute corresponding to the image information, wherein the obstacle attribute comprises the size of the obstacle;
step S106, the cleaning robot is controlled according to the type of the obstacle and the attribute of the obstacle.
Through the steps, the image information of the obstacle can be acquired; inputting image information into a recognition model, outputting the types and attributes of the obstacles corresponding to the image information by the recognition model, wherein the recognition model is obtained by using a plurality of groups of training data through machine learning training, and each group of data in the plurality of groups of training data comprises: the image information and the obstacle type and the obstacle attribute corresponding to the image information, wherein the obstacle attribute comprises the size of the obstacle; the method for controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle achieves the purpose of accurately identifying the type and the attribute of the obstacle in the image information through the identification model, thereby achieving the technical effects of improving the detection efficiency and the detection precision of the target obstacle, further effectively controlling the cleaning robot to work, and further solving the technical problem that the detection capability of the cleaning robot to the target obstacle in the related technology is weak.
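For readers implementing this embodiment, the control cycle of Fig. 1 can be summarised as a short sketch. The sketch below is illustrative only: the Obstacle fields and the capture/recognize/plan/execute callables are hypothetical placeholders standing in for the robot's camera, the trained recognition model, and the motion controller, none of which are prescribed by this description.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Obstacle:
    kind: str      # e.g. "boundary", "solid", "soft"
    size: float    # obstacle attribute: size (units are implementation-defined)
    weight: float  # obstacle attribute: estimated weight

def control_step(capture: Callable[[], bytes],
                 recognize: Callable[[bytes], Obstacle],
                 plan: Callable[[Obstacle], str],
                 execute: Callable[[str], None]) -> None:
    """One control cycle covering steps S102, S104 and S106."""
    image = capture()            # S102: acquire image information of the obstacle
    obstacle = recognize(image)  # S104: recognition model outputs type and attributes
    action = plan(obstacle)      # S106: choose detour / straight / cross over
    execute(action)
```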
The image information of the obstacle may be acquired in various ways. For example, it may be collected by monitoring equipment in the working area of the cleaning robot, or by a camera on the cleaning robot itself. The information collected by such equipment may be pictures, videos, and the like. The acquired image information covers all obstacles around the robot's current position that could affect its operation.
After the recognition model is built, it is trained with training data consisting of image information and the obstacle type and obstacle attribute corresponding to that image information, so that the trained model can output the corresponding obstacle type and attribute for any acquired obstacle image. The recognition model is optimized through an online hard example mining algorithm and a positive and negative sample balance optimization training method, which strengthens its detection of many small targets and improves its feature extraction and classification detection capabilities. Because the recognition model is trained on a large amount of image information, it can determine the detection target from the acquired image information, thereby improving the detection efficiency and precision for target obstacles. The cleaning robot can then plan a clear route according to the output detection result, improving its obstacle-detouring capability.
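As an illustration of how one group of training data described above might be organised, the following sketch pairs an obstacle image with its labelled type and attributes. The field names and the extra weight attribute are assumptions for illustration; this description only requires the image, the obstacle type, and the obstacle attribute including size.

```python
from typing import List, TypedDict

class TrainingSample(TypedDict):
    image_path: str     # image information of the obstacle
    obstacle_kind: str  # labelled obstacle type, e.g. "boundary", "solid", "soft"
    size: float         # obstacle attribute: size
    weight: float       # additional attribute used later for feasibility checks

training_data: List[TrainingSample] = [
    {"image_path": "img_0001.jpg", "obstacle_kind": "solid", "size": 12.0, "weight": 0.3},
    {"image_path": "img_0002.jpg", "obstacle_kind": "soft",  "size": 60.0, "weight": 1.5},
]
```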
After the recognition result is obtained from the recognition model, the cleaning robot is further controlled according to the type of the obstacle and the attribute of the obstacle, so that the cleaning robot can complete the planning of the cleaning route, and the cleaning efficiency is improved.
Besides being applied to a floor-sweeping robot, the technical scheme can also be applied to smart homes, identifying targets in the smart home environment such as people and animals, where the recognition targets may be moving or static objects. By improving target detection efficiency and precision and strengthening the detection of many small targets, the feature extraction and classification detection capabilities of the model are improved, and the smart home can carry out its intelligent operations more accurately based on the target detection results.
Optionally, acquiring the image information of the obstacle includes: shooting a picture through an image acquisition device arranged on the cleaning robot; image information is determined from the photograph.
The cleaning robot is provided with one or more image acquisition devices, such as a camera or video recorder; the devices can move freely while acquiring image information, so that obstacle images can be captured without blind spots. Key features of the obstacles in the captured photos are then extracted to obtain the image information related to the obstacles. It should be noted that, when extracting the key features of an obstacle, the photo is first processed accordingly, for example denoised, to filter out interference factors that are useless for obstacle recognition.
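A minimal preprocessing sketch is given below, assuming OpenCV is available on the robot's controller. The denoising parameters and target resolution are illustrative choices, not values taken from this description.

```python
import cv2
import numpy as np

def photo_to_image_info(photo_path: str) -> np.ndarray:
    """Load a photo from the robot's camera, remove noise that is useless for
    obstacle recognition, and return a normalised image for the model."""
    img = cv2.imread(photo_path)
    if img is None:
        raise FileNotFoundError(photo_path)
    denoised = cv2.fastNlMeansDenoisingColored(img, None, 10, 10, 7, 21)  # filter interference
    return cv2.resize(denoised, (512, 512))  # consistent input size for the recognition model
```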
Optionally, the recognition model is established through an online hard example mining algorithm and a positive and negative sample balance optimization training method.
The network structure used with the online hard example mining algorithm comprises a feature extraction network, a feature fusion module, a region proposal network (RPN), a classification and regression network, and the like; the algorithm mitigates adverse effects caused by factors such as the distance and acquisition angle between the acquisition device and the obstacle. Meanwhile, balancing positive and negative samples during training helps the recognition model match targets of multiple scales and proportions in the cleaning robot's surroundings, improving target detection efficiency and precision. In the positive and negative sample balance optimization training method, the recognition model is trained with positive samples and the adversarial samples corresponding to those positive samples, which effectively improves the recognition capability of the model and reduces its error rate.
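The two training measures mentioned above can be sketched as follows, assuming a PyTorch detector that produces a per-region classification loss and per-region labels. The keep count and the 3:1 negative-to-positive ratio are illustrative assumptions; no concrete values are specified in this description.

```python
import torch

def ohem_loss(per_roi_loss: torch.Tensor, keep: int = 128) -> torch.Tensor:
    """Online hard example mining: back-propagate only the `keep` regions
    with the highest loss (the hard examples)."""
    keep = min(keep, per_roi_loss.numel())
    hard, _ = torch.topk(per_roi_loss, keep)
    return hard.mean()

def balance_pos_neg(labels: torch.Tensor, neg_per_pos: int = 3) -> torch.Tensor:
    """Keep all positive samples and at most `neg_per_pos` negatives per
    positive, so that negatives do not dominate training."""
    pos = labels > 0
    neg_idx = torch.nonzero(~pos).flatten()
    n_keep = min(neg_idx.numel(), max(int(pos.sum()), 1) * neg_per_pos)
    chosen = neg_idx[torch.randperm(neg_idx.numel())[:n_keep]]
    mask = pos.clone()
    mask[chosen] = True
    return mask  # boolean mask selecting the samples used for the loss
```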
Optionally, controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle includes: determining an action for the obstacle according to the type of the obstacle, wherein the action comprises at least one of the following: detouring, going straight, and crossing over; and judging whether the action is feasible according to the attribute of the obstacle, and executing the action if it is feasible.
The cleaning robot carries out the corresponding activity according to the type and the attribute of the obstacle: based on the obstacle type, it selects an action such as detouring, going straight, or crossing over.
Optionally, determining the action for the obstacle according to the type of the obstacle comprises: selecting a detour action when the obstacle is a boundary obstacle; and/or, when the obstacle is a solid obstacle whose hardness exceeds a hardness threshold, judging whether going straight is possible, and if so, selecting to go straight; and/or, when the obstacle is a soft obstacle whose hardness does not exceed the hardness threshold, judging whether crossing over is possible, and if so, selecting to cross over.
The cleaning robot can take targeted actions according to the different types of obstacles, including detouring, going straight, and crossing over. Specifically, when the obstacle is a boundary obstacle, for example a corner, a sofa, or a table, the cleaning robot cannot pass through it and can only choose to detour around it. Such everyday obstacles affect the planning of the cleaning route; they are not cleaning objects and cannot be cleaned away, but the cleaning robot can still recognize them by their features. Further, obstacles can be classified according to whether their hardness exceeds the hardness threshold: those above the threshold are solid obstacles and those at or below it are soft obstacles, and the corresponding action is then executed. If the obstacle is determined to be a solid obstacle whose hardness exceeds the hardness threshold, it is further judged whether the cleaning robot can go straight, and if so, going straight is selected. If the obstacle is determined to be a soft obstacle whose hardness does not exceed the hardness threshold, it is further judged whether the cleaning robot can cross over it, and if so, crossing over is selected. The cleaning robot selects the operation to be performed according to the type of the obstacle, and the selected operations may be performed individually or in combination.
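The type-based branching described above can be summarised in a short dispatch sketch. The string labels are assumptions for illustration; the feasibility checks against the first and second thresholds are sketched later in this description.

```python
def candidate_action(kind: str) -> str:
    """Map the recognised obstacle type to the action to be checked for feasibility."""
    if kind == "boundary":   # corners, sofas, tables: cannot be passed through
        return "detour"
    if kind == "solid":      # hardness above the hardness threshold
        return "straight"    # go straight, subject to the first-threshold check
    if kind == "soft":       # hardness at or below the hardness threshold
        return "cross"       # cross over, subject to the second-threshold check
    return "detour"          # conservative default for unrecognised types
```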
Furthermore, the cleaning robot can plan a clear cleaning route by the technical scheme, and the obstacle bypassing capacity of the cleaning robot is greatly improved.
Optionally, determining whether going straight is possible comprises: judging whether going straight is possible according to the weight and the size of the solid obstacle; and determining that going straight is possible when the weight and the size of the solid obstacle do not exceed a first threshold, wherein the first threshold comprises a first weight threshold and a first size threshold.
When the obstacle is determined to be a solid obstacle, it is necessary to further judge, based on the weight and size of the solid obstacle, whether the cleaning robot can go straight. The judging condition is whether the weight and size of the solid obstacle exceed the first threshold, which comprises a first weight threshold and a first size threshold. If the weight or size of the solid obstacle exceeds the first threshold, the cleaning robot chooses to detour; if neither exceeds the first threshold, the cleaning robot chooses to go straight. It should be noted that the first weight threshold and the first size threshold are determined according to the specific working capacity of the cleaning robot, so the first threshold differs between different cleaning robots. Examples of solid obstacles include tables and chairs, pens dropped on the floor, and the like.
Optionally, determining whether crossing over is possible comprises: judging whether the soft obstacle can be crossed over according to the weight and the size of the soft obstacle; and determining that crossing over is possible when the weight and the size of the soft obstacle do not exceed a second threshold, wherein the second threshold comprises a second weight threshold and a second size threshold.
When the obstacle is determined to be a soft obstacle, it is necessary to further judge, based on the weight and size of the soft obstacle, whether the cleaning robot can cross over it. The judging condition is whether the weight and size of the soft obstacle exceed the second threshold, which comprises a second weight threshold and a second size threshold. If the weight or size of the soft obstacle exceeds the second threshold, the cleaning robot chooses to detour; if neither exceeds the second threshold, the cleaning robot chooses to cross over. It should be noted that the second weight threshold and the second size threshold are determined according to the specific working capacity of the cleaning robot, so the second threshold differs between different cleaning robots. Examples of soft obstacles include carpets, clothing, and the like.
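Putting the two feasibility checks together, a sketch of the final decision is shown below. All numeric threshold values are illustrative assumptions; as noted above, the actual first and second thresholds depend on the working capacity of the specific cleaning robot.

```python
# Illustrative thresholds; real values depend on the robot's working capacity.
FIRST_WEIGHT, FIRST_SIZE = 0.5, 10.0    # solid obstacles: go straight if at or below these
SECOND_WEIGHT, SECOND_SIZE = 2.0, 80.0  # soft obstacles: cross over if at or below these

def final_action(kind: str, weight: float, size: float) -> str:
    """Combine the type-based candidate action with the weight/size feasibility check."""
    if kind == "solid":
        feasible = weight <= FIRST_WEIGHT and size <= FIRST_SIZE
        return "straight" if feasible else "detour"
    if kind == "soft":
        feasible = weight <= SECOND_WEIGHT and size <= SECOND_SIZE
        return "cross" if feasible else "detour"
    return "detour"  # boundary or unrecognised obstacles are always bypassed
```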
Fig. 2 is a schematic structural diagram of a cleaning robot control device according to an embodiment of the present invention. As shown in fig. 2, the cleaning robot control device includes: an acquisition module 22, a recognition module 24, and a control module 26. The cleaning robot control device is described in detail below.
The acquisition module 22 is configured to acquire image information of an obstacle. The recognition module 24 is connected to the acquisition module 22 and is configured to input the image information into a recognition model, with the recognition model outputting the obstacle type and the obstacle attribute corresponding to the image information, wherein the recognition model is obtained through machine learning training with multiple sets of training data, each set comprising image information and the obstacle type and obstacle attribute corresponding to that image information, and the obstacle attribute includes the size of the obstacle. The control module 26 is connected to the recognition module 24 and is configured to control the cleaning robot according to the type of the obstacle and the attribute of the obstacle.
The control device of the cleaning robot can realize the acquisition of the image information of the obstacle; inputting image information into a recognition model, outputting the types and attributes of the obstacles corresponding to the image information by the recognition model, wherein the recognition model is obtained by using a plurality of groups of training data through machine learning training, and each group of data in the plurality of groups of training data comprises: the image information and the obstacle type and the obstacle attribute corresponding to the image information, wherein the obstacle attribute comprises the size of the obstacle; the method for controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle achieves the purpose of accurately identifying the type and the attribute of the obstacle in the image information through the identification model, thereby achieving the technical effects of improving the detection efficiency and the detection precision of the target obstacle, further effectively controlling the cleaning robot to work, and further solving the technical problem that the detection capability of the cleaning robot to the target obstacle in the related technology is weak.
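The three modules of Fig. 2 can be pictured structurally as in the sketch below. The interfaces (method names, the injected model and planner) are assumptions for illustration; this description only specifies the modules and their respective roles.

```python
class AcquisitionModule:
    """Acquires image information of the obstacle (module 22)."""
    def acquire(self) -> bytes:
        raise NotImplementedError  # e.g. read a frame from the robot's camera

class RecognitionModule:
    """Runs the trained recognition model (module 24)."""
    def __init__(self, model):
        self.model = model  # trained on (image, obstacle type, obstacle attribute) groups

    def recognize(self, image: bytes):
        return self.model.predict(image)  # returns obstacle type and attributes

class ControlModule:
    """Controls the robot according to obstacle type and attributes (module 26)."""
    def __init__(self, planner):
        self.planner = planner  # e.g. a function mapping the recognised obstacle to an action

    def control(self, robot, obstacle) -> None:
        robot.execute(self.planner(obstacle))

class CleaningRobotControlDevice:
    def __init__(self, acq: AcquisitionModule, rec: RecognitionModule, ctl: ControlModule):
        self.acq, self.rec, self.ctl = acq, rec, ctl

    def run_once(self, robot) -> None:
        self.ctl.control(robot, self.rec.recognize(self.acq.acquire()))
```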
The image information of the obstacle may be acquired in various ways; for example, it may be collected by monitoring equipment in the working area of the cleaning robot, or by a camera on the cleaning robot itself. The information collected by such equipment may be pictures, videos, and the like. The acquired image information covers all obstacles around the robot's current position that could affect its operation.
After the recognition model is built, it is trained with training data consisting of image information and the obstacle type and obstacle attribute corresponding to that image information, so that the trained model can output the corresponding obstacle type and attribute for any acquired obstacle image. The recognition model is optimized through an online hard example mining algorithm and a positive and negative sample balance optimization training method, which strengthens its detection of many small targets and improves its feature extraction and classification detection capabilities. Because the recognition model is trained on a large amount of image information, it can determine the detection target from the acquired image information, thereby improving the detection efficiency and precision for target obstacles. The cleaning robot can then plan a clear route according to the output detection result, improving its obstacle-detouring capability.
After the recognition result is obtained from the recognition model, the cleaning robot is further controlled according to the type of the obstacle and the attribute of the obstacle, so that the cleaning robot can complete the planning of the cleaning route, and the cleaning efficiency is improved.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium storing program instructions, wherein when the program instructions are executed, the apparatus on which the storage medium is located is controlled to execute the method of any one of the above.
According to another aspect of the embodiments of the present invention, there is also provided a processor, configured to execute a program, where the program executes to perform the method of any one of the above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A cleaning robot control method is characterized by comprising:
acquiring image information of an obstacle;
inputting the image information into a recognition model, and outputting the obstacle type and the obstacle attribute corresponding to the image information by the recognition model, wherein the recognition model is obtained by using multiple sets of training data through machine learning training, and each set of data in the multiple sets of training data comprises: the image information and the obstacle type and the obstacle attribute corresponding to the image information, wherein the obstacle attribute comprises the size of an obstacle;
and controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle.
2. The method of claim 1, wherein obtaining image information of an obstacle comprises:
shooting a picture through an image acquisition device arranged on the cleaning robot;
and determining image information according to the photo.
3. The method of claim 1, wherein the recognition model is established through an online hard example mining algorithm and a positive and negative sample balance optimization training method.
4. The method of claim 1, wherein controlling the cleaning robot according to the type of obstacle and the attribute of the obstacle comprises:
determining an action for the obstacle according to the type of the obstacle, wherein the action comprises at least one of the following: detouring, going straight, and crossing over;
and judging whether the action is feasible according to the attribute of the obstacle, and executing the action if it is feasible.
5. The method of claim 4, wherein determining the action for the obstacle based on the type of obstacle comprises:
selecting a detour action when the obstacle is a boundary obstacle;
and/or, when the obstacle is a solid obstacle whose hardness exceeds a hardness threshold, judging whether going straight is possible, and if so, selecting to go straight;
and/or, when the obstacle is a soft obstacle whose hardness does not exceed the hardness threshold, judging whether crossing over is possible, and if so, selecting to cross over.
6. The method of claim 5, wherein determining whether going straight is possible comprises:
judging whether going straight is possible according to the weight and the size of the solid obstacle;
determining that going straight is possible when the weight and the size of the solid obstacle do not exceed a first threshold;
wherein the first threshold comprises a first weight threshold and a first size threshold.
7. The method of claim 5, wherein determining whether crossing over is possible comprises:
judging whether the soft obstacle can be crossed over according to the weight and the size of the soft obstacle;
determining that crossing over is possible when the weight and the size of the soft obstacle do not exceed a second threshold;
wherein the second threshold comprises a second weight threshold and a second size threshold.
8. A cleaning robot control device, comprising:
the acquisition module is used for acquiring the image information of the obstacle;
the recognition module is used for inputting the image information into a recognition model, and outputting the obstacle type and the obstacle attribute corresponding to the image information by the recognition model, wherein the recognition model is obtained by using multiple groups of training data through machine learning training, and each group of data in the multiple groups of training data comprises: the image information and the obstacle type and the obstacle attribute corresponding to the image information, wherein the obstacle attribute comprises the size of an obstacle;
and the control module is used for controlling the cleaning robot according to the type of the obstacle and the attribute of the obstacle.
9. A storage medium storing program instructions, wherein the program instructions, when executed, control an apparatus in which the storage medium is located to perform the method of any one of claims 1 to 7.
10. A processor, characterized in that the processor is configured to run a program, wherein the program when running performs the method of any of claims 1 to 7.
CN201811115975.3A 2018-09-25 2018-09-25 Cleaning robot control method and device Pending CN110936370A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811115975.3A CN110936370A (en) 2018-09-25 2018-09-25 Cleaning robot control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811115975.3A CN110936370A (en) 2018-09-25 2018-09-25 Cleaning robot control method and device

Publications (1)

Publication Number Publication Date
CN110936370A (en) 2020-03-31

Family

ID=69905054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811115975.3A Pending CN110936370A (en) 2018-09-25 2018-09-25 Cleaning robot control method and device

Country Status (1)

Country Link
CN (1) CN110936370A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102809451A (en) * 2011-05-31 2012-12-05 精工爱普生株式会社 Stress-detecting element, sensor module, and electronic apparatus
CN104503450A (en) * 2014-11-27 2015-04-08 无锡北斗星通信息科技有限公司 Service robot achieving intelligent obstacle crossing
US20160280235A1 (en) * 2015-03-23 2016-09-29 Toyota Jidosha Kabushiki Kaisha Autonomous driving device
CN106909139A (en) * 2015-12-23 2017-06-30 袁祖六 Avoidance robot based on tactile
CN105511478A (en) * 2016-02-23 2016-04-20 百度在线网络技术(北京)有限公司 Robot cleaner, control method applied to same and terminal
CN107456173A (en) * 2016-06-06 2017-12-12 北京小米移动软件有限公司 Barrier crossing method and device
CN106821157A (en) * 2017-04-14 2017-06-13 小狗电器互联网科技(北京)股份有限公司 The cleaning method that a kind of sweeping robot is swept the floor

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11874668B2 (en) 2019-12-27 2024-01-16 Samsung Electronics Co., Ltd. Electronic apparatus and method of controlling thereof
CN111528737A (en) * 2020-05-08 2020-08-14 小狗电器互联网科技(北京)股份有限公司 Control method and device of sweeper
CN111643011A (en) * 2020-05-26 2020-09-11 深圳市杉川机器人有限公司 Cleaning robot control method and device, cleaning robot and storage medium
WO2021248857A1 (en) * 2020-06-08 2021-12-16 特斯联科技集团有限公司 Obstacle attribute discrimination method and system, and intelligent robot
CN111872928A (en) * 2020-06-08 2020-11-03 特斯联科技集团有限公司 Obstacle attribute distinguishing method and system and intelligent robot
CN111872928B (en) * 2020-06-08 2022-04-05 特斯联科技集团有限公司 Obstacle attribute distinguishing method and system and intelligent robot
CN112155486A (en) * 2020-09-30 2021-01-01 王丽敏 Control method and control device of sweeping robot
WO2022077945A1 (en) * 2020-10-14 2022-04-21 北京石头世纪科技股份有限公司 Obstacle recognition information feedback method and apparatus, robot, and storage medium
CN112692844A (en) * 2020-12-15 2021-04-23 大国重器自动化设备(山东)股份有限公司 Control method of artificial intelligent drug nursing robot
CN113106907A (en) * 2021-04-30 2021-07-13 广东美房智高机器人有限公司 Method, system, equipment and medium for robot to identify and process obstacle
CN113106907B (en) * 2021-04-30 2022-07-15 广东美房智高机器人有限公司 Method, system, equipment and medium for robot to identify and process obstacle
CN113647864A (en) * 2021-07-21 2021-11-16 美智纵横科技有限责任公司 Method and device for determining operation of cleaning robot, electronic device and medium
CN114935341A (en) * 2022-07-25 2022-08-23 深圳市景创科技电子股份有限公司 Novel SLAM navigation calculation video identification method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200331)