CN114326716A - Control method of cleaning robot, cleaning robot and storage medium - Google Patents

Control method of cleaning robot, cleaning robot and storage medium Download PDF

Info

Publication number
CN114326716A
Authority
CN
China
Prior art keywords
cleaning robot
tableware
target position
image information
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111515671.8A
Other languages
Chinese (zh)
Inventor
顾震江
吴生宇
白刚
林林庆
夏舸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd filed Critical Uditech Co Ltd
Priority to CN202111515671.8A priority Critical patent/CN114326716A/en
Publication of CN114326716A publication Critical patent/CN114326716A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a control method of a cleaning robot, the cleaning robot, and a storage medium, applied in the technical field of robots. The method includes: when the cleaning robot reaches a target position, acquiring image information corresponding to a space to be cleaned; identifying tableware in the image information, and determining the food remaining degree on the tableware according to the proportion of food in the tableware in the image information; when the food remaining degree is smaller than a preset food remaining threshold, determining from the image information whether a diner is present in the space to be cleaned; and recovering the tableware when a diner is present in the space to be cleaned. By double-checking both the food remaining degree and the presence of a diner, this technical solution solves the problem of mistaken tableware recovery and reduces misoperation of the cleaning robot.

Description

Control method of cleaning robot, cleaning robot and storage medium
Technical Field
The present invention relates to the field of robot technology, and in particular, to a control method of a cleaning robot, a cleaning robot, and a storage medium.
Background
With the improvement of living standards, cleaning robots can be seen everywhere. A cleaning robot is an intelligent machine capable of working semi-autonomously or fully autonomously and is widely used in various industries. Among other tasks, a cleaning robot can recover tableware. At present, cleaning robots simply recover tableware directly, so mistaken recovery of tableware is prone to occur.
Disclosure of Invention
The embodiments of the present application aim to solve the problem of mistaken tableware recovery by providing a control method of a cleaning robot, a cleaning robot, and a storage medium.
The embodiment of the application provides a control method of a cleaning robot, which comprises the following steps:
when the cleaning robot reaches a target position, acquiring image information corresponding to a space to be cleaned;
identifying tableware in the image information, and determining the food remaining degree on the tableware according to the ratio of food in the tableware in the image information;
when the food residue degree is smaller than a preset food residue threshold value, determining whether a diner exists in the space to be cleaned according to the image information;
recovering the tableware when a diner is present in the space to be cleaned.
In one embodiment, the step of determining the food remaining degree on the tableware according to the image information includes:
determining a tableware detection frame according to the image information;
determining a first proportion of the tableware detection frame in the image information, wherein the tableware detection frame comprises tableware and food;
performing semantic segmentation processing on the tableware and the food in the tableware detection frame, and determining a second proportion of the food in the image information;
determining the food remaining on the tableware according to the ratio of the second ratio to the first ratio.
In an embodiment, before the step of acquiring the image information corresponding to the space to be cleaned when the cleaning robot reaches the target position, the method further includes:
the cleaning robot is automatically started based on a preset time length;
acquiring a reflected signal fed back by the detection equipment and current position information of the cleaning robot;
determining an environment map according to the reflection signal, and determining target positions according to the environment map, wherein the environment map comprises at least one target position, and all the target positions are sequenced in the environment map in a preset sequence;
determining a driving path of the cleaning robot according to the current position information and the target position;
and controlling the cleaning robot to move to the target position based on the driving path, and executing the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position.
In an embodiment, before the step of acquiring the image information corresponding to the space to be cleaned when the cleaning robot reaches the target position, the method further includes:
when a payment completion instruction sent by a terminal is received, obtaining a dining table number and determining a target position corresponding to the dining table number;
acquiring current position information of the cleaning robot;
determining a driving path of the cleaning robot according to the current position information and the target position;
and controlling the cleaning robot to move to the target position based on the driving path, and executing the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position.
In one embodiment, the controlling the cleaning robot to move to the target position based on the travel path includes:
detecting whether an obstacle exists or not in the process of controlling the cleaning robot to move based on the traveling path;
when an obstacle exists, determining a local path based on a preset path planning mode;
and controlling the cleaning robot to avoid obstacles based on the local path and enabling the cleaning robot to move to the target position.
In an embodiment, the preset path planning method at least includes one of the following: an artificial potential field method or a dynamic window method.
In one embodiment, the step of recovering the tableware when a diner is present in the space to be cleaned includes:
when a diner exists in the space to be cleaned, determining whether a response signal of the diner is received;
and when the response signal of the diner is received, the tableware is recycled.
In one embodiment, after the step of recovering the tableware when the diner is present in the space to be cleaned, the method further comprises:
when the food remaining degree is greater than the preset food remaining threshold, prohibiting tableware recovery;
or prohibiting tableware recovery when a diner is present in the space to be cleaned but no response signal from the diner is received.
Further, to achieve the above object, the present invention also provides a cleaning robot, comprising: a memory, a processor, and a control program of the cleaning robot stored in the memory and executable on the processor, wherein the control program, when executed by the processor, implements the steps of the control method of the cleaning robot described above.
Further, to achieve the above object, the present invention also provides a storage medium having stored thereon a control program of a cleaning robot, which when executed by a processor, implements the steps of the above-described control method of the cleaning robot.
According to the technical solution of the control method of the cleaning robot, the cleaning robot, and the storage medium, when the cleaning robot reaches the target position, image information corresponding to the space to be cleaned is acquired, and the food remaining degree on the tableware is determined according to the image information. When the food remaining degree is smaller than the preset food remaining threshold, the diner has likely finished dining. It is then further detected, according to the image information, whether a diner is present in the space to be cleaned, and the tableware is recovered when a diner is present. By performing this double detection of the food remaining degree and the diner, the present application solves the problem of mistaken tableware recovery and reduces misoperation of the cleaning robot.
Drawings
FIG. 1 is a schematic diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a control method of a cleaning robot according to a first embodiment of the present invention;
FIG. 3 is a flowchart illustrating a control method of a cleaning robot according to a second embodiment of the present invention;
FIG. 4 is a flowchart illustrating a control method of a cleaning robot according to a third embodiment of the present invention;
FIG. 5 is a flowchart illustrating a fourth exemplary embodiment of a method for controlling a cleaning robot according to the present invention;
FIG. 6 is a schematic overall view of the tableware search and inspection plan of the present application;
FIG. 7 is a schematic flow chart of meal condition detection according to the present invention;
The objects, features, and advantages of the present invention will be further explained with reference to the accompanying drawings, which illustrate some embodiments rather than the entirety of the invention.
Detailed Description
For a better understanding of the above technical solutions, exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As shown in fig. 1, fig. 1 is a schematic structural diagram of the hardware operating environment of the cleaning robot according to an embodiment of the present invention.
As shown in fig. 1, the cleaning robot may include: a processor 1001 (such as a CPU), a memory 1005, a user interface 1003, a network interface 1004, and a communication bus 1002. The communication bus 1002 is used to realize connection and communication between these components. The user interface 1003 may include a display (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include standard wired and wireless interfaces. The network interface 1004 may optionally include standard wired and wireless interfaces (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory); optionally, the memory 1005 may also be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the cleaning robot configuration shown in fig. 1 is not intended to be limiting of the cleaning robot and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1005, as a kind of storage medium, may include an operating system, a network communication module, a user interface module, and a control program of the cleaning robot. The operating system is a program that manages and controls the hardware and software resources of the cleaning robot and supports the operation of the control program of the cleaning robot and of other software or programs.
In the cleaning robot shown in fig. 1, the user interface 1003 is mainly used for connecting a terminal and performing data communication with the terminal; the network interface 1004 is mainly used for connecting to the background server and performing data communication with it; and the processor 1001 may be used to invoke the control program of the cleaning robot stored in the memory 1005.
In the present embodiment, the cleaning robot includes: a memory 1005, a processor 1001 and a control program of a cleaning robot stored on the memory and executable on the processor, wherein:
when the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are performed:
when the cleaning robot reaches a target position, acquiring image information corresponding to a space to be cleaned;
identifying tableware in the image information, and determining the food remaining degree on the tableware according to the ratio of food in the tableware in the image information;
when the food residue degree is smaller than a preset food residue threshold value, determining whether a diner exists in the space to be cleaned according to the image information;
recovering the tableware when a diner is present in the space to be cleaned.
When the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are also performed:
determining a tableware detection frame according to the image information;
determining a first proportion of the tableware detection frame in the image information, wherein the tableware detection frame comprises tableware and food;
performing semantic segmentation processing on the tableware and the food in the tableware detection frame, and determining a second proportion of the food in the image information;
determining the food remaining on the tableware according to the ratio of the second ratio to the first ratio.
When the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are also performed:
the cleaning robot is automatically started based on a preset time length;
acquiring a reflected signal fed back by the detection equipment and current position information of the cleaning robot;
determining an environment map according to the reflection signal, and determining target positions according to the environment map, wherein the environment map comprises at least one target position, and all the target positions are sequenced in the environment map in a preset sequence;
determining a driving path of the cleaning robot according to the current position information and the target position;
and controlling the cleaning robot to move to the target position based on the driving path, and executing the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position.
When the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are also performed:
when a payment completion instruction sent by a terminal is received, obtaining a dining table number and determining a target position corresponding to the dining table number;
acquiring current position information of the cleaning robot;
determining a driving path of the cleaning robot according to the current position information and the target position;
and controlling the cleaning robot to move to the target position based on the driving path, and executing the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position.
When the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are also performed:
detecting whether an obstacle exists or not in the process of controlling the cleaning robot to move based on the traveling path;
when an obstacle exists, determining a local path based on a preset path planning mode;
and controlling the cleaning robot to avoid obstacles based on the local path and enabling the cleaning robot to move to the target position.
When the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are also performed:
the preset path planning mode at least comprises one of the following modes: an artificial potential field method or a dynamic window method.
When the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are also performed:
when a diner exists in the space to be cleaned, determining whether a response signal of the diner is received;
and when the response signal of the diner is received, the tableware is recycled.
When the processor 1001 calls the control program of the cleaning robot stored in the memory 1005, the following operations are also performed:
when the food remaining degree is greater than the preset food remaining threshold, prohibiting tableware recovery;
or prohibiting tableware recovery when a diner is present in the space to be cleaned but no response signal from the diner is received.
The first embodiment:
as shown in fig. 2, in a first embodiment of the present application, a control method of a cleaning robot of the present application includes the steps of:
step S110, when the cleaning robot reaches a target position, acquiring image information corresponding to a space to be cleaned;
step S120, identifying tableware in the image information, and determining the food residue on the tableware according to the ratio of food in the tableware in the image information;
step S130, when the food residual degree is smaller than a preset food residual threshold value, determining whether a diner exists in the space to be cleaned according to the image information;
and step S140, when a diner exists in the space to be cleaned, recovering the tableware.
In this embodiment, the cleaning robot may be applied in restaurants, dining halls, and the like for tableware search and retrieval. A traditional cleaning robot directly recovers tableware as soon as it detects that no diner is present in the space to be cleaned; if the diner has only left the table briefly, the tableware may be recovered by mistake. Therefore, the present application designs a cleaning robot control method that performs a double judgment, detecting both the food remaining degree on the tableware and whether a diner is present, so as to avoid mistaken recovery. In other embodiments, whether to recover the tableware may also be determined by whether a response signal from the diner is received. Through the technical solution of the present application, misoperation of the cleaning robot is reduced.
Referring to fig. 6, fig. 6 is an overall schematic diagram of the tableware search and inspection scheme of the present application. Before the tableware search and inspection, an environment map is established, and a path is planned according to the environment map. While moving to the target position along the planned travel path, the robot performs pedestrian obstacle avoidance. After reaching the target position, it detects the meal condition, and then executes the corresponding search and inspection action.
In this embodiment, the target position is a dining table position and may be calibrated in advance according to the placement of each dining table in the dining area. The cleaning robot travels to the target position along the planned travel path and acquires image information of the space to be cleaned there, the space to be cleaned being a dining table. The cleaning robot is equipped with a high-definition camera device and a cleaning system.
In this embodiment, after the image information corresponding to the space to be cleaned is acquired, the food remaining degree on the tableware can be determined from it. The food remaining degree reflects how much food remains on the plate, i.e., the proportion of the plate occupied by food. Specifically, the image information may be input into a neural network model, which determines the tableware detection box from the image information. A first proportion, that of the tableware detection box in the image information, is then determined; the detection box contains both tableware and food. To determine the food remaining degree, the tableware and the food within the detection box must then be separated: semantic segmentation is performed on them, and a second proportion, that of the food in the image information, is determined. The food remaining degree on the tableware is finally determined from the ratio of the second proportion to the first.
For example, the area of the tableware detection box is taken as the size S_c of the tableware in the image, and semantic segmentation of the located tableware and the meal in it yields the area S_f that the meal occupies in the image. The ratio of meal to tableware, R = S_f / S_c, is then used as the index of the food remaining degree: when R is smaller than a preset threshold T, the meal is considered likely to be finished.
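The R = S_f / S_c computation above can be sketched as follows. The function names, the (x1, y1, x2, y2) box format, and the binary-mask representation of the segmentation result are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def food_residue_ratio(box, food_mask):
    """Estimate R = S_f / S_c: S_c is the area of the tableware detection
    box, S_f is the pixel area of food inside it, taken from a binary
    semantic-segmentation mask (1 = food pixel)."""
    x1, y1, x2, y2 = box                      # tableware detection box
    s_c = (x2 - x1) * (y2 - y1)               # tableware area S_c in the image
    s_f = int(food_mask[y1:y2, x1:x2].sum())  # food pixels S_f inside the box
    return s_f / s_c if s_c > 0 else 0.0

def meal_likely_finished(box, food_mask, threshold=0.1):
    """When R falls below the preset threshold T, treat the meal as
    likely finished (threshold value is an illustrative assumption)."""
    return food_residue_ratio(box, food_mask) < threshold
```

In practice the mask would come from the segmentation network; here any 0/1 array of image shape works.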
Optionally, the tidiness of the food on the tableware can also be determined from the image information, further refining the estimate of the probability that the meal is finished.
In the present embodiment, after the food remaining degree is determined, it needs to be judged: specifically, it is compared with a preset food remaining threshold, and the next operation is performed when the food remaining degree is smaller than that threshold. The preset food remaining threshold is obtained by training on preset food remaining samples. When the food remaining degree is smaller than the threshold, the meal is likely finished. At this point, whether a diner is present in the space to be cleaned is determined from the image information, and the tableware is recovered when a diner is present. Optionally, when the food remaining degree is smaller than the preset threshold, the search may continue: the focal length of the high-definition camera device can be changed to acquire second image information whose camera field of view is wider than that of the first image information, and the second image information is used to determine whether a diner is present in or near the space to be cleaned. The tableware is recovered when a diner is present in or near the space to be cleaned.
Alternatively, referring to fig. 7, fig. 7 is a schematic flow chart of meal condition detection. When a diner is present in the space to be cleaned, it is determined whether a response signal from the diner is received; the response signal may be a voice signal. For example, when the cleaning robot emits the voice prompt "May the tableware be collected?", receiving the diner's reply "OK" indicates that the response signal has been received. When the response signal from the diner is received, the tableware is recovered.
Optionally, tableware recovery is prohibited when the food remaining degree is greater than the preset food remaining threshold, or when a diner is present in the space to be cleaned but no response signal from the diner is received.
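The recovery and prohibition rules of this embodiment can be collected into a single decision function. This is a sketch of the rules as stated; the handling of the case where no diner is found is not spelled out in the text and is conservatively treated here as "do not recover yet" (an assumption):

```python
def should_recover(residue_ratio, threshold, diner_present, diner_consented):
    """Double check before tableware recovery, following the rules of this
    embodiment: prohibit recovery when the food remaining degree exceeds
    the threshold, or when a diner is present but no response signal has
    been received."""
    if residue_ratio >= threshold:   # meal likely unfinished: prohibit recovery
        return False
    if not diner_present:            # assumption: keep searching rather than recover
        return False
    return diner_consented           # recover only on the diner's response signal
```

A usage sweep over the four combinations makes the double check explicit: only (low residue, diner present, consent received) leads to recovery.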
According to the above technical solution, when the cleaning robot reaches the target position, image information corresponding to the space to be cleaned is acquired, and the food remaining degree on the tableware is determined according to the image information. When the food remaining degree is smaller than the preset food remaining threshold, the diner has likely finished dining. Whether a diner is present in the space to be cleaned is then further detected according to the image information, and the tableware is recovered when a diner is present. By performing this double detection of the food remaining degree and the diner, the present application solves the problem of mistaken tableware recovery and reduces misoperation of the cleaning robot.
Second embodiment:
as shown in fig. 3, in a second embodiment of the present application, steps S210 to S250 are performed before step S110 of the first embodiment:
step S210, the cleaning robot is started automatically based on preset time;
step S220, acquiring a reflected signal fed back by the detection equipment and current position information of the cleaning robot;
step S230, determining an environment map according to the reflection signal, and determining target positions according to the environment map, wherein the environment map comprises at least one target position, and the target positions are sequenced in the environment map in a preset sequence;
step S240, determining a driving path of the cleaning robot according to the current position information and the target position;
and step S250, controlling the cleaning robot to move to the target position based on the traveling path.
In this embodiment, the cleaning robot may be started automatically for search and inspection by setting a preset duration. The preset duration can be set according to actual conditions, such as business hours and dining times.
In this embodiment, a radar signal may be emitted by a detection device provided on the cleaning robot, and the reflected signal of the radar signal is acquired and fed back to the cleaning robot's processor. The detection device is a lidar sensor. The current position of the cleaning robot may be determined by matching the point cloud from its lidar sensor against an indoor environment map previously constructed with the lidar. Specifically, after acquiring the reflected signal fed back by the detection device, the cleaning robot determines an environment map according to the reflected signal in combination with a SLAM algorithm, and target positions are determined according to the environment map. A travel path of the cleaning robot can then be determined according to the current position information and the target position, and the robot is controlled to move to the target position along that path. The path that reaches the target position at minimum total cost is planned on the global cost map by the A* global path planning algorithm. The target positions are set on the outer side of each dining table, each as an ordered target point, and the cleaning robot moves to each target point in sequence during global route planning.
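The global planner above is A* over a cost map. A minimal 4-connected grid A* conveys the idea; the occupancy-grid representation and unit step cost are simplifying assumptions standing in for the patent's global cost map:

```python
import heapq

def a_star(grid, start, goal):
    """Minimal A* on a 4-connected occupancy grid (0 = free, 1 = obstacle).
    Returns the lowest-cost path from start to goal as a list of (row, col)
    cells, or None when the goal is unreachable."""
    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]  # (f = g + h, g, node, path)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == 0:
                heapq.heappush(open_set,
                               (g + 1 + h((r, c)), g + 1, (r, c), path + [(r, c)]))
    return None
```

In the patented scheme this search would run once per ordered target point, with cell costs taken from the global cost map rather than a uniform step cost.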
According to the above technical solution, the cleaning robot starts automatically based on the preset duration and moves to the target position along the travel path, so that search and inspection are performed at the target position.
The third embodiment:
as shown in fig. 4, in a third embodiment of the present application, steps S310 to S340 are performed before step S110 of the first embodiment:
step S310, when a payment completion instruction sent by a terminal is received, a dining table number is obtained, and a target position corresponding to the dining table number is determined;
step S320, acquiring current position information of the cleaning robot;
step S330, determining a driving path of the cleaning robot according to the current position information and the target position;
and step S340, controlling the cleaning robot to move to the target position based on the travel path.
In this embodiment, after the guest completes payment at the front desk, the terminal sends a payment completion instruction to the cleaning robot so that it travels to the designated dining table for search and inspection. Specifically, the table number of the paying diner is obtained, and the target position corresponding to that table number is determined. The target position and the current position information of the cleaning robot are acquired, the travel path of the cleaning robot is determined from them, and the cleaning robot is controlled to move to the target position along that path.
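The payment-triggered dispatch reduces to a table-number lookup followed by path planning. A minimal sketch; the table numbers, coordinates, and function names are hypothetical:

```python
# Pre-calibrated target positions on the outer side of each dining table
# (coordinates are illustrative assumptions).
TABLE_POSITIONS = {"A1": (2.0, 3.5), "A2": (4.0, 3.5)}

def on_payment_complete(table_number, current_pos):
    """Handle a payment-completion instruction: resolve the table number to
    its calibrated target position and return the (start, goal) pair to be
    handed to the path planner."""
    target = TABLE_POSITIONS[table_number]
    return current_pos, target
```

The returned pair would feed the same global planner used by the timer-triggered patrol of the second embodiment.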
According to the above technical solution, the cleaning robot starts moving to the target position along the travel path when the payment completion instruction sent by the terminal is received, so that search and inspection are performed at the target position.
The fourth embodiment:
as shown in fig. 5, fig. 5 is a schematic flow chart of a fourth embodiment of the present application. Based on step S250 of the second embodiment and step S340 of the third embodiment, the fourth embodiment of the present application includes the following steps:
step S251 of detecting whether an obstacle exists in a process of controlling the cleaning robot to move based on the travel path;
step S252, when an obstacle exists, determining a local path based on a preset path planning mode;
and step S253, controlling the cleaning robot to avoid obstacles based on the local path and enabling the cleaning robot to move to the target position.
In this embodiment, the cleaning robot detects in real time whether an obstacle is present while moving to the target position along the travel path or while patrolling. Images may be collected by the high-definition camera device mounted on the cleaning robot and analyzed to determine whether an obstacle is in front of it. When an obstacle is present, a local path is planned: during global-path movement, the robot can perform local path planning to realize local obstacle avoidance. Specifically, the local path is determined by a preset path planning method, and the cleaning robot is controlled to avoid the obstacle along the local path until it reaches the target position, where the tableware search is performed. Optionally, the preset path planning method includes at least one of the artificial potential field method and the dynamic window method. The artificial potential field method treats the robot's motion through its surroundings as motion in an abstract artificial force field: the target position exerts an attractive force on the cleaning robot, obstacles exert repulsive forces, and the robot's motion is controlled by solving for the resultant force. The dynamic window method is a commonly used obstacle avoidance planning method that searches the velocity space for a velocity that reaches the target position quickly while avoiding obstacles that could collide with the cleaning robot. It searches directly for the optimal control velocity of the cleaning robot, restricting the search space to velocities reachable under dynamic constraints, so the dynamic characteristics of the cleaning robot are incorporated into the method.
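The artificial potential field method described above can be sketched as a single force computation per control step. The gains k_att, k_rep and the influence distance d0 are illustrative assumptions; the repulsive term follows the standard potential-field form:

```python
import math

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=2.0):
    """One step of the artificial potential field method: the goal exerts an
    attractive force, obstacles within influence distance d0 exert repulsive
    forces, and the robot moves along the resultant (fx, fy)."""
    # Attractive force: proportional to the vector toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Repulsive forces: only obstacles closer than d0 contribute, pushing
    # the robot away along the robot-to-obstacle direction.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d0:
            mag = k_rep * (1.0 / d - 1.0 / d0) / d**2
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy
```

In use, the resultant force is normalized into a velocity command each control cycle; known local-minimum cases (attraction and repulsion cancelling) are the classic weakness of this method and a reason to combine it with the dynamic window method.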
According to the technical scheme of this embodiment, when an obstacle is detected while the cleaning robot moves along the travel path, local path planning and obstacle avoidance are performed so that the cleaning robot can still move smoothly to the target position.
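As a concrete illustration of the artificial potential field method described above — a minimal sketch, not an implementation from the patent, with all gains, influence radii, and coordinates chosen arbitrarily — the following code computes the attractive force toward the target position, the repulsive forces from obstacles, and the resultant that steers the cleaning robot:

```python
import math

def attractive_force(pos, goal, k_att=1.0):
    # The target position pulls the robot: force proportional to the
    # vector from the robot to the goal.
    return (k_att * (goal[0] - pos[0]), k_att * (goal[1] - pos[1]))

def repulsive_force(pos, obstacle, k_rep=1.0, influence=2.0):
    # An obstacle pushes the robot away, but only inside its influence
    # radius; the push grows sharply as the robot gets closer.
    dx, dy = pos[0] - obstacle[0], pos[1] - obstacle[1]
    d = math.hypot(dx, dy)
    if d >= influence or d == 0.0:
        return (0.0, 0.0)
    mag = k_rep * (1.0 / d - 1.0 / influence) / (d * d)
    return (mag * dx / d, mag * dy / d)

def step(pos, goal, obstacles, step_size=0.1):
    # One control step: move a fixed distance along the resultant of
    # the attractive force and all repulsive forces.
    fx, fy = attractive_force(pos, goal)
    for obs in obstacles:
        rx, ry = repulsive_force(pos, obs)
        fx, fy = fx + rx, fy + ry
    norm = math.hypot(fx, fy)
    if norm == 0.0:
        return pos  # local minimum: the resultant force vanishes
    return (pos[0] + step_size * fx / norm,
            pos[1] + step_size * fy / norm)
```

Steering by the resultant force is simple but can trap the robot in local minima where attraction and repulsion cancel, which is one reason the dynamic window method, searching directly over dynamically feasible velocities, is often preferred in practice.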
While a logical order is shown in the flow chart, in some cases the steps shown or described may be performed in an order different from the one shown or described herein.
Based on the same inventive concept, an embodiment of the present application further provides a storage medium storing a control program of the cleaning robot. When executed by a processor, the control program implements the steps of the control method of the cleaning robot described above and achieves the same technical effects; to avoid repetition, the description is not repeated here.
Since the storage medium provided in the embodiments of the present application is used to implement the methods of those embodiments, a person skilled in the art can understand its specific structure and modifications based on the methods described herein, and details are therefore not repeated. Any storage medium used in the methods of the embodiments of the present application falls within the scope of the present application.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, et cetera does not indicate any ordering; these words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A control method of a cleaning robot, applied to a cleaning robot, the control method comprising:
when the cleaning robot reaches a target position, acquiring image information corresponding to a space to be cleaned;
identifying tableware in the image information, and determining the degree of food remaining on the tableware according to the proportion of food in the tableware in the image information;
when the degree of food remaining is smaller than a preset food residue threshold, determining whether a diner exists in the space to be cleaned according to the image information;
recovering the tableware when a diner is present in the space to be cleaned.
2. The method of claim 1, wherein the step of determining the degree of food remaining on the tableware according to the image information comprises:
determining a tableware detection frame according to the image information;
determining a first proportion of the tableware detection frame in the image information, wherein the tableware detection frame comprises tableware and food;
performing semantic segmentation processing on the tableware and the food in the tableware detection frame, and determining a second proportion of the food in the image information;
determining the degree of food remaining on the tableware according to the ratio of the second proportion to the first proportion.
3. The method of claim 1, wherein the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position is preceded by the step of:
automatically starting the cleaning robot based on a preset time length;
acquiring a reflected signal fed back by a detection device and current position information of the cleaning robot;
determining an environment map according to the reflected signal, and determining target positions according to the environment map, wherein the environment map comprises at least one target position, and the target positions are ordered in the environment map in a preset sequence;
determining a driving path of the cleaning robot according to the current position information and the target position;
and controlling the cleaning robot to move to the target position based on the driving path, and executing the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position.
4. The method of claim 1, wherein the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position is preceded by the step of:
when a payment completion instruction sent by a terminal is received, obtaining a dining table number and determining a target position corresponding to the dining table number;
acquiring current position information of the cleaning robot;
determining a driving path of the cleaning robot according to the current position information and the target position;
and controlling the cleaning robot to move to the target position based on the driving path, and executing the step of acquiring image information corresponding to the space to be cleaned when the cleaning robot reaches the target position.
5. The method of claim 3 or 4, wherein the step of controlling the cleaning robot to move to the target position based on the driving path comprises:
detecting whether an obstacle exists in the process of controlling the cleaning robot to move based on the driving path;
when an obstacle exists, determining a local path based on a preset path planning mode;
and controlling the cleaning robot to avoid obstacles based on the local path and enabling the cleaning robot to move to the target position.
6. The method of claim 5, wherein the predetermined path planning method comprises at least one of: an artificial potential field method or a dynamic window method.
7. The method of claim 1, wherein the step of recovering the tableware when a diner is present in the space to be cleaned comprises:
when a diner exists in the space to be cleaned, determining whether a response signal of the diner is received;
and when the response signal of the diner is received, recovering the tableware.
8. The method of claim 1, further comprising, after the step of recovering the tableware when a diner is present in the space to be cleaned:
forbidding recovery of the tableware when the degree of food remaining is greater than the preset food residue threshold;
or forbidding recovery of the tableware when a diner is present in the space to be cleaned and no response signal of the diner is received.
9. A cleaning robot, characterized in that the cleaning robot comprises: a memory, a processor, and a control program of the cleaning robot stored on the memory and executable on the processor, wherein the control program, when executed by the processor, implements the steps of the control method of a cleaning robot according to any one of claims 1-8.
10. A storage medium, characterized in that the storage medium stores a control program of a cleaning robot which, when executed by a processor, implements the steps of the control method of a cleaning robot according to any one of claims 1 to 8.
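Read together, claims 1, 2, 7 and 8 amount to a small decision procedure: compute the degree of food remaining as the ratio of the food's share of the image to the detection frame's share, then recover the tableware only when that degree is below the threshold and any diner present has responded. A minimal sketch of that logic follows; function and parameter names are illustrative, and the no-diner branch is an assumption not spelled out in the claims:

```python
def food_remaining_degree(frame_proportion, food_proportion):
    # Claim 2: the degree of food remaining is the ratio of the food's
    # proportion of the image to the tableware detection frame's proportion.
    if frame_proportion <= 0:
        raise ValueError("detection frame proportion must be positive")
    return food_proportion / frame_proportion

def should_recover(frame_proportion, food_proportion, residue_threshold,
                   diner_present, diner_responded):
    degree = food_remaining_degree(frame_proportion, food_proportion)
    if degree >= residue_threshold:
        return False  # claim 8: too much food remains, forbid recovery
    if diner_present:
        return diner_responded  # claims 7/8: recover only after a response
    return True  # assumption: no diner present, recovery proceeds
```

For example, a dish frame covering half the image with food covering a tenth of the image gives a remaining degree of 0.2; with a threshold of 0.5 and a diner who has responded, recovery proceeds.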
CN202111515671.8A 2021-12-10 2021-12-10 Control method of cleaning robot, cleaning robot and storage medium Pending CN114326716A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111515671.8A CN114326716A (en) 2021-12-10 2021-12-10 Control method of cleaning robot, cleaning robot and storage medium


Publications (1)

Publication Number Publication Date
CN114326716A true CN114326716A (en) 2022-04-12

Family

ID=81050037

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111515671.8A Pending CN114326716A (en) 2021-12-10 2021-12-10 Control method of cleaning robot, cleaning robot and storage medium

Country Status (1)

Country Link
CN (1) CN114326716A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104677481A (en) * 2015-03-13 2015-06-03 广州视源电子科技股份有限公司 Food weight monitoring method and food weight monitoring device
CN109241815A (en) * 2018-06-29 2019-01-18 北京百度网讯科技有限公司 Detection method, device and the robot of user behavior
CN109846303A (en) * 2018-11-30 2019-06-07 广州富港万嘉智能科技有限公司 Service plate surplus automatic testing method, system, electronic equipment and storage medium
CN110765985A (en) * 2019-11-11 2020-02-07 上海秒针网络科技有限公司 Dish replenishment monitoring method and system for cafeteria
CN112070253A (en) * 2020-08-31 2020-12-11 华迪计算机集团有限公司 Intelligent service system based on Internet of things
CN112947106A (en) * 2020-03-06 2021-06-11 智慧式有限公司 Intelligent household kitchen system and control method thereof
JP2021096766A (en) * 2019-12-19 2021-06-24 キヤノンマーケティングジャパン株式会社 Information processing device, information processing system, notification method, and program
KR20210077022A (en) * 2019-12-16 2021-06-25 한국전자기술연구원 Robot apparatus to obtain food customized according to food type and remaining amount of food and method for same
CN113287119A (en) * 2019-09-18 2021-08-20 乐恩吉室株式会社 Restaurant transferring and recycling system in store by utilizing image recognition and restaurant transferring and recycling method by utilizing same



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination