CN112959328B - Robot control method, robot control device, robot, and storage medium - Google Patents

Robot control method, robot control device, robot, and storage medium

Info

Publication number
CN112959328B
CN112959328B CN202110359532.4A CN202110359532A
Authority
CN
China
Prior art keywords
target
robot
image
cleaning
contamination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110359532.4A
Other languages
Chinese (zh)
Other versions
CN112959328A (en)
Inventor
饶向荣
支涛
应甫臣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunji Technology Co Ltd
Original Assignee
Beijing Yunji Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yunji Technology Co Ltd filed Critical Beijing Yunji Technology Co Ltd
Priority to CN202110359532.4A priority Critical patent/CN112959328B/en
Publication of CN112959328A publication Critical patent/CN112959328A/en
Application granted granted Critical
Publication of CN112959328B publication Critical patent/CN112959328B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a robot control method, a robot control device, a robot and a storage medium, wherein the method comprises the following steps: acquiring a target image of a target area on a robot body; determining a target recognition result of the target image based on the target image and a preset image recognition model, wherein the target recognition result comprises the existence of contamination and the absence of contamination; and when the target recognition result is that contamination exists, controlling the robot to move to a target cleaning place so that the target area of the robot can be cleaned. With this scheme, the robot can detect contamination by itself and, after contamination is detected, move on its own to the target cleaning place to be cleaned, so there is no need to check the robot for contamination manually at regular intervals, which greatly reduces labor cost and improves the cleaning efficiency of the robot.

Description

Robot control method, robot control device, robot, and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a robot control method and device, a robot and a storage medium.
Background
With the continuous progress of science and technology, robots are used to replace manual labor in many fields; for example, robots are used in hotels to deliver goods or takeout meals to guests, and in logistics companies to transport cargo.
The robot is provided with a carrying area for placing articles; for example, a cabin is arranged inside the robot, or a carrying part is arranged on the outside of the robot. In the prior art, when a robot conveys special articles such as food with soup, liquid pigment or powdered goods, poor sealing may cause the articles to spill during transport and contaminate the robot, so the robot has to be checked manually at regular intervals to determine whether it needs cleaning, which increases labor cost.
Disclosure of Invention
The embodiment of the invention provides a robot control method and device, a robot and a storage medium.
In a first aspect, an embodiment of the present invention provides a robot control method, including:
acquiring a target image at a target area on a robot body;
determining a target recognition result of the target image based on the target image and a preset image recognition model, wherein the target recognition result comprises the existence of contamination and the absence of contamination;
and when the target recognition result is that the contamination exists, controlling the robot to move to a target cleaning place so as to clean a target area of the robot.
Optionally, the robot is a robot for performing a transportation task, and the acquiring a target image at a target area on a robot body includes:
and acquiring a target image at a target area of the robot after detecting that the current conveying task of the robot is completed.
Optionally, the preset image recognition model is obtained by:
acquiring a sample image set used for model training and label information of each sample image in the sample image set, wherein the label information indicates that contamination exists or does not exist;
constructing an initial image recognition model;
and training the initial image recognition model based on the sample image set and the label information of each sample image to obtain the trained preset image recognition model.
Optionally, the determining a target recognition result of the target image based on the target image and a preset image recognition model includes:
determining an initial recognition result of the target image based on the target image and a preset image recognition model;
determining a contamination degree of the target image when the initial recognition result is that contamination exists;
and when the contamination degree is greater than a preset contamination degree, determining that a target recognition result corresponding to the target image is that contamination exists.
Optionally, there are a plurality of cleaning places, and before controlling the robot to move to the target cleaning place, the method further comprises:
acquiring a target map corresponding to the position of the robot; determining the current position of the robot and the position of each cleaning place in the target map; planning a first moving route from the current position to each cleaning place based on the target map, obtaining a plurality of first moving routes in total, and taking the cleaning place corresponding to the first moving route with the shortest distance as the target cleaning place;
the controlling the robot to move to a target cleaning place includes: and controlling the robot to move to the target cleaning place according to a first moving route with the shortest distance.
Optionally, after the controlling the robot to move to the target cleaning place, the method further comprises:
acquiring a target map corresponding to the position of the robot;
determining a position of the target cleaning place and a position of a task waiting place of the robot in the target map;
planning a second moving route from the target cleaning place to the task waiting place based on the target map;
and after the cleaning treatment of the target area is detected to be completed, controlling the robot to move to the task waiting place according to the second moving route.
In a second aspect, an embodiment of the present invention provides a robot control apparatus, including:
the acquisition module is used for acquiring a target image at a target area on the robot body;
the image recognition module is used for determining a target recognition result of the target image based on the target image and a preset image recognition model, wherein the target recognition result comprises the existence of contamination and the absence of contamination;
and the control module is used for controlling the robot to move to a target cleaning place to perform cleaning treatment on a target area of the robot when the target recognition result is that contamination exists.
Optionally, the robot is a robot for performing a transportation task, and the acquiring module is configured to:
and acquiring a target image at a target area of the robot after detecting that the current conveying task of the robot is completed.
Optionally, the preset image recognition model is obtained by:
acquiring a sample image set used for model training and label information of each sample image in the sample image set, wherein the label information indicates that contamination exists or does not exist;
constructing an initial image recognition model;
and training the initial image recognition model based on the sample image set and the label information of each sample image to obtain the trained preset image recognition model.
Optionally, the image recognition module is configured to:
determining an initial recognition result of the target image based on the target image and a preset image recognition model;
determining a contamination degree of the target image when the initial recognition result is that contamination exists;
and when the contamination degree is greater than a preset contamination degree, determining that a target recognition result corresponding to the target image is that contamination exists.
Optionally, there are a plurality of cleaning places, and the apparatus further comprises:
the first map acquisition module is used for acquiring a target map corresponding to the position of the robot;
a first position determination module for determining a current position of the robot and a position of each cleaning spot in the target map;
the first route planning module is used for planning a first moving route from the current position to each cleaning place based on the target map, obtaining a plurality of first moving routes in total, and taking the cleaning place corresponding to the first moving route with the shortest distance as the target cleaning place;
the control module is used for controlling the robot to move to the target cleaning place according to a first moving route with the shortest distance.
Optionally, the apparatus further comprises:
the second map acquisition module is used for acquiring a target map corresponding to the position of the robot;
a second position determination module for determining a position of the target cleaning place and a position of a task waiting place of the robot in the target map;
the second route planning module is used for planning a second moving route from the target cleaning place to the task waiting place based on the target map;
and the processing module is used for controlling the robot to move to the task waiting place according to the second moving route after the cleaning processing of the target area is detected to be completed.
In a third aspect, an embodiment of the present invention provides a robot, including:
the image acquisition device is arranged on the robot body and used for acquiring a target image at a target area on the robot body;
the image processing device is connected with the image acquisition device and used for determining a target recognition result of the target image based on the target image and a preset image recognition model, and the target recognition result comprises the existence of contamination and the absence of contamination;
and the processor is connected with the image processing device and used for controlling the robot to move to a target cleaning place to perform cleaning treatment on a target area of the robot when the target recognition result shows that the contamination exists.
In a fourth aspect, an embodiment of the present invention provides a robot control apparatus, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the steps of any one of the above methods.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is used to implement the steps of any one of the methods described above when executed by a processor.
The embodiment of the invention has the following beneficial effects:
according to the robot control method provided by the embodiment of the invention, the target image of the target area on the robot body is acquired and input into the preset image recognition model to obtain the target recognition result of the target image, and when the target recognition result is that fouling exists, the robot is controlled to move to the target cleaning place to be cleaned. In this scheme, the robot can detect stained by oneself, and can remove by oneself to the clean place of target and clean after detecting stained, does not need artifical regularly to detect the stained of robot, greatly reduced the cost of labor, improved the clean efficiency of robot.
Drawings
Various additional advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a flowchart of a robot control method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a robot control device according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a robot according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a robot control device according to an embodiment of the present invention.
Detailed Description
In order to better understand the technical solutions of the embodiments of the present invention, these technical solutions are described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific features of the embodiments and examples are detailed descriptions of the technical solutions of the embodiments of the present invention rather than limitations of them, and that the technical features of the embodiments and examples may be combined with each other without conflict.
An embodiment of the present invention provides a robot control method, as shown in fig. 1, which is a flowchart of the robot control method provided in the embodiment of the present invention, and the method includes the following steps:
step S11: acquiring a target image at a target area on a robot body;
step S12: determining a target recognition result of the target image based on the target image and a preset image recognition model, wherein the target recognition result comprises the existence of the contamination and the absence of the contamination;
step S13: and when the target recognition result is that the contamination exists, controlling the robot to move to a target cleaning place so as to perform cleaning treatment on a target area of the robot.
The robot in the embodiment of the invention may be a robot that transports articles in a hotel, a robot that transports articles for a logistics company, or a robot that performs other tasks; as long as the robot is at risk of contamination while performing its tasks, the method provided by the embodiment of the invention enables it to move automatically to a cleaning place to be cleaned.
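Taken together, steps S11 to S13 form a simple capture-recognize-act cycle. The following Python sketch is illustrative only; the robot interface (capture_target_image, move_to) and the recognize helper are hypothetical names that are not defined in this disclosure.

```python
def control_cycle(robot, model, target_cleaning_place):
    # Step S11: acquire a target image of the target area on the robot body
    image = robot.capture_target_image()

    # Step S12: obtain the target recognition result from the preset image recognition model
    result = recognize(model, image)   # returns "contamination exists" or "no contamination"

    # Step S13: if contamination exists, move to the target cleaning place for cleaning
    if result == "contamination exists":
        robot.move_to(target_cleaning_place)
```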
In step S11, the target area may be an area on the robot body that is easily contaminated. For example, if the robot carries articles in a bin provided on the body, the target area may be the area inside the bin. As another example, if the robot carries articles on a carrying platform mounted on the body, the target area may be the area on the platform. The target area may of course also be another area on the robot body; for example, the front area of the robot may be contaminated when the robot is used for paint spraying, in which case the target area may be the front area of the robot.
In order to collect a target image of a target area, an image collecting device may be mounted on the robot body, and a specific mounting position of the image collecting device may be set according to actual needs as long as the image of the target area can be collected.
In step S12, the preset image recognition model may be set according to actual needs, for example, the preset image recognition model is a VGG16 model, a VGG19 model, a ResNet model, or the like. The preset image recognition model can be obtained by the following method: acquiring a sample image set used for model training and label information of each sample image in the sample image set, wherein the label information indicates that fouling exists or does not exist; constructing an initial image recognition model; and training the initial image recognition model based on the sample image set and the label information of each sample image to obtain the trained preset image recognition model.
In a specific implementation process, a supervised model may be selected as the image recognition model; for convenience of description, the VGG16 model is taken as an example.
First, a sample image set is obtained. Each sample image may be an image captured of the robot body and corresponds to label information: when the sample image shows contamination, the corresponding label information indicates that contamination exists, and when the sample image is clean, the corresponding label information indicates that no contamination exists.
Secondly, an initial image recognition model is constructed. Taking the VGG16 model as an example, an initial VGG16 model is constructed whose network structure comprises 13 convolutional layers, 3 fully connected layers and 5 pooling layers. Specifically, when an image is input into the VGG16 model, it passes sequentially through blocks 1-5 and is then output through the three fully connected layers. Block 1 comprises two convolutional layers and one pooling layer, block 2 comprises two convolutional layers and one pooling layer, block 3 comprises three convolutional layers and one pooling layer, block 4 comprises three convolutional layers and one pooling layer, and block 5 comprises three convolutional layers and one pooling layer. The final image recognition result in the embodiment of the invention covers two cases, namely that contamination exists and that contamination does not exist, so the output of the model can be set to two dimensions.
After the initial image recognition model is constructed, it is trained iteratively on the sample image set and the label information of each sample image, with the model parameters adjusted continuously until the number of iterations reaches a preset number or the recognition accuracy of the image recognition model exceeds a threshold (for example, 90% or 95%); the training process is then complete, and the trained preset image recognition model is obtained.
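As an illustrative sketch of the training procedure described above, assuming a PyTorch/torchvision implementation (which this disclosure does not prescribe), a VGG16 backbone can be given a two-dimensional output and trained on the labelled sample set; the dataset layout and the hyperparameters below are assumptions rather than values taken from the disclosure.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Resize every sample to the 224 x 224 input size expected by VGG16
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Assumed folder layout: sample_images/contaminated/... and sample_images/clean/...
train_set = datasets.ImageFolder("sample_images", transform=transform)
loader = DataLoader(train_set, batch_size=16, shuffle=True)

# VGG16 backbone (13 convolutional, 5 pooling, 3 fully connected layers);
# replace the last fully connected layer with a two-dimensional output
model = models.vgg16(weights=None)
model.classifier[6] = nn.Linear(4096, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

model.train()
for epoch in range(10):   # stop when the preset iteration count or accuracy target is reached
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```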
Of course, besides the supervised model, the image recognition model may also be an unsupervised model, which is not limited herein.
When the preset image recognition model is used for image recognition, the image may be preprocessed, for example, denoising and size normalization. Taking size standardization processing as an example, if all sample images used for model training are images with a length of 224 pixels and a width of 224 pixels, after a target image is acquired, it is necessary to determine whether the size of the target image is also 224 × 224, and if not, the target image is processed to make the size of the target image meet the image size required by a preset image recognition model.
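A minimal sketch of this preprocessing step, assuming PIL and torchvision; the file name is illustrative.

```python
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),   # match the 224 x 224 size of the training images
    transforms.ToTensor(),
])

target_image = Image.open("target_area.jpg").convert("RGB")   # illustrative file name
input_tensor = preprocess(target_image).unsqueeze(0)          # add a batch dimension
```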
The acquired target image is input into the preset image recognition model, which outputs a recognition result for the target image. In the embodiment of the invention, the recognition result output by the preset image recognition model may be used directly as the target recognition result, or the recognition result may be processed further to obtain the target recognition result.
In the embodiment of the present invention, the target recognition result may be obtained as follows: determining an initial recognition result of the target image based on the target image and a preset image recognition model; determining a contamination degree of the target image when the initial recognition result is that contamination exists; and when the contamination degree is greater than a preset contamination degree, determining that the target recognition result corresponding to the target image is that contamination exists.
Specifically, when the robot is only slightly contaminated, it is not cleaned, so as to maintain its working efficiency; only when the contamination would affect the robot's tasks is the robot controlled to move to a cleaning place to be cleaned. Therefore, in the embodiment of the present invention, the recognition result of the preset image recognition model may be used as the initial recognition result. When the initial recognition result indicates that the target area of the robot is not contaminated, it is determined that the robot does not need cleaning, that is, the target recognition result is that no contamination exists. When the initial recognition result indicates that the target area of the robot is contaminated, it is further judged whether the contamination degree is greater than a preset contamination degree; the preset contamination degree can be set according to actual needs and is not limited herein. When the contamination degree is greater than the preset contamination degree, it is determined that the target recognition result is that contamination exists.
In particular implementations, the contamination degree may be determined from the target image in a variety of ways. For example, the area of the contaminated region in the target image is calculated, the percentage of that area in the total area of the target image is determined, and the resulting percentage is taken as the contamination degree; the preset contamination degree may be, for example, 30% or 35%. When the calculated contamination degree is greater than the preset contamination degree, cleaning is required; when it is smaller than or equal to the preset contamination degree, cleaning is not required and the target recognition result is that no contamination exists.
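A minimal sketch of this area-ratio computation, assuming a boolean stain mask produced by some upstream segmentation step, which the disclosure does not specify.

```python
import numpy as np

PRESET_CONTAMINATION_DEGREE = 0.30   # e.g. 30%, one of the example thresholds above

def contamination_degree(stain_mask: np.ndarray) -> float:
    """stain_mask: boolean array, True where a pixel belongs to a contaminated region."""
    return float(stain_mask.sum()) / stain_mask.size

def target_recognition_result(stain_mask: np.ndarray) -> str:
    if contamination_degree(stain_mask) > PRESET_CONTAMINATION_DEGREE:
        return "contamination exists"   # cleaning required
    return "no contamination"           # light soiling is ignored
```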
In step S13, when the target recognition result is that contamination exists, the robot needs to be controlled to move to the target cleaning place for cleaning. In the embodiment of the present invention, there may be one or more cleaning places; when there are a plurality of cleaning places, one of them may be selected at random as the target cleaning place, or one of the plurality of cleaning places may be designated in advance as the target cleaning place.
In an embodiment of the present invention, when the cleaning site includes a plurality of cleaning sites, the target cleaning site may be determined by: acquiring a target map corresponding to the position of the robot; determining the current position of the robot and the position of each cleaning place in the target map; and planning a first moving route from the current position to each cleaning place based on the target map, obtaining a plurality of first moving routes in total, and taking the cleaning place corresponding to the first moving route with the shortest distance as the target cleaning place.
Specifically, the target map may be a map of the robot's work place; for example, if the robot is a hotel robot, the target map may be a map of the hotel, or, if the robot works on five floors of the hotel, the target map may be a map of those five floors. The target map may be stored in the local memory of the robot or on a server. When it is stored locally, the robot may read the target map directly from local storage; when it is stored on a server, the robot may send a request to the server to obtain the target map, which is not limited herein.
After the target map is obtained, the current position of the robot and the positions of the plurality of cleaning places are determined on the map, and a first moving route from the current position to each cleaning place is planned according to the passable routes, the positions of obstacles, the positions of elevators and the like on the target map. In order to clean the robot as quickly as possible, the route distances of all the first moving routes are compared in real time, and the first moving route with the shortest distance is taken as the route along which the robot travels to be cleaned, the cleaning place corresponding to that route being the target cleaning place. The robot is then controlled to move to the target cleaning place along the first moving route with the shortest distance so as to be cleaned.
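A minimal sketch of this shortest-route selection; plan_route stands for a hypothetical planner over the target map (for example, A*) and is not defined by the disclosure.

```python
def choose_target_cleaning_place(target_map, current_pos, cleaning_places, plan_route):
    """plan_route(target_map, start, goal) -> (route, length_in_metres) is assumed."""
    best = None
    for place in cleaning_places:
        route, length = plan_route(target_map, current_pos, place)   # first moving route
        if best is None or length < best[2]:
            best = (place, route, length)
    target_place, first_route, _ = best
    return target_place, first_route   # move along first_route to target_place
```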
Of course, other factors may also be considered when determining the target cleaning location, for example, after the first moving route from the current position of the robot to each cleaning location is planned, the number of robots to be cleaned at each cleaning location may also be obtained, and the target cleaning location is determined comprehensively based on the distance of each first moving route and the number of robots waiting to be cleaned at each cleaning location. In a specific implementation process, a time length consumed by the robot to move according to each first moving route can be determined according to a preset moving speed of the robot, and in addition, a cleaning waiting time length corresponding to each cleaning place is determined according to a preset cleaning time length of each robot and the number of robots waiting to be cleaned existing in each cleaning place. For each first moving route, adding the moving time length of the robot corresponding to the moving route and the cleaning waiting time length corresponding to the cleaning place of the moving route to obtain the final time length corresponding to the moving route. And taking the first moving route with the shortest final time length as a final robot moving route, and taking the corresponding cleaning place as a target cleaning place.
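A minimal sketch of this time-based variant; the preset moving speed and per-robot cleaning duration are illustrative values.

```python
def choose_by_total_time(route_lengths, waiting_counts, speed_m_s=0.5, clean_time_s=300.0):
    """route_lengths: {place: route length in metres}; waiting_counts: {place: robots queued}."""
    best_place, best_total = None, float("inf")
    for place, length in route_lengths.items():
        travel = length / speed_m_s                          # moving duration
        wait = waiting_counts.get(place, 0) * clean_time_s   # cleaning waiting duration
        total = travel + wait                                # final duration for this route
        if total < best_total:
            best_place, best_total = place, total
    return best_place
```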
In the embodiment of the invention, after the cleaning treatment is finished, the robot is controlled to return to the task waiting place to wait for task assignment. Taking a hotel meal delivery robot as an example, the meal pick-up point set by the hotel is the task waiting place. In a specific implementation process, the robot can be controlled to return to the task waiting place in the following way: acquiring a target map corresponding to the position of the robot; determining a position of the target cleaning place and a position of a task waiting place of the robot in the target map; planning a second moving route from the target cleaning place to the task waiting place based on the target map; and after the cleaning treatment of the target area is detected to be completed, controlling the robot to move to the task waiting place according to the second moving route.
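A minimal sketch of the return step, reusing the hypothetical plan_route planner from the earlier sketch; the robot methods are assumed names.

```python
def return_to_waiting_place(robot, target_map, cleaning_place, waiting_place, plan_route):
    # Plan the second moving route from the cleaning place back to the task waiting place
    second_route, _ = plan_route(target_map, cleaning_place, waiting_place)
    if robot.cleaning_completed():   # hypothetical check that cleaning of the target area is done
        robot.follow(second_route)   # return and wait for the next task
```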
For the route planning back to the task waiting place, reference may be made to the above description of planning the robot's route to the target cleaning place, and details are not repeated here.
In addition, in order to ensure that the robot completes the delivery of articles within the specified time and that the execution of its current task is not affected, in the embodiment of the invention the contamination check of the target area of the robot is performed after the completion of the robot's current delivery task is detected.
Of course, contamination of the target area may also be detected at preset time intervals; for example, a target image of the target area is captured every minute, and whether the target area is contaminated is determined based on that target image.
In the embodiment of the present invention, the recognition of the target image may be performed by a processor provided inside the robot, or may be implemented by a server. Taking image recognition through a server as an example, after the robot captures a target image of a target area, the target image can be sent to the server through a communication module, and after receiving the target image, the server recognizes the target image based on a preset image recognition model running on the server and feeds back a recognition result to the robot. Of course, the route planning for the robot to move to the target cleaning place or to move to the task waiting place may be performed locally on the robot or may be calculated by the server, which is not limited herein.
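A minimal sketch of the server-side recognition path; the endpoint URL and the response field are hypothetical, since the disclosure does not specify a concrete protocol.

```python
import requests

def recognize_remotely(image_path: str) -> str:
    """Send the captured target image to a recognition service and return its result."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            "http://example-server/recognize",   # hypothetical endpoint
            files={"image": f},
            timeout=10,
        )
    resp.raise_for_status()
    return resp.json()["result"]   # e.g. "contamination exists" or "no contamination" (assumed field)
```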
In summary, with the robot control method provided by the embodiment of the invention, the robot can automatically capture a target image of the target area, recognize the target image, and, once the recognition result indicates that the target area is contaminated, move automatically along a planned route to the target cleaning place for cleaning. Contamination therefore does not need to be detected manually, the time spent waiting for manual inspection is saved, and cleaning efficiency is improved.
Based on the same inventive concept as the robot control method, an embodiment of the present invention provides a robot control apparatus, as shown in fig. 2, the apparatus including:
an acquisition module 21 configured to acquire a target image at a target area on a robot body;
an image recognition module 22, configured to determine a target recognition result of the target image based on the target image and a preset image recognition model, where the target recognition result comprises the existence of contamination and the absence of contamination;
and the control module 23 is configured to control the robot to move to a target cleaning place to perform a cleaning process on a target area of the robot when the target recognition result indicates that contamination exists.
Optionally, the robot is a robot for performing a transportation task, and the obtaining module 21 is configured to:
and acquiring a target image at a target area of the robot after detecting that the current conveying task of the robot is completed.
Optionally, the preset image recognition model is obtained by:
acquiring a sample image set used for model training and label information of each sample image in the sample image set, wherein the label information indicates that contamination exists or does not exist;
constructing an initial image recognition model;
and training the initial image recognition model based on the sample image set and the label information of each sample image to obtain the trained preset image recognition model.
Optionally, an image recognition module 22, configured to:
determining an initial recognition result of the target image based on the target image and a preset image recognition model;
determining a contamination degree of the target image when the initial recognition result is that contamination exists;
and when the contamination degree is greater than a preset contamination degree, determining that a target recognition result corresponding to the target image is that contamination exists.
Optionally, there are a plurality of cleaning places, and the apparatus further comprises:
the first map acquisition module is used for acquiring a target map corresponding to the position of the robot;
a first position determination module for determining a current position of the robot and a position of each cleaning spot in the target map;
the first route planning module is used for planning a first moving route from the current position to each cleaning place based on the target map, obtaining a plurality of first moving routes in total, and taking the cleaning place corresponding to the first moving route with the shortest distance as the target cleaning place;
and the control module 23 is used for controlling the robot to move to the target cleaning place according to the first movement route with the shortest distance.
Optionally, the apparatus further comprises:
the second map acquisition module is used for acquiring a target map corresponding to the position of the robot;
a second position determination module for determining a position of the target cleaning place and a position of a task waiting place of the robot in the target map;
the second route planning module is used for planning a second moving route from the target cleaning place to the task waiting place based on the target map;
and the processing module is used for controlling the robot to move to the task waiting place according to the second moving route after the cleaning processing of the target area is detected to be completed.
With regard to the above-mentioned apparatus, the specific functions of the respective modules have been described in detail in the embodiment of the robot control method provided in the embodiment of the present invention, and will not be elaborated herein.
Based on the same inventive concept as the robot control method, an embodiment of the present invention provides a robot, as shown in fig. 3, including:
the image acquisition device 31 is arranged on the robot body and used for acquiring a target image at a target area on the robot body;
the image processing device 32 is connected with the image acquisition device and is used for determining a target recognition result of the target image based on the target image and a preset image recognition model, and the target recognition result comprises the existence of contamination and the absence of contamination;
a processor 33, connected to the image processing device, for controlling the robot to move to a target cleaning place to perform a cleaning process on a target area of the robot when the target recognition result is that contamination exists.
With regard to the robot described above, the specific functions of the respective devices have been described in detail in the embodiments of the robot control method provided by the embodiments of the present invention, and will not be elaborated herein.
Based on the same inventive concept as the robot control method in the foregoing embodiment, an embodiment of the present invention further provides a robot control apparatus, as shown in fig. 4, including a memory 404, a processor 402, and a computer program stored in the memory 404 and executable on the processor 402, where the processor 402 executes the program to implement the steps of any one of the robot control methods described above.
In fig. 4, a bus architecture is represented by bus 400. Bus 400 may include any number of interconnected buses and bridges and links together various circuits including one or more processors, represented by processor 402, and memory, represented by memory 404. The bus 400 may also link together various other circuits such as peripherals, voltage regulators, and power management circuits, which are well known in the art and therefore are not described further herein. A bus interface 406 provides an interface between the bus 400 and the receiver 401 and transmitter 403. The receiver 401 and the transmitter 403 may be the same element, i.e., a transceiver, providing a means for communicating with various other apparatus over a transmission medium. The processor 402 is responsible for managing the bus 400 and general processing, while the memory 404 may be used for storing data used by the processor 402 in performing operations.
Based on the inventive concept of the robot control method in the foregoing embodiments, embodiments of the present invention further provide a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the robot control method described above.
The present specification has been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (9)

1. A robot control method, characterized in that the method comprises:
acquiring a target image of a target area on a robot body, wherein the robot body is provided with a bin body for loading articles, and the target area is an internal area of the bin body;
determining a target recognition result of the target image based on the target image and a preset image recognition model, wherein the target recognition result comprises the existence of contamination and the absence of contamination;
when the target recognition result shows that the contamination exists, the method for controlling the robot to move to a target cleaning place so as to perform cleaning treatment on a target area of the robot comprises the following steps: acquiring a target map corresponding to the position of the robot; determining the current position of the robot and the position of each cleaning place in the target map; planning a first moving route from the current position to each cleaning place based on the target map, obtaining a plurality of first moving routes in total, and taking the cleaning place corresponding to the first moving route with the shortest distance as the target cleaning place; and controlling the robot to move to the target cleaning place according to the first moving route with the shortest distance.
2. The method of claim 1, wherein the robot is a robot performing a transport task, said acquiring a target image at a target area on a robot body, comprising:
and acquiring a target image at a target area of the robot after detecting that the current conveying task of the robot is completed.
3. The method according to claim 1, characterized in that the preset image recognition model is obtained by:
acquiring a sample image set used for model training and label information of each sample image in the sample image set, wherein the label information indicates that contamination exists or does not exist;
constructing an initial image recognition model;
and training the initial image recognition model based on the sample image set and the label information of each sample image to obtain the trained preset image recognition model.
4. The method according to claim 1, wherein the determining a target recognition result of the target image based on the target image and a preset image recognition model comprises:
determining an initial recognition result of the target image based on the target image and a preset image recognition model;
determining the contamination degree of the target image when the initial recognition result is that contamination exists;
and when the contamination degree is greater than a preset contamination degree, determining that a target recognition result corresponding to the target image is that contamination exists.
5. The method of claim 1, wherein after controlling the robot to move to the target cleaning location, the method further comprises:
acquiring a target map corresponding to the position of the robot;
determining a position of the target cleaning place and a position of a task waiting place of the robot in the target map;
planning a second moving route from the target cleaning place to the task waiting place based on the target map;
and after the cleaning treatment of the target area is detected to be completed, controlling the robot to move to the task waiting place according to the second moving route.
6. A robot control apparatus, characterized in that the apparatus comprises:
the system comprises an acquisition module, a storage module and a display module, wherein the acquisition module is used for acquiring a target image at a target area on a robot body, a bin body for loading articles is arranged on the robot body, and the target area is an internal area of the bin body;
the image recognition module is used for determining a target recognition result of the target image based on the target image and a preset image recognition model, and the target recognition result comprises the existence of contamination and the absence of contamination;
the control module is used for controlling the robot to move to a target cleaning place to perform cleaning treatment on a target area of the robot when the target recognition result shows that the contamination exists, and comprises the following steps: acquiring a target map corresponding to the position of the robot; determining the current position of the robot and the position of each cleaning place in the target map; planning a first moving route from the current position to each cleaning place based on the target map, obtaining a plurality of first moving routes in total, and taking the cleaning place corresponding to the first moving route with the shortest distance as the target cleaning place; and controlling the robot to move to the target cleaning place according to a first moving route with the shortest distance.
7. A robot, characterized in that the robot comprises:
the robot comprises a robot body, an image acquisition device, a storage device and a control device, wherein the image acquisition device is arranged on the robot body and is used for acquiring a target image at a target area on the robot body, a bin body for loading articles is arranged on the robot body, and the target area is an internal area of the bin body;
the image processing device is connected with the image acquisition device and used for determining a target recognition result of the target image based on the target image and a preset image recognition model, and the target recognition result comprises the existence of contamination and the absence of contamination;
the processor is connected with the image processing device and used for controlling the robot to move to a target cleaning place when the target recognition result shows that the contamination exists so as to perform cleaning treatment on a target area of the robot, and the processor comprises: acquiring a target map corresponding to the position of the robot; determining the current position of the robot and the position of each cleaning place in the target map; planning a first moving route from the current position to each cleaning place based on the target map, obtaining a plurality of first moving routes in total, and taking the cleaning place corresponding to the first moving route with the shortest distance as the target cleaning place; and controlling the robot to move to the target cleaning place according to the first moving route with the shortest distance.
8. A robot control apparatus comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of any of claims 1 to 5 when executing the program.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN202110359532.4A 2021-04-02 2021-04-02 Robot control method, robot control device, robot, and storage medium Active CN112959328B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110359532.4A CN112959328B (en) 2021-04-02 2021-04-02 Robot control method, robot control device, robot, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110359532.4A CN112959328B (en) 2021-04-02 2021-04-02 Robot control method, robot control device, robot, and storage medium

Publications (2)

Publication Number Publication Date
CN112959328A CN112959328A (en) 2021-06-15
CN112959328B (en) 2022-11-22

Family

ID=76281065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110359532.4A Active CN112959328B (en) 2021-04-02 2021-04-02 Robot control method, robot control device, robot, and storage medium

Country Status (1)

Country Link
CN (1) CN112959328B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108121330A (en) * 2016-11-26 2018-06-05 沈阳新松机器人自动化股份有限公司 A kind of dispatching method, scheduling system and map path planing method
US10642277B2 (en) * 2017-03-22 2020-05-05 Fetch Robotics, Inc. Cleaning station for mobile robots
CN107597776B (en) * 2017-09-08 2020-11-20 珠海格力电器股份有限公司 Cleaning method and device of cooking equipment and cooking equipment
CN108154098A (en) * 2017-12-20 2018-06-12 歌尔股份有限公司 A kind of target identification method of robot, device and robot
CN109724993A (en) * 2018-12-27 2019-05-07 北京明略软件系统有限公司 Detection method, device and the storage medium of the degree of image recognition apparatus
CN111805551B (en) * 2020-06-01 2021-07-13 杭州电子科技大学 Meal delivery robot and meal delivery method thereof

Also Published As

Publication number Publication date
CN112959328A (en) 2021-06-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 702, 7th floor, NO.67, Beisihuan West Road, Haidian District, Beijing 100089

Applicant after: Beijing Yunji Technology Co.,Ltd.

Address before: Room 702, 7 / F, 67 North Fourth Ring Road West, Haidian District, Beijing

Applicant before: BEIJING YUNJI TECHNOLOGY Co.,Ltd.

GR01 Patent grant