CN114714321A - Robot and distribution method, control device and storage medium thereof


Info

Publication number
CN114714321A
Authority
CN
China
Prior art keywords
robot
image information
controlling
target
article
Prior art date
Legal status
Pending
Application number
CN202210356446.2A
Other languages
Chinese (zh)
Inventor
周畅
Current Assignee
Shenzhen Pudu Technology Co Ltd
Original Assignee
Shenzhen Pudu Technology Co Ltd
Priority date: 2022-04-06
Filing date: 2022-04-06
Publication date: 2022-07-08
Application filed by Shenzhen Pudu Technology Co Ltd
Priority to CN202210356446.2A
Publication of CN114714321A

Classifications

    • G06F 18/22 - Pattern recognition; analysing; matching criteria, e.g. proximity measures
    • B25J 11/00 - Manipulators not otherwise provided for
    • B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 5/007 - Manipulators mounted on wheels or on carriages; mounted on wheels
    • B25J 9/1679 - Programme controls characterised by the tasks executed
    • B25J 9/1682 - Dual arm manipulator; coordination of several manipulators
    • B25J 9/1697 - Vision controlled systems
    • G06Q 50/12 - Hotels or restaurants

Abstract

An embodiment of the present application discloses a robot and a distribution method, a control device and a storage medium thereof, relating to the field of artificial intelligence. The robot comprises an image acquisition device, a carrying device, a movement device, a display device, a memory and a processor, the memory storing executable program code; when executing the executable program code, the processor implements the following steps: receiving a target delivery location and controlling the image acquisition device to photograph an article placed on the carrying device so as to acquire image information, and binding the image information with the target delivery location; controlling the movement device to move the robot to the target delivery location; and after determining that the robot has moved to the target delivery location, controlling the display device to display the image information bound with the target delivery location. In this way, when the robot reaches the target delivery location, the displayed image information allows the article corresponding to the target delivery location to be identified from among a plurality of articles.

Description

Robot and distribution method, control device and storage medium thereof
Technical Field
The embodiment of the application relates to the field of artificial intelligence, in particular to a robot, a distribution method, a control device and a storage medium thereof.
Background
With the development of artificial intelligence in recent years, robots have been applied ever more widely in the service industry. In restaurants, robots are widely used to deliver meals in place of waiters, which reduces labor cost and improves service efficiency.
In the existing delivery workflow, a waiter places the dish to be delivered on the robot, the robot carries the dish to a designated dining table, and the waiter then transfers the dish onto the table. To avoid serving the wrong dish as far as possible, the robot binds the dish to be delivered with dish information pre-stored in the robot before delivery and, after reaching the designated dining table corresponding to that dish, displays the pre-stored dish information, such as a dish picture or a dish name, on a dish display interface. The waiter can use this dish information to determine which of the dishes on the robot belongs to the designated table.
However, if several dishes on the robot are of the same type but differ in preparation method or ingredients, it is difficult for the waiter to determine the dish intended for the designated table from the stored dish information displayed on the dish display interface.
Disclosure of Invention
Embodiments of the present application provide a robot, a distribution method thereof, and a control device, which are capable of identifying an article corresponding to a target distribution location from a plurality of articles by displaying image information bound to the target distribution location.
An embodiment of the present application provides a robot comprising an image acquisition device, a carrying device, a movement device, a display device, a memory and a processor, wherein the memory stores executable program code and the processor, when executing the executable program code, implements the following steps:
receiving a target delivery location and controlling the image acquisition device to photograph an article placed on the carrying device so as to acquire image information;
binding the image information with the target delivery location;
controlling the movement device to move the robot to the target delivery location;
and after it is determined that the robot has moved to the target delivery location, controlling the display device to display the image information bound with the target delivery location.
Further, the processor is further configured to,
controlling the movement device to move the robot to a pick-up point;
and after it is determined that the robot has moved to the pick-up point, controlling the display device to display a plurality of selectable to-be-delivered locations and determining the target delivery location from the plurality of to-be-delivered locations.
Further, the processor is further configured to be communicatively connected with an order system;
the receiving the target delivery location includes: receiving order information sent by the order system, wherein the order information includes the target delivery location and article information corresponding to the target delivery location.
Further, the article information includes an article name and an article picture;
the binding the image information with the target delivery location comprises: and matching the article picture with the image information, and binding the target delivery location with the image information if the matching is successful.
Further, the robot also comprises a detection device connected with the processor; and the controlling the image acquisition device to photograph the article placed on the carrying device to acquire the image information includes:
controlling the detection device to detect whether an article is placed on the carrying device;
and if it is detected that an article is placed on the carrying device, controlling the image acquisition device to photograph the article to acquire the image information.
Further, the detection device is a pressure sensor arranged on the carrying device; or the detection device is an infrared scanner; or the detection device is a camera.
Further, the robot also comprises a voice playing device;
the processor is further configured to be communicatively connected with the voice playing device and, when the robot moves to the target delivery location, to control the voice playing device to play pick-up prompt information and an article introduction voice corresponding to the article, and to control the display device to display an article introduction video corresponding to the article.
The embodiment of the application also provides a robot delivery method, which comprises the following steps:
receiving a target delivery location and controlling the image acquisition device to photograph an article placed on the carrying device so as to acquire image information;
binding the image information with the target delivery location;
controlling the movement device to move the robot to the target delivery location;
and after it is determined that the robot has moved to the target delivery location, controlling the display device to display the image information bound with the target delivery location.
The embodiment of the present application further provides a robot control device, including:
a first execution unit, configured to receive a target delivery location and control the image acquisition device to photograph an article placed on the carrying device so as to acquire image information;
a binding unit, configured to bind the image information with the target delivery location;
a control unit, configured to control the movement device to move the robot to the target delivery location;
and a second execution unit, configured to control the display device to display the image information bound with the target delivery location after it is determined that the robot has moved to the target delivery location.
Embodiments of the present application also provide a computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the method of claim 8.
According to the technical scheme, the embodiment of the application has the following advantages:
in the embodiment of the application, the image information of the object obtained by photographing is bound with the target delivery location, and after the robot reaches the target delivery location, the object corresponding to the target delivery location can be determined from the plurality of objects by displaying the image information bound with the target delivery location.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 is a schematic diagram of a robot communication structure disclosed in an embodiment of the present application;
FIG. 2 is a flow chart of a robot operation disclosed in an embodiment of the present application;
FIG. 3 is a schematic view of a robot structure disclosed in the embodiments of the present application;
fig. 4 is a schematic diagram of a robot system according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the embodiments of the present application, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", and the like indicate orientations or positional relationships based on orientations or positional relationships shown in the drawings, and are only for convenience of description of the embodiments of the present application and for simplicity of description, but do not indicate or imply that the devices or elements referred to must have specific orientations, be configured in specific orientations, and operate, and thus, should not be construed as limiting the embodiments of the present application.
In the description of the embodiments of the present application, it should be noted that, unless otherwise explicitly stated or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly and may denote, for example, a fixed connection, a detachable connection, or an integral connection; a mechanical or an electrical connection; a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. The specific meanings of the above terms in the embodiments of the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
Before the robot performs the distribution of the articles, the robot generally needs to be in communication connection with the background system to receive the relevant information of the distribution sent by the background system, where the relevant information of the distribution may be information of a distribution place or a distributed article, and the details are not limited herein. The robot accurately delivers the article to the delivery location corresponding to the article based on the delivery-related information. The background system may be a delivery system or an ordering system, and is not limited herein.
The following describes the robot's delivery by taking the background system as an ordering system and the corresponding article as a dish; please refer to fig. 1. The ordering system 101 is communicatively connected with the robot 102; the connection may be a Wi-Fi or Bluetooth connection, which is not limited herein. The ordering system 101 can collect ordering information corresponding to all table numbers in a restaurant, where the ordering information includes at least one dish name. Specifically, the ordering system 101 may collect ordering information through the robot 102, store manually entered ordering information, or receive ordering information placed by guests through WeChat, Meituan and the like. The ordering system 101 sends the ordering information to the robot 102 in the order in which the orders were placed; the robot 102 can determine from the ordering information the dish to be delivered and the target table number corresponding to that dish, and deliver the dish to the position of the corresponding target table. It is understood that one ordering system 101 may be communicatively connected with a plurality of robots 102 and send the collected ordering information to each of them. Generally, the ordering information sent by the ordering system 101 to different robots 102 is different, so that multiple robots 102 do not perform the same delivery task.
Generally, several of the dishes delivered by the robot may be of the same type while differing in preparation method or ingredients, and a waiter can hardly determine the dish corresponding to a delivery location from the appearance of the dishes alone. For example, the robot may be carrying several portions of fish-flavored shredded pork, some spicy and some not; when the robot arrives at a delivery location whose order is the non-spicy version, the waiter can hardly tell which portion is not spicy. Therefore, an embodiment of the present application provides a robot that can accurately determine the dish corresponding to a delivery location during delivery. The delivery process of the robot is described below with reference to fig. 2 and fig. 3.
The robot in the embodiment of the present application includes: image acquisition device 301, carrying device 302, motion device 303, display device 304, memory and processor. The memory is in communication with the processor and has pre-programmed executable program code stored therein that is recognizable and modifiable by the processor. The processor may retrieve the program code from the memory and execute the program code. It can be understood that the main executing mechanism in the robot is the processor, the processor is connected to the image capturing device 301, the carrying device 302, the moving device 303 and the display device 304 in the robot respectively, and when the processor executes the program code, the distribution task of the robot can be performed by controlling the relevant devices, and the specific distribution method is as follows:
201. Receiving the target delivery location and controlling the image acquisition device to photograph the article placed on the carrying device so as to acquire image information.
The processor may receive the target delivery location and control the image acquisition device 301 to take a picture of the article placed on the carrying device 302 to obtain image information. When a meal delivery task is executed, the target delivery location may be a table number and the article may be a dish; this example is used throughout the description below.
When a dish is placed on the carrier 302, the processor in the robot determines the target delivery location to which the dish is to be delivered, based on pre-acquired dish information or manually entered instructions. The processor may control the image capture device 301 to photograph the dishes placed on the carrying device 302 to obtain image information corresponding to the dishes. After the robot acquires the image information, the image information can be stored in a memory so as to be called at any time. The image information may be a picture containing the dish information. The image capturing device 301 may be a video camera or a device with a corresponding image capturing function, and is not limited herein. The carrying device 302 is fixedly disposed on the robot and disposed in an image capturing area of the image capturing device 301, and may be a tray or a device with carrying capacity, which is not limited herein.
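By way of illustration only, the following sketch shows one way the photographing step could be realised with an off-the-shelf camera library; the function name, the camera index and the storage directory are assumptions for the example and are not part of the embodiments.

    import os
    import time

    import cv2  # assumes an OpenCV-compatible camera stands in for the image acquisition device 301

    def capture_item_image(camera_index: int = 0, save_dir: str = "/tmp/robot_images") -> str:
        """Photograph the article on the carrying device and return the stored image path."""
        os.makedirs(save_dir, exist_ok=True)
        camera = cv2.VideoCapture(camera_index)
        ok, frame = camera.read()
        camera.release()
        if not ok:
            raise RuntimeError("image acquisition device returned no frame")
        path = os.path.join(save_dir, f"dish_{int(time.time())}.jpg")
        cv2.imwrite(path, frame)  # store the image information so the processor can recall it later
        return path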
It is understood that the processor may determine the table number to which the dish needs to be delivered before the dish is placed on the carrier 302, or may determine the table number corresponding to the dish after the dish is placed on the carrier 302. In an implementation, the attendant may assign a table number to the robot first, and then place the dish on the carrier 302; the attendant may also place the dish on the carrier 302 and assign a table number to the robot.
202. Binding the image information with the target delivery location.
The processor binds the image information with the target delivery location. Specifically, the processor binds the image information of the dish acquired by the image acquisition device 301 with the table number corresponding to the dish. It is understood that the processor may bind the image information of the dish and the corresponding table number with the same identifier, or may store the image information of the dish and the information of the corresponding table number in the same storage block of the memory to bind them; this is not limited herein. It is also understood that the table number bound with the image information may still be modified and edited afterwards.
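For illustration, such a binding under a shared identifier could be kept in a structure like the one sketched below; the class and method names are assumptions for the example, and later editing of a bound table number is shown by update_table().

    import uuid
    from dataclasses import dataclass, field

    @dataclass
    class DeliveryBinding:
        table_number: str          # target delivery location
        image_path: str            # image information of the dish
        binding_id: str = field(default_factory=lambda: uuid.uuid4().hex)  # shared identifier

    class BindingStore:
        """Keeps image information and its table number together, mirroring the shared-identifier idea."""

        def __init__(self) -> None:
            self._by_table: dict[str, DeliveryBinding] = {}

        def bind(self, table_number: str, image_path: str) -> DeliveryBinding:
            record = DeliveryBinding(table_number, image_path)
            self._by_table[table_number] = record
            return record

        def update_table(self, old_table: str, new_table: str) -> None:
            # the table number bound with the image information may still be edited afterwards
            record = self._by_table.pop(old_table)
            record.table_number = new_table
            self._by_table[new_table] = record

        def image_for(self, table_number: str) -> str:
            return self._by_table[table_number].image_path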
203. Controlling the movement device to move the robot to the target delivery location.
The processor drives the robot to move to the target distribution place by controlling the movement device 303. Specifically, after the processor binds the image information with the table number, the processor identifies the specific table position of the table number, and controls the moving device 303 to drive the robot, so that the robot moves to the table position. Specifically, the processor may send a movement instruction to the movement device 303, where the movement instruction includes a table number of a specific table position, and the movement device 303 may drive the robot to move to the table position after receiving the movement instruction. It will be appreciated that the memory in the robot stores the table positions of all delivery table numbers in advance, and when the processor determines the target table number, the specific table position of the target table number can be determined from the table positions of the delivery table numbers. Since the dishes are placed on the carrier 302 of the robot, when the robot moves to the table position, the dishes are delivered to the table position.
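A minimal sketch of step 203 follows, assuming that the table positions of all delivery table numbers are pre-stored as planar coordinates and that the movement device exposes a goto() call; both the coordinates and the interface are illustrative assumptions.

    TABLE_POSITIONS = {          # pre-stored table positions of all delivery table numbers (illustrative values)
        "T01": (2.5, 4.0),
        "T02": (6.0, 1.5),
    }

    class MotionDevice:
        """Stand-in for the movement device 303; goto() is an assumed navigation interface."""

        def goto(self, x: float, y: float) -> bool:
            ...  # platform-specific navigation; returns True once the pose is reached
            return True

    def move_to_table(motion: MotionDevice, table_number: str) -> bool:
        x, y = TABLE_POSITIONS[table_number]   # resolve the specific table position of the table number
        return motion.goto(x, y)               # the return value plays the role of the arrival signal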
204. Controlling the display device to display the image information bound with the target delivery location.
The processor controls the display device 304 to display the image information bound with the target delivery location. Specifically, after the processor determines that the robot has moved to the table position of the table number, it may control the display device 304 to display the image information bound with that table number. Further, after the movement device 303 drives the robot to the table position, it can feed an arrival signal back to the processor, and the processor determines from the arrival signal that the robot has moved to the table position. The processor then retrieves from the memory, according to the identification information of the table number, the image information bound with that table number and sends it to the display device 304, so that the display device 304 displays it. The waiter selects the dish corresponding to the table number from the carrying device 302 according to the image information shown on the display device 304.
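Steps 201 to 204 can be chained as in the following sketch; the camera, bindings, motion and display objects and their method names are assumptions, and any components with equivalent behaviour could be substituted.

    def deliver(camera, bindings, motion, display, table_number: str) -> None:
        image_path = camera.capture()                # 201: photograph the dish on the carrying device
        bindings.bind(table_number, image_path)      # 202: bind the image information with the table number
        arrived = motion.goto_table(table_number)    # 203: move the robot to the table position
        if arrived:                                  # arrival signal fed back by the movement device
            display.show_image(bindings.image_for(table_number))  # 204: display the bound image information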
In the embodiment of the application, the image information of the dishes obtained by photographing is bound with the table number, and after the robot reaches the position of the table number, the dishes corresponding to the table number can be determined from the multiple dishes by displaying the image information bound with the table number. Furthermore, even if a plurality of dishes belonging to the same type of dish exist on the robot, the image information of the dishes can be bound with the table number, and the image information of the dishes is displayed when the dishes arrive at the dining table of the table number, so that the dishes corresponding to the table number are determined.
The delivery process of the robot has been described above; it is further described below in terms of the relevant structures of the robot and their functions, with reference to fig. 3. The image acquisition device 301, the carrying device 302, the movement device 303, the display device 304, the memory and the processor are similar to those described above, and a detailed description thereof is omitted here.
When a robot receives several meal delivery tasks at once, it has several table numbers awaiting delivery, and when it picks up a dish to be delivered it does not yet know to which of those table numbers that dish should go. In this case the delivery table number can be determined by the waiter. Specifically, the processor on the robot may send a motion instruction to the movement device 303, the motion instruction carrying the specific position of a pick-up point, which may be understood as a meal pick-up point. The movement device 303 drives the robot to the meal pick-up point according to the motion instruction. After determining from the information fed back by the movement device 303 that the robot has reached the pick-up point, the processor sends the several table numbers awaiting delivery to the display device 304, and the display device 304 displays them as selectable options. When a dish is to be delivered, the waiter selects on the display device 304 the table number to which that dish should be delivered. It is understood that the display device 304 may be a touch display screen, so that the selection can be determined from the waiter's touch.
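One possible shape of this pick-up-point interaction is sketched below; show_options() and wait_for_touch() stand in for whatever touch-screen interface the display device actually provides and are assumptions of the example.

    def choose_target_table(pending_tables: list[str], display) -> str:
        """Let the waiter pick the target table number from the to-be-delivered table numbers."""
        display.show_options(pending_tables)   # display the selectable table numbers awaiting delivery
        selected = display.wait_for_touch()    # block until the waiter taps one entry
        if selected not in pending_tables:
            raise ValueError("selection is not a pending delivery")
        return selected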
Further, the processor may be communicatively connected with the ordering system to receive the order information it sends. Specifically, when the robot delivers dishes, the background system is the ordering system, and the processor can be communicatively connected with it. When ordering, a customer can connect to the ordering system through WeChat, Meituan or the like and enter an ordering instruction. The ordering system generates the corresponding ordering information from the ordering instruction and sends it to the processor. The ordering information includes the dish information and the table number corresponding to the dish. It is understood that, when several customers order, the ordering information generally includes the several table numbers corresponding to those customers.
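For illustration, the ordering information pushed to the processor might look like the record below; the field names and values are assumptions, the embodiments only require that a table number and the corresponding dish information are included.

    order_message = {
        "table_number": "T07",
        "dishes": [
            {
                "name": "fish-flavored shredded pork",
                "picture_url": "https://example.com/yuxiang.jpg",  # dish picture used for matching
                "spicy": False,
            },
        ],
        "ordered_at": "2022-04-06T12:30:00+08:00",
    }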
Further, the dish information also includes the name of the dish and a picture of the dish. The processor may also bind the image information of the dish to the table number in the following way: the processor matches the dish picture obtained from the ordering system against the image information obtained through the image acquisition device 301. Specifically, the processor may match the two according to their color or image layout, which is not limited herein. If the matching succeeds, the table number is bound with the image information. If the matching fails, a matching error prompt can be issued so that a waiter checks whether the wrong dish has been placed. In this way the processor can further ensure serving accuracy.
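The description leaves the matching criterion open (color or image layout); a color-histogram comparison is one minimal way to realise it, as sketched below. The threshold value is an arbitrary illustrative choice.

    import cv2

    def images_match(menu_picture: str, captured_image: str, threshold: float = 0.8) -> bool:
        """Compare the dish picture from the ordering system with the captured image by color histogram."""
        histograms = []
        for path in (menu_picture, captured_image):
            img = cv2.imread(path)
            if img is None:
                raise FileNotFoundError(path)
            hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
            hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
            cv2.normalize(hist, hist)
            histograms.append(hist)
        score = cv2.compareHist(histograms[0], histograms[1], cv2.HISTCMP_CORREL)
        return score >= threshold  # an unsuccessful match would trigger the matching error prompt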
In another implementation, in order to know whether a dish is present on the carrying device 302, the robot may further be provided with a detection device. The detection device is communicatively connected with the processor and may be a pressure sensor arranged on the carrying device 302, an infrared scanner arranged on the carrying device 302, or a camera, which is not limited herein. The processor may control the detection device to detect in real time whether a dish is placed on the carrying device 302. Specifically, the processor may determine whether a dish is placed on the carrying device 302 from the information fed back by the detection device. When the processor determines that a dish has been placed on the carrying device 302, it controls the image acquisition device 301 to take a picture of the dish to acquire the image information.
Specifically, consider the case in which the detection device is a pressure sensor. The carrying device 302 can be provided with a plurality of dish placing areas of preset size, each for placing one dish. Each dish placing area is provided with a pressure sensor, and each pressure sensor is communicatively connected with the processor. When the pressure value measured by any pressure sensor reaches a preset pressure value, that pressure sensor sends a trigger signal to the processor; the processor determines from the trigger signal the target dish placing area corresponding to that pressure value and determines that a dish has been placed on it. It is understood that the preset pressure value may be, for example, 5 kg or 10 kg, which is not limited herein; the preset pressure value is mainly used to determine that a dish has been placed in a given dish placing area and should not be set too large. The processor then controls the image acquisition device 301 to photograph the dish placed on the target dish placing area and obtains the image information of the dish.
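A sketch of the pressure-sensor trigger follows; the sensor interface and the polling style are assumptions, and the threshold simply reuses the example value mentioned above.

    PRESET_PRESSURE_KG = 5.0  # example threshold from the description (5 kg or 10 kg)

    class PressureSensor:
        """Stand-in for one pressure sensor on a dish placing area."""

        def read_pressure(self) -> float:
            ...  # hardware-specific reading, in kilograms
            return 0.0

    def detect_placed_dishes(sensors: dict[str, PressureSensor]) -> list[str]:
        """Return the dish placing areas whose measured pressure reaches the preset value."""
        triggered = []
        for area_id, sensor in sensors.items():
            if sensor.read_pressure() >= PRESET_PRESSURE_KG:
                triggered.append(area_id)  # a dish has been placed on this target dish placing area
        return triggered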
Further, in order to increase interactivity with customers, the robot may also be provided with a voice playing device 305. The processor may be communicatively connected with the voice playing device 305 and, when the robot moves to the table position of the table number, control the voice playing device 305 to play pick-up prompt information and the dish introduction voice corresponding to the dish. It is understood that the dish introduction voice may be pre-stored in the memory of the robot and bound to the dish, and that the processor can retrieve it from the memory according to the identification information of the dish. Likewise, a dish introduction video corresponding to the dish can be pre-stored in the memory, and when the robot moves to the table position the processor controls the display device 304 to play it. For example, when the robot arrives at the dining table during delivery, its voice playing device 305 announces the name of the dish and, through the voice interaction function, can introduce the creative idea, meaning, preparation method and the like of the dish to the guests, while the display device 304 can show a video or voice introduction of the dish. Meanwhile, the robot can announce by voice which dishes have not yet been served and whether all the dishes of the order have been served.
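The arrival interaction could be driven as sketched below, assuming the prompt, introduction audio and introduction video are pre-stored in memory and keyed by the dish identification information; the media_store layout and the player interfaces are assumptions of the example.

    def announce_arrival(voice_player, display, dish_id: str, media_store: dict) -> None:
        """Play the pick-up prompt and dish introduction, and show the introduction video, on arrival."""
        media = media_store[dish_id]                 # media pre-stored in memory and bound to the dish
        voice_player.play(media["pickup_prompt"])    # meal pick-up prompt information
        voice_player.play(media["intro_audio"])      # creative idea, meaning, preparation method, etc.
        display.show_video(media["intro_video"])     # video introduction on the display device 304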
An embodiment of the present application further provides a robot control device, as shown in fig. 4, comprising the following units (a simplified code rendering of these units follows the list):
a first execution unit 401, configured to receive a target delivery location and control the image acquisition device to photograph an article placed on the carrying device to obtain image information;
a binding unit 402, configured to bind the image information with the target delivery location;
a control unit 403, configured to control the movement device to move the robot to the target delivery location;
and a second execution unit 404, configured to control the display device to display the image information bound with the target delivery location after it is determined that the robot has moved to the target delivery location.
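By way of illustration only, the four units of fig. 4 can be rendered compactly as follows; the injected collaborators and their method names are assumptions and do not limit the control device.

    class RobotControlDevice:
        """Compact, illustrative rendering of the units 401-404."""

        def __init__(self, camera, bindings, motion, display):
            self.camera = camera      # image acquisition device
            self.bindings = bindings  # binding store: table number -> image information
            self.motion = motion      # movement device
            self.display = display    # display device

        def first_execution_unit(self, table_number: str) -> str:
            """Receive the target delivery location and photograph the article on the carrying device."""
            return self.camera.capture()

        def binding_unit(self, table_number: str, image_path: str) -> None:
            self.bindings.bind(table_number, image_path)

        def control_unit(self, table_number: str) -> bool:
            return self.motion.goto_table(table_number)

        def second_execution_unit(self, table_number: str) -> None:
            self.display.show_image(self.bindings.image_for(table_number))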
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.

Claims (10)

1. A robot comprising an image acquisition device, a carrying device, a movement device, a display device, a memory and a processor, the memory storing executable program code, characterized in that the processor is adapted to perform the following steps when executing the executable program code:
receiving a target delivery location and controlling the image acquisition device to photograph an article placed on the carrying device so as to acquire image information;
binding the image information with the target delivery location;
controlling the movement device to move the robot to the target delivery location;
and after it is determined that the robot has moved to the target delivery location, controlling the display device to display the image information bound with the target delivery location.
2. The robot of claim 1, wherein the processor is further configured to,
controlling the movement device to move the robot to a pick-up point;
and after it is determined that the robot has moved to the pick-up point, controlling the display device to display a plurality of selectable to-be-delivered locations and determining the target delivery location from the plurality of to-be-delivered locations.
3. The robot of claim 1, wherein the processor is further configured to communicatively couple with an order system;
the receiving the target delivery site includes: and receiving order information sent by the order system, wherein the order information comprises the target delivery location and the article information corresponding to the target delivery location.
4. The robot of claim 3, wherein the item information includes an item name and an item picture;
the binding the image information and the target delivery location comprises: and matching the article picture with the image information, and if the matching is successful, binding the target delivery location with the image information.
5. The robot of claim 1, further comprising a detection device connected with the processor, wherein the controlling the image acquisition device to photograph the article placed on the carrying device to acquire the image information includes:
controlling the detection device to detect whether an article is placed on the carrying device;
and if it is detected that an article is placed on the carrying device, controlling the image acquisition device to photograph the article to acquire the image information.
6. The robot according to claim 5, wherein the detection device is a pressure sensor arranged on the carrying device; or the detection device is an infrared scanner; or the detection device is a camera.
7. The robot of claim 1, further comprising: a voice playing device;
the processor is further configured to be communicatively connected with the voice playing device and, when the robot moves to the target delivery location, to control the voice playing device to play pick-up prompt information and an article introduction voice corresponding to the article, and to control the display device to display an article introduction video corresponding to the article.
8. A robotic delivery method, comprising:
receiving a target delivery location and controlling the image acquisition device to photograph an article placed on the carrying device so as to acquire image information;
binding the image information with the target delivery location;
controlling the movement device to move the robot to the target delivery location;
and after it is determined that the robot has moved to the target delivery location, controlling the display device to display the image information bound with the target delivery location.
9. A robot control apparatus, comprising:
a first execution unit, configured to receive a target delivery location and control the image acquisition device to photograph an article placed on the carrying device so as to acquire image information;
a binding unit, configured to bind the image information with the target delivery location;
a control unit, configured to control the movement device to move the robot to the target delivery location;
and a second execution unit, configured to control the display device to display the image information bound with the target delivery location after it is determined that the robot has moved to the target delivery location.
10. A computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the method of claim 8.
Application CN202210356446.2A, filed 2022-04-06 (priority date 2022-04-06), published as CN114714321A: Robot and distribution method, control device and storage medium thereof. Status: Pending.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202210356446.2A | 2022-04-06 | 2022-04-06 | Robot and distribution method, control device and storage medium thereof

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202210356446.2A | 2022-04-06 | 2022-04-06 | Robot and distribution method, control device and storage medium thereof

Publications (1)

Publication Number | Publication Date
CN114714321A (en) | 2022-07-08

Family

ID=82242186

Family Applications (1)

Application Number | Status | Priority Date | Filing Date | Title
CN202210356446.2A | Pending | 2022-04-06 | 2022-04-06 | Robot and distribution method, control device and storage medium thereof

Country Status (1)

Country | Link
CN (1) | CN114714321A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105270649A (en) * 2014-07-09 2016-01-27 波音公司 Mobile platform for performing operations inside a fuselage assembly
CN111498138A (en) * 2014-07-09 2020-08-07 波音公司 Mobile platform for performing operations along the exterior of a fuselage assembly
CN105516920A (en) * 2015-11-26 2016-04-20 广州众志物联网科技有限公司 Intelligent restaurant wireless positioning meal delivery system
CN107609997A (en) * 2017-08-23 2018-01-19 谢锋 A kind of dining room food delivery method and system
CN107666572A (en) * 2017-09-29 2018-02-06 北京金山安全软件有限公司 Shooting method, shooting device, electronic equipment and storage medium
JP2020005147A (en) * 2018-06-28 2020-01-09 株式会社リコー Information processing apparatus, movable body, remote control system, information processing method, and program


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination