WO2018210109A1 - Robot control method, apparatus, and robot - Google Patents
Robot control method, apparatus, and robot
- Publication number
- WO2018210109A1 WO2018210109A1 PCT/CN2018/084402 CN2018084402W WO2018210109A1 WO 2018210109 A1 WO2018210109 A1 WO 2018210109A1 CN 2018084402 W CN2018084402 W CN 2018084402W WO 2018210109 A1 WO2018210109 A1 WO 2018210109A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- user
- module
- path
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0016—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/12—Target-seeking control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40519—Motion, trajectory planning
Definitions
- the present disclosure relates to the field of automatic control, and in particular, to a robot control method, apparatus, and robot.
- at present, shopping venues such as supermarkets and malls provide shopping carts for users.
- the user pushes the cart around the venue, selects goods and places them in the cart, which makes shopping convenient.
- the inventors have realized that, because the shopping cart moves only when pushed by the user, the user cannot use a mobile phone, pick up products or perform similar operations while pushing it. In addition, pushing a cart is even more difficult for consumers carrying babies, with limited mobility, or who are very young or elderly.
- the present disclosure provides a solution for causing a shopping cart to move by itself to follow the user.
- a robot control method includes: identifying user features to lock onto a bound user and the bound user's current location; determining a path along which the robot moves to the vicinity of the bound user; and driving the robot along the path to the vicinity of the bound user.
- driving the robot along the path to the vicinity of the bound user comprises: while driving the robot along the path, detecting whether an obstacle appears ahead of the movement; controlling the robot to pause when an obstacle appears ahead; after a predetermined time, detecting whether the obstacle has disappeared; and, if the obstacle has disappeared, driving the robot to continue along the path.
- in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, wherein the first predetermined distance is less than the second predetermined distance.
- while the robot is being driven, barcode information on adjacent shelves is identified; broadcast information corresponding to the barcode information is queried; and the broadcast information is played.
- after the broadcast information corresponding to the barcode information is queried, an identifier of the broadcast information is extracted; whether the identifier matches the bound user's historical data is determined; and the broadcast information is played if the identifier matches the bound user's historical data.
- after voice information of the bound user is collected, the voice information is recognized to obtain a voice command of the bound user; the voice command is analyzed and processed to obtain corresponding response information; if the response information includes a destination address, a path along which the robot moves to the destination address is determined; and the robot is driven along the determined path to lead the bound user to the destination address.
- predetermined guidance information is played while the robot is driven along the determined path.
- if the response information includes reply information, the reply information is played to interact with the bound user.
- when the robot is in an idle state, its state is switched to a working state based on a trigger instruction issued by an operating user; the operating user is taken as the bound user, and user feature recognition is performed on the bound user.
- after the bound user finishes using the robot, the binding relationship between the robot and the bound user is released, and the state of the robot is switched to the idle state.
- after the state of the robot is switched to the idle state, a path along which the robot moves to a predetermined parking place is determined, and the robot is driven along the determined path to the predetermined parking place to achieve automatic homing.
- a robot control apparatus includes: a feature recognition module configured to recognize user features; a locking module configured to lock onto a bound user and the bound user's current location based on the recognized user features; a path determination module configured to determine a path along which the robot moves to the vicinity of the bound user; and a driving module configured to drive the robot along the path to the vicinity of the bound user.
- the control apparatus further includes an obstacle detection module configured to detect, while the driving module drives the robot along the path, whether an obstacle appears ahead of the movement; to instruct the driving module to pause the robot when an obstacle appears ahead; after a predetermined time, to detect whether the obstacle has disappeared; and, if it has disappeared, to instruct the driving module to drive the robot to continue along the path.
- the obstacle detection module is further configured to detect the robot's surroundings if the obstacle still has not disappeared; the path determination module is further configured to re-determine, based on the detection result, a path along which the robot moves to the vicinity of the bound user; and the driving module is further configured to drive the robot along the re-determined path to the vicinity of the bound user.
- in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, wherein the first predetermined distance is less than the second predetermined distance.
- the control apparatus further includes: a barcode information identification module configured to identify barcode information on adjacent shelves while the driving module drives the robot; a broadcast information query module configured to query broadcast information corresponding to the barcode information; and a play module configured to play the broadcast information.
- the control apparatus further includes an information matching module configured to: after the broadcast information query module queries the broadcast information corresponding to the barcode information, extract an identifier of the broadcast information and determine whether the identifier matches the bound user's historical data; and, if the identifier matches the bound user's historical data, instruct the play module to play the broadcast information.
- the control apparatus further includes: a voice recognition module configured to recognize collected voice information of the bound user to obtain a voice command of the bound user; and an instruction processing module configured to analyze and process the voice command to obtain corresponding response information and, if the response information includes a destination address, to instruct the path determination module to determine a path along which the robot moves from its current location to the destination address; the driving module is further configured to drive the robot along the determined path to lead the bound user to the destination address.
- the play module is further configured to play predetermined guidance information while the driving module drives the robot along the determined path.
- the play module is further configured to play the reply information if the response information includes reply information.
- the control apparatus further includes: an interaction module configured to receive instructions issued by an operating user; and a state switching module configured to switch the robot from the idle state to the working state based on a trigger instruction issued by the operating user, take the operating user as the bound user, and instruct the feature recognition module to perform user feature recognition on the bound user.
- the state switching module is further configured to release the binding relationship between the robot and the bound user after the bound user finishes using the robot, and to switch the state of the robot to the idle state.
- the path determination module is further configured to determine, after the state switching module switches the robot to the idle state, a path along which the robot moves to a predetermined parking place; the driving module is further configured to drive the robot along the determined path to the predetermined parking place to achieve automatic homing.
- a robot control apparatus includes:
- a memory for storing instructions
- a processor coupled to the memory, the processor being configured to perform the method as referred to in any of the above embodiments based on the instructions stored in the memory.
- a robot comprising the robot control apparatus according to any of the above embodiments.
- a computer readable storage medium storing computer instructions which, when executed by a processor, implement the method of any of the embodiments described above.
- FIG. 1 is an exemplary flowchart showing a robot control method in accordance with some embodiments of the present disclosure.
- FIG. 2 is an exemplary flowchart showing a robot control method according to further embodiments of the present disclosure.
- FIG. 3 is an exemplary block diagram showing a robot control device in accordance with some embodiments of the present disclosure.
- FIG. 4 is an exemplary block diagram showing a robot control device according to further embodiments of the present disclosure.
- FIG. 5 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
- FIG. 6 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
- FIG. 7 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
- FIG. 8 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
- FIG. 9 is an exemplary block diagram showing a robot in accordance with some embodiments of the present disclosure.
- FIG. 1 is an exemplary flowchart showing a robot control method in accordance with some embodiments of the present disclosure. In some embodiments, the method steps illustrated in Figure 1 are performed by a robotic control device.
- in step 101, the bound user and the bound user's current location are locked by identifying user features.
- in some embodiments, when a user issues a trigger instruction to a robot in the idle state, for example by operating the robot's touch screen, the robot takes that user as the bound user.
- the robot learns the bound user's features through face recognition and gait recognition, so that it can lock onto the bound user by image recognition and determine the bound user's current location through image processing. For example, after the robot control device has learned the bound user's features, it can identify the bound user in a multi-person environment by recognizing faces or walking postures in captured images.
- the robot control device also determines, by image recognition, the bearing of the bound user relative to the robot and the distance of the bound user from the robot's current position; a sketch of this kind of user locking is given below.
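- the disclosure does not prescribe a particular recognition algorithm. A minimal illustrative sketch in Python, assuming a hypothetical extract_face_embeddings helper (not part of the disclosure) that returns an embedding and a bounding box per detected face, might match each camera frame against the embedding stored when the user was bound:

```python
import numpy as np

# Hypothetical helper: returns a list of (embedding, bounding_box) pairs for
# every face detected in the camera frame. Not defined by the disclosure.
def extract_face_embeddings(frame):
    raise NotImplementedError

def locate_bound_user(frame, bound_embedding, threshold=0.6):
    """Return the bounding box of the bound user in the frame, or None.

    Compares each detected face against the embedding captured when the
    user was bound, using cosine similarity.
    """
    best_box, best_score = None, threshold
    for embedding, box in extract_face_embeddings(frame):
        score = np.dot(embedding, bound_embedding) / (
            np.linalg.norm(embedding) * np.linalg.norm(bound_embedding))
        if score > best_score:
            best_box, best_score = box, score
    return best_box
```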
- as another example, after locking onto the bound user by image recognition, the robot determines the bound user's current location by recognizing the shelf or other markers adjacent to the bound user.
- the robot is a smart shopping cart, or other smart mobile device capable of carrying items.
- the robot is capable of switching between two states, a working state and an idle state. For example, a robot in the idle state switches its own state to the working state after being bound to a user.
- in step 102, a path along which the robot moves to the vicinity of the bound user is determined.
- the robot uses map information of the current venue, takes its own current location as the start point and the vicinity of the bound user as the end point, and performs path planning between the start point and the end point. For example, the robot can determine its current location by reading navigation barcodes. Since path planning itself is not the inventive point of the present disclosure, it is not described in detail here; a generic sketch is given below.
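- the disclosure leaves the planner unspecified; a common generic choice is A* search on an occupancy-grid map of the venue. The sketch below is illustrative only, and the grid representation, unit step cost and Manhattan heuristic are assumptions rather than features of the disclosure:

```python
import heapq

def astar(grid, start, goal):
    """Plan a path on an occupancy grid (0 = free, 1 = blocked).

    start/goal are (row, col) tuples; returns a list of cells or None.
    """
    def h(a, b):  # Manhattan distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_set = [(h(start, goal), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:            # already expanded with a better cost
            continue
        came_from[cell] = parent
        if cell == goal:                 # reconstruct the path back to start
            path = [cell]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc), goal), ng, (nr, nc), cell))
    return None
```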
- in step 103, the robot is driven along the determined path to the vicinity of the bound user.
- in the vicinity of the bound user, the distance between the robot and the bound user is kept greater than a first predetermined distance and less than a second predetermined distance, where the first predetermined distance is less than the second predetermined distance. That is, after reaching the vicinity of the bound user, the robot keeps a certain distance from the bound user. The robot therefore stays within easy reach without getting in the way of the user's walking, which improves the shopping experience; a sketch of such a distance-keeping rule follows.
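- one simple way to realize the two-threshold rule is a follow controller with a dead band between the first and second predetermined distances. The sketch below is a hedged illustration; the velocity interface, threshold values and gain are assumptions, not values given by the disclosure:

```python
def follow_command(distance, bearing, d_min=0.8, d_max=2.5, gain=0.7):
    """Return a (linear_velocity, heading) command that keeps the robot
    inside the band (d_min, d_max) around the bound user.

    distance: current distance to the bound user in metres (from perception)
    bearing:  angle to the user in radians, relative to the robot's heading
    """
    if distance >= d_max:          # too far: close the gap
        speed = gain * (distance - d_max)
    elif distance <= d_min:        # too close: back off slightly
        speed = gain * (distance - d_min)
    else:                          # inside the band: hold position
        speed = 0.0
    return speed, bearing
```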
- in the robot control method provided by the above embodiments, by binding the user to the robot, the robot moves by itself to follow at the bound user's side, which frees the user's hands and significantly improves the user experience.
- FIG. 2 is an exemplary flowchart showing a robot control method according to further embodiments of the present disclosure. In some embodiments, the method steps illustrated in Figure 2 are performed by a robotic control device.
- in step 201, the robot is driven along the selected path.
- in step 202, it is detected whether an obstacle appears ahead of the movement.
- for example, video of the area ahead can be captured by a camera and analyzed.
- in step 203, if an obstacle appears ahead, the robot is controlled to pause and wait for a predetermined time.
- in step 204, it is detected whether the obstacle has disappeared. If it has, step 205 is performed; otherwise, step 206 is performed.
- in step 205, the robot is driven to continue along the original path.
- in step 206, the robot's surroundings are detected.
- in step 207, based on the detection result, the path along which the robot moves to the vicinity of the bound user is re-determined.
- in step 208, the robot is driven along the re-determined path to the vicinity of the bound user.
- during movement, if an obstacle such as another person or robot appears ahead, the robot waits briefly. If the obstacle leaves on its own, the robot continues along the original route. If the obstacle persists, the robot re-plans a route from its current position to the target position and moves along the re-planned path. This allows the robot to avoid obstacles on its own while following the bound user; the loop below sketches this behaviour.
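- the pause / wait / re-plan behaviour of steps 201-208 can be summarized in a small control loop. In the sketch below, the robot methods (obstacle_ahead, plan_path, step_along, scan_surroundings, stop) are hypothetical placeholders for the perception and planning components, and the wait time is an assumed value:

```python
import time

WAIT_SECONDS = 3.0  # the "predetermined time"; the value is an assumption

def follow_with_obstacle_handling(robot, goal):
    """Drive the robot toward `goal`, pausing and replanning on obstacles."""
    path = robot.plan_path(robot.current_position(), goal)
    while path:
        if robot.obstacle_ahead():          # steps 202-203: pause and wait
            robot.stop()
            time.sleep(WAIT_SECONDS)
            if robot.obstacle_ahead():      # steps 206-207: still blocked, replan
                robot.scan_surroundings()
                path = robot.plan_path(robot.current_position(), goal)
                continue
        path = robot.step_along(path)       # steps 205/208: keep moving
    # an empty path means the goal (the bound user's vicinity) was reached
```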
- in some embodiments, while driving the robot, the robot control device identifies barcode information on adjacent shelves, queries the broadcast information corresponding to the barcode information, and plays it so that the bound user learns about the goods on those shelves.
- for example, corresponding barcode information is placed on the shelves, and the robot reads the barcodes on adjacent shelves by image recognition as it moves; a sketch of such reading follows.
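- the disclosure does not name a barcode decoder. The sketch below assumes the third-party pyzbar and OpenCV libraries as one possible, not mandated, choice for reading shelf barcodes from a camera frame:

```python
# Illustrative only: pyzbar and OpenCV are assumed third-party choices,
# not components specified by the disclosure.
import cv2
from pyzbar.pyzbar import decode

def read_shelf_barcodes(frame):
    """Return the decoded payloads of all barcodes visible in a camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return [symbol.data.decode("utf-8") for symbol in decode(gray)]
```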
- the robot obtains the broadcast information corresponding to the barcode information and plays it, so that the user learns about advertisements, promotions and other information related to the goods on the shelf, which makes shopping more convenient.
- the robotic control device filters the received broadcast information based on user history data to provide a better user experience.
- the robot control device extracts the identifier of the broadcast information after querying the broadcast information corresponding to the barcode information.
- the robot control device determines whether the identifier matches the bound user's historical data. If the identifier matches, the broadcast information is played; if it does not match, the broadcast information is not played.
- for example, the user's historical data may indicate that the user is interested in electronic products.
- the robot control device checks the identifier of the broadcast information: if the information relates to electronic products, it is played to the bound user; if it relates to, say, a toothbrush discount promotion, it is not played, which improves the user experience. A sketch of this filtering is given below.
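- a minimal sketch of matching a broadcast identifier against the bound user's historical data; representing both as interest-tag sets is an assumption made purely for illustration:

```python
def should_play(broadcast_id_tags, user_history_tags):
    """Play the broadcast only if its identifier overlaps the user's interests.

    broadcast_id_tags: tags extracted from the broadcast identifier,
                       e.g. {"electronics", "promotion"}
    user_history_tags: interest tags derived from the bound user's history
    """
    return bool(set(broadcast_id_tags) & set(user_history_tags))

# usage (illustrative values)
history = {"electronics", "seafood"}
print(should_play({"electronics", "promotion"}, history))   # True  -> play
print(should_play({"toothbrush", "promotion"}, history))    # False -> skip
```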
- in some embodiments, the robot may extract the bound user's facial features and query the user historical data associated with those features, so as to provide personalized services based on that data.
- the robot is also capable of providing navigation services to the bound user.
- the user can issue a voice command to the robot.
- after collecting the bound user's voice information, the robot control device recognizes the voice information to obtain the bound user's voice command.
- the robot control device analyzes and processes the voice command to obtain corresponding response information. If the response information includes a destination address, the robot control device performs path planning to determine a path from the current location to the destination address and drives the robot along that path to lead the bound user there.
- for example, if the bound user says "seafood", the robot control device recognizes the command, determines the location of the seafood area, performs path planning and drives the robot to move, so as to lead the bound user to the seafood area. As another example, if the bound user says "checkout" or "pay", the robot control device drives the robot to lead the bound user to the checkout counter.
- in some embodiments, the robot control device also plays predetermined guidance information while the robot moves along the determined path. For example, when leading the bound user to a destination, the robot plays a guidance message such as "please follow me" to enhance the user experience. A sketch of the voice-to-navigation flow is given below.
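- a minimal sketch of turning a recognized voice command into guided navigation, reusing the obstacle-handling loop sketched earlier; the keyword-to-destination table, the say interface and the coordinates are illustrative assumptions, not data from the disclosure:

```python
# Hypothetical keyword-to-destination table for one store layout.
DESTINATIONS = {
    "seafood": (12, 40),     # grid coordinates of the seafood area
    "checkout": (2, 5),      # grid coordinates of the checkout counter
    "pay": (2, 5),
}

def handle_voice_command(robot, text):
    """Turn a recognized utterance into a guided-navigation action."""
    for keyword, goal in DESTINATIONS.items():
        if keyword in text.lower():
            robot.say("Please follow me")                  # guidance message
            follow_with_obstacle_handling(robot, goal)     # sketched earlier
            return True
    return False    # not a navigation request; fall through to Q&A handling
```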
- in some embodiments, the robot can also interact with the bound user to provide communication services for the bound user.
- for example, the user issues a voice command to the robot to ask about topics of interest.
- the robot control device identifies the voice information to obtain a voice command of the bound user.
- the robot control device analyzes and processes the voice command to obtain corresponding response information. If the response information includes reply information, the reply information is played to interact with the bound user.
- for example, if the user asks about the price of a product, the robot control device interacts with the business server and provides the user with information such as the main manufacturers, features and price of that product, which makes shopping more interesting and convenient. At the same time, it can serve as a channel for manufacturers and brands to promote products and publish promotional information.
- in some embodiments, after the bound user finishes using the robot, the robot control device releases the binding relationship between the robot and the bound user and switches the robot to the idle state.
- for example, after finishing use or checking out, the user can tap the touch screen to perform the corresponding operation and switch the robot to the idle state, so that the robot can continue serving other users.
- after switching the robot to the idle state, the robot control device may also plan a path from the current position to a predetermined parking place and drive the robot along that path to the parking place, thereby achieving automatic homing. A sketch of the overall binding life cycle follows.
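- a minimal sketch of the idle/working life cycle (bind on a touch-screen trigger, unbind on release, then auto-home), again reusing the earlier navigation sketch; the class layout and parking coordinates are assumptions for illustration:

```python
from enum import Enum, auto

class RobotState(Enum):
    IDLE = auto()
    WORKING = auto()

class BindingController:
    """Tracks which user, if any, the robot is currently bound to."""

    def __init__(self, robot, parking_spot=(0, 0)):
        self.robot = robot
        self.parking_spot = parking_spot
        self.state = RobotState.IDLE
        self.bound_user = None

    def on_trigger(self, user):
        """Touch-screen trigger from an operating user while idle."""
        if self.state is RobotState.IDLE:
            self.bound_user = user          # take the operator as the bound user
            self.state = RobotState.WORKING

    def on_release(self):
        """User finished shopping or checked out: unbind and auto-home."""
        if self.state is RobotState.WORKING:
            self.bound_user = None
            self.state = RobotState.IDLE
            follow_with_obstacle_handling(self.robot, self.parking_spot)
```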
- FIG. 3 is an exemplary block diagram showing a robot control device in accordance with some embodiments of the present disclosure.
- the robot control device includes a feature recognition module 31, a lock module 32, a path determination module 33, and a drive module 34.
- the feature recognition module 31 is for identifying user features.
- the feature recognition module 31 performs face recognition or walking gesture recognition to acquire user features.
- the locking module 32 is configured to lock onto the bound user and the bound user's current location according to the identified user features.
- in some embodiments, after the robot control device has learned the bound user's features, it can identify the bound user in a multi-person environment by recognizing faces or walking postures in captured images.
- the robot control device also determines, by image recognition, the bearing of the bound user relative to the robot and the distance of the bound user from the robot's current position.
- as another example, after the robot locks onto the bound user by image recognition, the bound user's current location is determined by recognizing the shelf or other markers adjacent to the bound user.
- the path determination module 33 is configured to determine a path that the robot moves to the vicinity of the bound user.
- the drive module 34 is configured to drive the robot to move along the path to the bound user proximity area.
- in one embodiment, in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, where the first predetermined distance is less than the second predetermined distance. The robot thus stays within easy reach without getting in the way of the user's walking, which improves the shopping experience.
- in the robot control device provided by the above embodiments, by binding the user to the robot, the robot moves by itself to follow at the bound user's side, which frees the user's hands and significantly improves the user experience.
- FIG. 4 is an exemplary block diagram showing a robot control device according to further embodiments of the present disclosure. FIG. 4 differs from FIG. 3 in that, in the embodiment shown in FIG. 4, the robot control device further includes an obstacle detection module 35.
- the obstacle detecting module 35 is configured to detect whether an obstacle is present in front of the movement when the driving module 34 drives the robot to move along the path. If an obstacle appears in front of the movement, the obstacle detecting module 35 instructs the driving module to control the robot to pause the movement, and after the predetermined time elapses, detects whether the obstacle disappears. If the obstacle disappears, the obstacle detection module 35 instructs the drive module 34 to drive the robot to continue moving along the path.
- the obstacle detection module 35 is further configured to detect the surroundings of the robot if the obstacle has not disappeared after a predetermined time has elapsed.
- the path determining module 33 is further configured to re-determine the path that the robot moves from the current location to the bound user neighboring area according to the detection result.
- the drive module 34 is also used to drive the robot to move along the redefined path to the bound user proximity area.
- in the above embodiments, when an obstacle appears ahead of the movement, the robot can make a detour on its own according to the current environment.
- FIG. 5 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
- the robot control apparatus further includes a bar code information recognition module 36, a broadcast information inquiry module 37, and a play module 38.
- the barcode information identification module 36 is configured to identify barcode information on adjacent shelves during the movement of the drive module to drive the robot.
- the broadcast information query module 37 is configured to query broadcast information corresponding to the barcode information.
- the play module 38 is configured to play the broadcast information so that the bound user learns about the goods on adjacent shelves. This makes it easy for the user to learn about advertisements, promotions and other information for those goods.
- the robot control device further includes an information matching module 39.
- the information matching module 39 is configured to extract the identifier of the broadcast information after the broadcast information query module 37 queries the broadcast information corresponding to the barcode information, and determine whether the identifier matches the historical data of the bound user. If the identification matches the historical data of the binding user, the information matching module 39 instructs the playing module 38 to play the broadcast information.
- in other words, the robot control device plays only information the user is interested in, based on the user's historical data. For example, if the historical data shows that the user is interested in electronic products, only information about electronic products is played, and toothbrush promotion advertisements are not played to that user, which improves the user experience.
- for example, the corresponding user historical data can be determined by recognizing the user's facial features, so that the user can obtain personalized service simply by having their face scanned.
- FIG. 6 is an exemplary block diagram showing a robot control device according to some embodiments of the present disclosure. The difference between FIG. 6 and FIG. 5 is that, in the embodiment shown in FIG. 6, the robot control apparatus further includes a voice recognition module 310 and an instruction processing module 311.
- the voice recognition module 310 is configured to identify the voice information after acquiring the voice information of the binding user, to obtain a voice instruction of the binding user.
- the instruction processing module 311 is configured to analyze and process the voice command to obtain corresponding response information and, if the response information includes a destination address, to instruct the path determination module 33 to determine the path along which the robot moves to the destination address.
- the drive module 34 is also used to drive the robot to move along the determined path to lead the binding user to the destination address.
- the user can obtain a navigation service by issuing a voice command. For example, if the binding user says "seafood", the robot control device determines the location of the seafood area by identification, and then performs path planning and drives the robot to move, so as to lead the binding user to the seafood area.
- the play module 38 is further configured to play predetermined guidance information while the drive module 34 drives the robot along the determined path. For example, during the leading process, a guidance message such as "Please follow me" is played to enhance the user experience.
- the play module 38 is further configured to play the reply information, when the response information includes reply information, so as to interact with the bound user. For example, if the user asks about the price of a product, the robot control device interacts with the business server and provides the user with information such as the main manufacturers, features and price of that product, which makes shopping more interesting and convenient. At the same time, it can serve as a channel for manufacturers and brands to promote products and publish promotional information.
- FIG. 7 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure. FIG. 7 differs from FIG. 6 in that, in the embodiment shown in FIG. 7, the robot control apparatus further includes an interaction module 312 and a state switching module 313.
- the interaction module 312 is configured to receive an instruction issued by an operation user.
- the interaction module 312 is a touch screen, or other interactive device capable of receiving user instructions.
- the state switching module 313 is configured to switch the robot to the working state, based on the trigger instruction issued by the operating user while the robot is in the idle state, to take the operating user as the bound user, and to instruct the feature recognition module 31 to perform user feature recognition on the bound user.
- the state switching module 313 is further configured to release the binding relationship between the robot and the bound user after the bound user finishes using the robot, and to switch the robot to the idle state so that it can continue serving other users.
- the path determination module 33 is further configured to determine a path along which the robot moves to a predetermined parking place after the state switching module switches the robot to the idle state.
- the drive module 34 is further configured to drive the robot along the determined path to the predetermined parking place to achieve automatic homing.
- FIG. 8 is an exemplary block diagram showing a robot control device according to some embodiments of the present disclosure.
- the robot control device includes a memory 801 and a processor 802.
- memory 801 is used to store instructions.
- processor 802 is coupled to memory 801 and is configured to perform, based on the instructions stored in the memory, the method involved in any of the embodiments of FIGS. 1-2.
- the robot control apparatus further includes a communication interface 803 for performing information interaction with other devices.
- the apparatus further includes a bus 804, and the processor 802, the communication interface 803, and the memory 801 complete communication with each other via the bus 804.
- the memory 801 may include high-speed RAM (Random-Access Memory) and may also include non-volatile memory, such as at least one disk memory. Memory 801 can also be a memory array. The memory 801 may also be partitioned into blocks, and the blocks may be combined into virtual volumes according to certain rules.
- the processor 802 can be a central processing unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present disclosure.
- FIG. 9 is an exemplary block diagram showing a robot in accordance with some embodiments of the present disclosure.
- the robot 91 includes a robot control device 92.
- the robot control device 92 is the robot control device according to any of the embodiments of FIGS. 3 to 8.
- in the robot provided by the above embodiments, the robot moves by itself to follow at the bound user's side, which frees the user's hands and significantly improves the user experience.
- the functional unit modules described in the above embodiments may be implemented, for performing the functions described in the present disclosure, as a general purpose processor, a Programmable Logic Controller (PLC), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or any suitable combination thereof.
- the present disclosure also provides a computer readable storage medium, wherein the computer readable storage medium stores computer instructions that, when executed by a processor, implement the method of any of the embodiments of FIG. 1 or FIG. 2.
- the present disclosure may be provided as a method, apparatus, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware aspects.
- the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer usable program code.
- a robot such as a smart shopping cart can follow the bound user by itself, thereby freeing the user's hands and making it convenient for the user to pick up goods, use a mobile phone, and so on. The advantage is even more obvious for users carrying babies or with limited mobility.
- embodiments of the present disclosure can be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer usable program code.
- the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
- these computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Abstract
A robot control method and apparatus, and a robot, relating to the field of automatic control. A robot control device identifies user features to lock onto a bound user and the bound user's current location, determines a path along which the robot moves to the vicinity of the bound user, and drives the robot along the determined path to the vicinity of the bound user. By binding the user to the robot, the robot moves by itself to follow at the bound user's side, which frees the user's hands and significantly improves the user experience.
Description
This application is based on and claims priority to CN application No. 201710341099.5, filed on May 16, 2017, the disclosure of which is incorporated into this application in its entirety.
The present disclosure relates to the field of automatic control, and in particular to a robot control method and apparatus, and a robot.
At present, shopping venues such as supermarkets and malls provide shopping carts for users. The user pushes the cart around the venue, selects goods and places the selected goods in the cart, which makes shopping convenient.
Summary of the Invention
The inventors have realized that, because the shopping cart moves only when pushed by the user, the user cannot use a mobile phone, pick up goods or perform similar operations while pushing the cart. In addition, pushing a cart is difficult for consumers carrying babies, with limited mobility, or who are very young or elderly.
To this end, the present disclosure provides a solution in which the shopping cart moves by itself to follow at the user's side.
According to one aspect of one or more embodiments of the present disclosure, a robot control method is provided, including: identifying user features to lock onto a bound user and the bound user's current location; determining a path along which the robot moves to the vicinity of the bound user; and driving the robot along the path to the vicinity of the bound user.
Optionally, driving the robot along the path to the vicinity of the bound user includes: while driving the robot along the path, detecting whether an obstacle appears ahead of the movement; controlling the robot to pause when an obstacle appears ahead; after a predetermined time, detecting whether the obstacle has disappeared; and, if the obstacle has disappeared, driving the robot to continue along the path.
Optionally, if the obstacle still has not disappeared, the robot's surroundings are detected; based on the detection result, the path along which the robot moves to the vicinity of the bound user is re-determined; and the robot is driven along the re-determined path to the vicinity of the bound user.
Optionally, in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, where the first predetermined distance is less than the second predetermined distance.
Optionally, while the robot is being driven, barcode information on adjacent shelves is identified; broadcast information corresponding to the barcode information is queried; and the broadcast information is played.
Optionally, after the broadcast information corresponding to the barcode information is queried, an identifier of the broadcast information is extracted; whether the identifier matches the bound user's historical data is determined; and the broadcast information is played if the identifier matches the bound user's historical data.
Optionally, after the bound user's voice information is collected, the voice information is recognized to obtain the bound user's voice command; the voice command is analyzed and processed to obtain corresponding response information; if the response information includes a destination address, a path along which the robot moves to the destination address is determined; and the robot is driven along the determined path to lead the bound user to the destination address.
Optionally, predetermined guidance information is played while the robot is driven along the determined path.
Optionally, if the response information includes reply information, the reply information is played to interact with the bound user.
Optionally, when the robot is in an idle state, its state is switched to a working state based on a trigger instruction issued by an operating user; the operating user is taken as the bound user, and user feature recognition is performed on the bound user.
Optionally, after the bound user finishes using the robot, the binding relationship between the robot and the bound user is released, and the robot's state is switched to the idle state.
Optionally, after the robot's state is switched to the idle state, a path along which the robot moves to a predetermined parking place is determined, and the robot is driven along the determined path to the predetermined parking place to achieve automatic homing.
According to another aspect of one or more embodiments of the present disclosure, a robot control apparatus is provided, including: a feature recognition module configured to recognize user features; a locking module configured to lock onto a bound user and the bound user's current location according to the recognized user features; a path determination module configured to determine a path along which the robot moves to the vicinity of the bound user; and a driving module configured to drive the robot along the path to the vicinity of the bound user.
Optionally, the control apparatus further includes an obstacle detection module configured to detect, while the driving module drives the robot along the path, whether an obstacle appears ahead of the movement; when an obstacle appears, to instruct the driving module to pause the robot; after a predetermined time, to detect whether the obstacle has disappeared; and, if the obstacle has disappeared, to instruct the driving module to drive the robot to continue along the path.
Optionally, the obstacle detection module is further configured to detect the robot's surroundings if the obstacle still has not disappeared; the path determination module is further configured to re-determine, based on the detection result, the path along which the robot moves to the vicinity of the bound user; and the driving module is further configured to drive the robot along the re-determined path to the vicinity of the bound user.
Optionally, in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, where the first predetermined distance is less than the second predetermined distance.
Optionally, the control apparatus further includes: a barcode information identification module configured to identify barcode information on adjacent shelves while the driving module drives the robot; a broadcast information query module configured to query broadcast information corresponding to the barcode information; and a play module configured to play the broadcast information.
Optionally, the control apparatus further includes an information matching module configured to extract an identifier of the broadcast information after the broadcast information query module queries the broadcast information corresponding to the barcode information, determine whether the identifier matches the bound user's historical data and, if it matches, instruct the play module to play the broadcast information.
Optionally, the control apparatus further includes: a voice recognition module configured to recognize collected voice information of the bound user to obtain the bound user's voice command; and an instruction processing module configured to analyze and process the voice command to obtain corresponding response information and, if the response information includes a destination address, to instruct the path determination module to determine a path along which the robot moves from its current location to the destination address; the driving module is further configured to drive the robot along the determined path to lead the bound user to the destination address.
Optionally, the play module is further configured to play predetermined guidance information while the driving module drives the robot along the determined path.
Optionally, the play module is further configured to play reply information if the response information includes reply information.
Optionally, the control apparatus further includes: an interaction module configured to receive instructions issued by an operating user; and a state switching module configured to switch the robot's state to the working state, based on a trigger instruction issued by the operating user while the robot is idle, take the operating user as the bound user, and instruct the feature recognition module to perform user feature recognition on the bound user.
Optionally, the state switching module is further configured to release the binding relationship between the robot and the bound user after the bound user finishes using the robot, and to switch the robot's state to the idle state.
Optionally, the path determination module is further configured to determine a path along which the robot moves to a predetermined parking place after the state switching module switches the robot's state to the idle state; and the driving module is further configured to drive the robot along the determined path to the predetermined parking place to achieve automatic homing.
According to another aspect of one or more embodiments of the present disclosure, a robot control apparatus is provided, including:
a memory for storing instructions;
a processor coupled to the memory, the processor being configured to perform, based on the instructions stored in the memory, the method of any of the above embodiments.
According to another aspect of one or more embodiments of the present disclosure, a robot is provided, including the robot control apparatus of any of the above embodiments.
According to a further aspect of one or more embodiments of the present disclosure, a computer readable storage medium is provided, the computer readable storage medium storing computer instructions which, when executed by a processor, implement the method of any of the above embodiments.
Other features and advantages of the present disclosure will become apparent from the following detailed description of exemplary embodiments of the present disclosure with reference to the accompanying drawings.
In order to explain the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present disclosure, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
FIG. 1 is an exemplary flowchart showing a robot control method according to some embodiments of the present disclosure.
FIG. 2 is an exemplary flowchart showing a robot control method according to further embodiments of the present disclosure.
FIG. 3 is an exemplary block diagram showing a robot control device according to some embodiments of the present disclosure.
FIG. 4 is an exemplary block diagram showing a robot control device according to further embodiments of the present disclosure.
FIG. 5 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
FIG. 6 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
FIG. 7 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
FIG. 8 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
FIG. 9 is an exemplary block diagram showing a robot according to some embodiments of the present disclosure.
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings in the embodiments of the present disclosure. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The following description of at least one exemplary embodiment is in fact merely illustrative and is in no way intended to limit the present disclosure or its application or use. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
Unless specifically stated otherwise, the relative arrangement of the components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the present disclosure.
At the same time, it should be understood that, for ease of description, the dimensions of the various parts shown in the drawings are not drawn to actual scale.
Techniques, methods and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate such techniques, methods and devices should be regarded as part of the specification.
In all examples shown and discussed herein, any specific value should be interpreted as merely exemplary and not as a limitation. Other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item has been defined in one drawing, it does not need to be discussed further in subsequent drawings.
FIG. 1 is an exemplary flowchart showing a robot control method according to some embodiments of the present disclosure. In some embodiments, the method steps shown in FIG. 1 are performed by a robot control device.
In step 101, a bound user and the bound user's current location are locked by identifying user features.
In some embodiments, when a user issues a trigger instruction to a robot in the idle state, for example by performing a trigger operation on the robot's touch screen, the robot takes that user as the bound user. The robot learns the bound user's features through face recognition and gait recognition, so that the robot can lock onto the bound user by image recognition and determine the bound user's current location through image processing. For example, after the robot control device has learned the bound user's features, it can identify the bound user in a multi-person environment by recognizing faces or walking postures in captured images. The robot control device also determines, by image recognition, the bearing of the bound user relative to the robot and the distance of the bound user from the robot's current position. As another example, after locking onto the bound user by image recognition, the robot determines the bound user's current location by recognizing the shelf or other markers adjacent to the bound user.
It should be noted here that the robot is a smart shopping cart or another smart mobile device capable of carrying items.
In some embodiments, the robot can switch between two states, a working state and an idle state. For example, a robot in the idle state switches its own state to the working state after being bound to a user.
In step 102, a path along which the robot moves to the vicinity of the bound user is determined.
In some embodiments, the robot uses map information of the current venue, takes the robot's current location as the start point and the vicinity of the bound user as the end point, and performs path planning between the start point and the end point. For example, the robot can determine its own current location by reading navigation barcodes. Since path planning itself is not the inventive point of the present disclosure, it is not described in detail here.
In step 103, the robot is driven along the determined path to the vicinity of the bound user.
In some embodiments, in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, where the first predetermined distance is less than the second predetermined distance. That is, after reaching the vicinity of the bound user, the robot keeps a certain distance from the bound user. This makes the robot convenient to use without getting in the way of the user's walking, which improves the user's shopping experience.
In the robot control method provided by the above embodiments of the present disclosure, by binding the user to the robot, the robot moves by itself to follow at the bound user's side, which frees the user's hands and significantly improves the user experience.
FIG. 2 is an exemplary flowchart showing a robot control method according to further embodiments of the present disclosure. In some embodiments, the method steps shown in FIG. 2 are performed by a robot control device.
In step 201, the robot is driven along the selected path.
In step 202, it is detected whether an obstacle appears ahead of the movement.
For example, video of the area ahead can be captured by a camera and analyzed.
In step 203, if an obstacle appears ahead, the robot is controlled to pause and wait for a predetermined time.
In step 204, it is detected whether the obstacle has disappeared. If the obstacle has disappeared, step 205 is performed; if it still has not disappeared, step 206 is performed.
In step 205, the robot is driven to continue along the original path.
In step 206, the robot's surroundings are detected.
In step 207, based on the detection result, the path along which the robot moves to the vicinity of the bound user is re-determined.
In step 208, the robot is driven along the re-determined path to the vicinity of the bound user.
During movement of the robot, if an obstacle such as another person or robot appears ahead, the robot waits briefly. If the obstacle leaves on its own, the robot continues along the original route. If the obstacle persists, the robot re-plans a route based on the current position and the target position and moves along the re-planned path. This allows the robot to avoid obstacles on its own while following the bound user.
In some embodiments, while driving the robot, the robot control device identifies barcode information on adjacent shelves, queries broadcast information corresponding to the barcode information and plays it, so that the bound user learns about the goods on the adjacent shelves.
For example, corresponding barcode information is placed on the shelves, and the robot reads the barcode information on adjacent shelves by image recognition as it moves. The robot obtains the broadcast information corresponding to the barcode information and plays it, so that the user learns about advertisements, promotions and other information related to the goods on the shelf, which makes the user's shopping more convenient.
In some embodiments, the robot control device filters the received broadcast information based on the user's historical data to provide a better user experience. After querying the broadcast information corresponding to the barcode information, the robot control device extracts an identifier of the broadcast information. The robot control device determines whether the identifier matches the bound user's historical data. If the identifier matches the bound user's historical data, the robot control device plays the broadcast information; if it does not match, the robot control device does not play the broadcast information.
For example, the user's historical data indicates that the user is interested in electronic products. The robot control device checks the identifier of the broadcast information: if the information relates to electronic products, it is played to the bound user; if the information relates to a toothbrush discount promotion, the robot control device does not play it to the bound user, which improves the user experience.
In some embodiments, the robot may extract the bound user's facial features and query the user historical data corresponding to those facial features, so as to provide personalized services to the user based on the user's historical data.
In some embodiments, the robot can also provide navigation services to the bound user. For example, the user may issue a voice command to the robot. After collecting the bound user's voice information, the robot control device recognizes the voice information to obtain the bound user's voice command. The robot control device analyzes and processes the voice command to obtain corresponding response information. If the response information includes a destination address, the robot control device performs path planning to determine a path along which the robot moves from the current location to the destination address, and drives the robot along the determined path to lead the bound user to the destination address.
For example, if the bound user says "seafood", the robot control device recognizes the command, determines the location of the seafood area, performs path planning and drives the robot to move, so as to lead the bound user to the seafood area. As another example, if the bound user says "checkout" or "pay", the robot control device drives the robot to move so as to lead the bound user to the checkout counter.
In some embodiments, while driving the robot along the determined path, the robot control device also plays predetermined guidance information. For example, when leading the bound user to a destination, the robot plays a guidance message such as "please follow me" to enhance the user experience.
In some embodiments, the robot can also interact with the bound user to provide communication services for the bound user. For example, the user issues a voice command to the robot to ask about topics of interest. After collecting the bound user's voice information, the robot control device recognizes the voice information to obtain the bound user's voice command. The robot control device analyzes and processes the voice command to obtain corresponding response information. If the response information includes reply information, the reply information is played to interact with the bound user.
For example, if the user asks about the price of a product, the robot control device interacts with the business server and provides the user with information such as the main manufacturers, product features and price of that product, which makes the user's shopping more interesting and convenient; at the same time, it can serve as a channel for manufacturers and brands to promote products and publish promotional information.
In some embodiments, after the bound user finishes using the robot, the robot control device releases the binding relationship between the robot and the bound user and switches the robot's state to the idle state. For example, after finishing use or checking out, the user can tap the touch screen to perform the corresponding operation and switch the robot's state to the idle state, so that the robot can continue to serve other users.
In some embodiments, after switching the robot's state to the idle state, the robot control device may also perform path planning to determine a path along which the robot moves from the current position to a predetermined parking place, and then drive the robot along the determined path to the predetermined parking place to achieve automatic homing.
FIG. 3 is an exemplary block diagram showing a robot control device according to some embodiments of the present disclosure.
As shown in FIG. 3, the robot control device includes a feature recognition module 31, a locking module 32, a path determination module 33 and a driving module 34. The feature recognition module 31 is used to recognize user features. For example, the feature recognition module 31 performs face recognition or gait recognition to acquire user features. The locking module 32 is used to lock onto a bound user and the bound user's current location according to the recognized user features.
In some embodiments, after the robot control device has learned the bound user's features, it can identify the bound user in a multi-person environment by recognizing faces or walking postures in captured images. The robot control device also determines, by image recognition, the bearing of the bound user relative to the robot and the distance of the bound user from the robot's current position. As another example, after locking onto the bound user by image recognition, the robot determines the bound user's current location by recognizing the shelf or other markers adjacent to the bound user.
The path determination module 33 is used to determine a path along which the robot moves to the vicinity of the bound user. The driving module 34 is used to drive the robot along the path to the vicinity of the bound user.
In one embodiment, in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, where the first predetermined distance is less than the second predetermined distance. This makes the robot convenient to use without getting in the way of the user's walking, which improves the user's shopping experience.
In the robot control device provided by the above embodiments of the present disclosure, by binding the user to the robot, the robot moves by itself to follow at the bound user's side, which frees the user's hands and significantly improves the user experience.
FIG. 4 is an exemplary block diagram showing a robot control device according to further embodiments of the present disclosure. FIG. 4 differs from FIG. 3 in that, in the embodiment shown in FIG. 4, the robot control device further includes an obstacle detection module 35.
The obstacle detection module 35 is used to detect, while the driving module 34 drives the robot along the path, whether an obstacle appears ahead of the movement. If an obstacle appears ahead, the obstacle detection module 35 instructs the driving module to control the robot to pause and, after a predetermined time has elapsed, detects whether the obstacle has disappeared. If the obstacle has disappeared, the obstacle detection module 35 instructs the driving module 34 to drive the robot to continue along the path.
In some embodiments, the obstacle detection module 35 is further used to detect the robot's surroundings if the obstacle still has not disappeared after the predetermined time. The path determination module 33 is further used to re-determine, based on the detection result, a path along which the robot moves from the current location to the vicinity of the bound user. The driving module 34 is further used to drive the robot along the re-determined path to the vicinity of the bound user.
In the above embodiments, when an obstacle appears ahead of the movement, the robot can make a detour on its own according to the current environment.
FIG. 5 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure. FIG. 5 differs from FIG. 4 in that, in the embodiment shown in FIG. 5, the robot control device further includes a barcode information identification module 36, a broadcast information query module 37 and a play module 38.
The barcode information identification module 36 is used to identify barcode information on adjacent shelves while the driving module drives the robot. The broadcast information query module 37 is used to query broadcast information corresponding to the barcode information. The play module 38 is used to play the broadcast information so that the bound user learns about the goods on adjacent shelves. This makes it easy for the user to learn about advertisements, promotions and other information for goods on adjacent shelves.
In some embodiments, as shown in FIG. 5, the robot control device further includes an information matching module 39. The information matching module 39 is used to extract an identifier of the broadcast information after the broadcast information query module 37 queries the broadcast information corresponding to the barcode information, and to determine whether the identifier matches the bound user's historical data. If the identifier matches the bound user's historical data, the information matching module 39 instructs the play module 38 to play the broadcast information.
In other words, the robot control device plays only information the user is interested in, based on the user's historical data. For example, the robot control device learns from the historical data that the user is interested in electronic products, and therefore only plays information related to electronic products and does not play toothbrush promotion advertisements to that user, which improves the user experience.
For example, the corresponding user historical data can be determined by recognizing the user's facial features, so that the user can obtain personalized service simply by having their face scanned.
FIG. 6 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure. FIG. 6 differs from FIG. 5 in that, in the embodiment shown in FIG. 6, the robot control device further includes a voice recognition module 310 and an instruction processing module 311.
The voice recognition module 310 is used to recognize collected voice information of the bound user to obtain the bound user's voice command. The instruction processing module 311 is used to analyze and process the voice command to obtain corresponding response information; if the response information includes a destination address, it instructs the path determination module 33 to determine a path along which the robot moves to the destination address. The driving module 34 is further used to drive the robot along the determined path to lead the bound user to the destination address.
The user can thus obtain a navigation service by issuing a voice command. For example, if the bound user says "seafood", the robot control device recognizes the command, determines the location of the seafood area, performs path planning and drives the robot to move, so as to lead the bound user to the seafood area.
In some embodiments, the play module 38 is further used to play predetermined guidance information while the driving module 34 drives the robot along the determined path. For example, during the leading process, a guidance message such as "please follow me" is played to enhance the user experience.
In some embodiments, the play module 38 is further used to play reply information when the response information includes reply information, so as to interact with the bound user. For example, if the user asks about the price of a product, the robot control device interacts with the business server and provides the user with information such as the main manufacturers, product features and price of that product, which makes the user's shopping more interesting and convenient; at the same time, it can serve as a channel for manufacturers and brands to promote products and publish promotional information.
FIG. 7 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure. FIG. 7 differs from FIG. 6 in that, in the embodiment shown in FIG. 7, the robot control device further includes an interaction module 312 and a state switching module 313.
The interaction module 312 is used to receive instructions issued by an operating user. For example, the interaction module 312 is a touch screen or another interactive device capable of receiving user instructions. The state switching module 313 is used to switch the robot's state to the working state, based on a trigger instruction issued by the operating user while the robot is idle, to take the operating user as the bound user, and to instruct the feature recognition module 31 to perform user feature recognition on the bound user.
In some embodiments, the state switching module 313 is further used to release the binding relationship between the robot and the bound user after the bound user finishes using the robot, and to switch the robot's state to the idle state, so that the robot can continue to serve other users.
In some embodiments, the path determination module 33 is further used to determine a path along which the robot moves to a predetermined parking place after the state switching module switches the robot's state to the idle state. The driving module 34 is further used to drive the robot along the determined path to the predetermined parking place to achieve automatic homing.
FIG. 8 is an exemplary block diagram showing a robot control device according to still other embodiments of the present disclosure.
As shown in FIG. 8, the robot control device includes a memory 801 and a processor 802. The memory 801 is used to store instructions; the processor 802 is coupled to the memory 801 and is configured to perform, based on the instructions stored in the memory, the method involved in any of the embodiments of FIG. 1 to FIG. 2.
As shown in FIG. 8, the robot control device further includes a communication interface 803 for information interaction with other devices. The device also includes a bus 804, and the processor 802, the communication interface 803 and the memory 801 communicate with one another via the bus 804.
The memory 801 may include high-speed RAM (Random-Access Memory) and may also include non-volatile memory, such as at least one disk memory. The memory 801 may also be a memory array. The memory 801 may also be partitioned into blocks, and the blocks may be combined into virtual volumes according to certain rules.
In some embodiments, the processor 802 may be a central processing unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present disclosure.
FIG. 9 is an exemplary block diagram showing a robot according to some embodiments of the present disclosure.
As shown in FIG. 9, the robot 91 includes a robot control device 92. The robot control device 92 is the robot control device involved in any of the embodiments of FIG. 3 to FIG. 8.
In the robot provided by the above embodiments of the present disclosure, the robot moves by itself to follow at the bound user's side, which frees the user's hands and significantly improves the user experience.
In some embodiments, the functional unit modules described in the above embodiments may be implemented, for performing the functions described in the present disclosure, as a general purpose processor, a Programmable Logic Controller (PLC), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or any suitable combination thereof.
The present disclosure also provides a computer readable storage medium, the computer readable storage medium storing computer instructions which, when executed by a processor, implement the method involved in any of the embodiments of FIG. 1 or FIG. 2. Those skilled in the art should understand that embodiments of the present disclosure may be provided as a method, an apparatus, or a computer program product. Therefore, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present disclosure may take the form of a computer program product implemented on one or more computer-usable non-transitory storage media (including but not limited to disk memory, CD-ROM, optical memory and the like) containing computer-usable program code.
By implementing the present disclosure, at least one of the following beneficial effects can be obtained:
1) A robot such as a smart shopping cart can follow the bound user by itself, which frees the user's hands and makes it convenient for the user to pick up goods, use a mobile phone and so on. The advantage is even more obvious for users carrying babies or with limited mobility.
2) Navigation services are provided for the user: a path can be planned according to the user's needs and the user can be led to the destination, effectively saving time.
3) Communication services can also be provided for the user. Questions raised by the user about the mall or supermarket, goods, promotions or other topics can be answered, which makes shopping more interesting and ensures that the user can obtain the necessary information. It can also serve as a channel for manufacturers and brands to publicize goods and promotional information.
Those skilled in the art should understand that embodiments of the present disclosure may be provided as a method, a system, or a computer program product. Therefore, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present disclosure may take the form of a computer program product implemented on one or more computer-usable non-transitory storage media (including but not limited to disk memory, CD-ROM, optical memory and the like) containing computer-usable program code.
The present disclosure is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present disclosure. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general purpose computer, a special purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be stored in a computer readable memory that can direct a computer or another programmable data processing device to operate in a particular manner, so that the instructions stored in the computer readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be loaded onto a computer or another programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The description of the present disclosure has been given for purposes of illustration and description and is not intended to be exhaustive or to limit the present disclosure to the forms disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described in order to better explain the principles and practical applications of the present disclosure, and to enable those of ordinary skill in the art to understand the present disclosure so as to design various embodiments, with various modifications, suited to particular uses.
Claims (27)
- A method for controlling a robot, comprising: locking onto a bound user and a current location of the bound user by identifying user features; determining a path along which the robot moves to a vicinity of the bound user; and driving the robot along the path to the vicinity of the bound user.
- The control method according to claim 1, wherein driving the robot along the path to the vicinity of the bound user comprises: while driving the robot along the path, detecting whether an obstacle appears ahead of the movement; controlling the robot to pause when an obstacle appears ahead of the movement; after a predetermined time, detecting whether the obstacle has disappeared; and, if the obstacle has disappeared, driving the robot to continue along the path.
- The control method according to claim 2, further comprising: if the obstacle still has not disappeared, detecting the surroundings of the robot; re-determining, based on the detection result, a path along which the robot moves to the vicinity of the bound user; and driving the robot along the re-determined path to the vicinity of the bound user.
- The control method according to claim 1, wherein, in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, wherein the first predetermined distance is less than the second predetermined distance.
- The control method according to claim 1, further comprising: identifying barcode information on adjacent shelves while driving the robot; querying broadcast information corresponding to the barcode information; and playing the broadcast information.
- The control method according to claim 5, further comprising, after querying the broadcast information corresponding to the barcode information: extracting an identifier of the broadcast information; determining whether the identifier matches historical data of the bound user; and playing the broadcast information if the identifier matches the historical data of the bound user.
- The control method according to claim 1, further comprising: after collecting voice information of the bound user, recognizing the voice information to obtain a voice command of the bound user; analyzing and processing the voice command to obtain corresponding response information; if the response information includes a destination address, determining a path along which the robot moves to the destination address; and driving the robot along the determined path so as to lead the bound user to the destination address.
- The control method according to claim 7, wherein predetermined guidance information is played while the robot is driven along the determined path.
- The control method according to claim 7, wherein, if the response information includes reply information, the reply information is played.
- The control method according to any one of claims 1-9, further comprising: when the robot is in an idle state, switching the state of the robot to a working state based on a trigger instruction issued by an operating user; and taking the operating user as the bound user and performing user feature recognition on the bound user.
- The control method according to claim 10, wherein, after the bound user finishes using the robot, the binding relationship between the robot and the bound user is released, and the state of the robot is switched to the idle state.
- The control method according to claim 11, further comprising, after switching the state of the robot to the idle state: determining a path along which the robot moves to a predetermined parking place; and driving the robot along the determined path to the predetermined parking place so as to achieve automatic homing.
- A control apparatus for a robot, comprising: a feature recognition module configured to recognize user features; a locking module configured to lock onto a bound user and a current location of the bound user according to the recognized user features; a path determination module configured to determine a path along which the robot moves to a vicinity of the bound user; and a driving module configured to drive the robot along the path to the vicinity of the bound user.
- The control apparatus according to claim 13, further comprising: an obstacle detection module configured to detect, while the driving module drives the robot along the path, whether an obstacle appears ahead of the movement; to instruct the driving module to control the robot to pause when an obstacle appears ahead of the movement; after a predetermined time, to detect whether the obstacle has disappeared; and, if the obstacle has disappeared, to instruct the driving module to drive the robot to continue along the path.
- The control apparatus according to claim 14, wherein the obstacle detection module is further configured to detect the surroundings of the robot if the obstacle still has not disappeared; the path determination module is further configured to re-determine, based on the detection result, a path along which the robot moves to the vicinity of the bound user; and the driving module is further configured to drive the robot along the re-determined path to the vicinity of the bound user.
- The control apparatus according to claim 13, wherein, in the vicinity of the bound user, the distance between the robot and the bound user is greater than a first predetermined distance and less than a second predetermined distance, wherein the first predetermined distance is less than the second predetermined distance.
- The control apparatus according to claim 13, further comprising: a barcode information identification module configured to identify barcode information on adjacent shelves while the driving module drives the robot; a broadcast information query module configured to query broadcast information corresponding to the barcode information; and a play module configured to play the broadcast information.
- The control apparatus according to claim 17, further comprising: an information matching module configured to extract an identifier of the broadcast information after the broadcast information query module queries the broadcast information corresponding to the barcode information, to determine whether the identifier matches historical data of the bound user, and, if the identifier matches the historical data of the bound user, to instruct the play module to play the broadcast information.
- The control apparatus according to claim 13, further comprising: a voice recognition module configured to recognize collected voice information of the bound user to obtain a voice command of the bound user; and an instruction processing module configured to analyze and process the voice command to obtain corresponding response information and, if the response information includes a destination address, to instruct the path determination module to determine a path along which the robot moves to the destination address; wherein the driving module is further configured to drive the robot along the determined path so as to lead the bound user to the destination address.
- The control apparatus according to claim 19, wherein the play module is further configured to play predetermined guidance information while the driving module drives the robot along the determined path.
- The control apparatus according to claim 19, wherein the play module is further configured to play reply information if the response information includes reply information.
- The control apparatus according to any one of claims 13-21, further comprising: an interaction module configured to receive instructions issued by an operating user; and a state switching module configured to switch the state of the robot to a working state, based on a trigger instruction issued by the operating user when the robot is in an idle state, to take the operating user as the bound user, and to instruct the feature recognition module to perform user feature recognition on the bound user.
- The control apparatus according to claim 22, wherein the state switching module is further configured to release the binding relationship between the robot and the bound user after the bound user finishes using the robot, and to switch the state of the robot to the idle state.
- The control apparatus according to claim 23, wherein the path determination module is further configured to determine a path along which the robot moves to a predetermined parking place after the state switching module switches the state of the robot to the idle state; and the driving module is further configured to drive the robot along the determined path to the predetermined parking place so as to achieve automatic homing.
- A control apparatus for a robot, comprising: a memory for storing instructions; and a processor coupled to the memory, the processor being configured to perform, based on the instructions stored in the memory, the method according to any one of claims 1-12.
- A robot comprising the control apparatus for a robot according to any one of claims 13-25.
- A computer readable storage medium, wherein the computer readable storage medium stores computer instructions which, when executed by a processor, implement the method according to any one of claims 1-12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/613,639 US20200078943A1 (en) | 2017-05-16 | 2018-04-25 | Robot control method, robot control apparatus and robot |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710341099.5A CN106956266A (zh) | 2017-05-16 | 2017-05-16 | 机器人控制方法、装置和机器人 |
CN201710341099.5 | 2017-05-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018210109A1 true WO2018210109A1 (zh) | 2018-11-22 |
Family
ID=59482533
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/084402 WO2018210109A1 (zh) | 2017-05-16 | 2018-04-25 | 机器人控制方法、装置和机器人 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20200078943A1 (zh) |
CN (1) | CN106956266A (zh) |
WO (1) | WO2018210109A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10332402B2 (en) | 2017-10-12 | 2019-06-25 | Toyota Jidosha Kabushiki Kaisha | Movement assistance system and movement assistance method |
CN114401488A (zh) * | 2021-12-03 | 2022-04-26 | 杭州华橙软件技术有限公司 | 机器人运动路径上报方法、下载方法、装置和电子装置 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106956266A (zh) * | 2017-05-16 | 2017-07-18 | 北京京东尚科信息技术有限公司 | 机器人控制方法、装置和机器人 |
CN107292571A (zh) * | 2017-07-21 | 2017-10-24 | 深圳市萨斯智能科技有限公司 | 一种机器人的安全确定方法和机器人 |
JP2019049785A (ja) * | 2017-09-08 | 2019-03-28 | 株式会社日立ビルシステム | ロボット管理システム及び商品提案方法 |
CN107894773A (zh) * | 2017-12-15 | 2018-04-10 | 广东工业大学 | 一种移动机器人的导航方法、系统及相关装置 |
CN108198030A (zh) * | 2017-12-29 | 2018-06-22 | 深圳正品创想科技有限公司 | 一种手推车控制方法、装置及电子设备 |
CN109176512A (zh) * | 2018-08-31 | 2019-01-11 | 南昌与德通讯技术有限公司 | 一种体感控制机器人的方法、机器人及控制装置 |
CN109866230A (zh) * | 2019-01-17 | 2019-06-11 | 深圳壹账通智能科技有限公司 | 客服机器人控制方法、装置、计算机设备及存储介质 |
CN110926476B (zh) * | 2019-12-04 | 2023-09-01 | 三星电子(中国)研发中心 | 一种智能机器人的伴随服务方法及装置 |
CN111950431B (zh) * | 2020-08-07 | 2024-03-26 | 北京猎户星空科技有限公司 | 一种对象查找方法及装置 |
CN112562402B (zh) * | 2020-11-12 | 2022-04-12 | 深圳优地科技有限公司 | 一种位置确定方法、装置、终端和存储介质 |
CN113758479A (zh) * | 2021-04-02 | 2021-12-07 | 北京京东拓先科技有限公司 | 无人机寻址方法、装置、无人机以及存储介质 |
US20240153504A1 (en) * | 2021-06-08 | 2024-05-09 | Chian Chiu Li | Presenting Location Related Information and Implementing a Task through a Mobile Control Device |
CN114355885A (zh) * | 2021-12-03 | 2022-04-15 | 中国信息通信研究院 | 基于agv小车的协作机器人搬运系统及方法 |
CN114190295B (zh) * | 2021-12-20 | 2023-06-09 | 珠海一微半导体股份有限公司 | 一种多宠物机器人控制方法、系统及芯片 |
CN114527748A (zh) * | 2022-01-19 | 2022-05-24 | 广东博智林机器人有限公司 | 路径规划方法、施工方法及装置、机器人、存储介质 |
CN114994604A (zh) * | 2022-04-21 | 2022-09-02 | 深圳市倍思科技有限公司 | 人机交互位置确定方法、装置、机器人及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102289556A (zh) * | 2011-05-13 | 2011-12-21 | 郑正耀 | 一种超市购物助手机器人 |
US9393686B1 (en) * | 2013-03-15 | 2016-07-19 | Industrial Perception, Inc. | Moveable apparatuses having robotic manipulators and conveyors to facilitate object movement |
CN106056633A (zh) * | 2016-06-07 | 2016-10-26 | 速感科技(北京)有限公司 | 运动控制方法、装置及系统 |
CN106155065A (zh) * | 2016-09-28 | 2016-11-23 | 上海仙知机器人科技有限公司 | 一种机器人跟随方法及用于机器人跟随的设备 |
WO2016200439A1 (en) * | 2015-06-09 | 2016-12-15 | Integrated Construction Enterprises, Inc. | Construction board installation robot |
CN106956266A (zh) * | 2017-05-16 | 2017-07-18 | 北京京东尚科信息技术有限公司 | 机器人控制方法、装置和机器人 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102393739B (zh) * | 2011-05-27 | 2014-12-03 | 严海蓉 | 智能手推车及其应用方法 |
CN109815834A (zh) * | 2014-01-03 | 2019-05-28 | 科沃斯商用机器人有限公司 | 导购机器人顾客识别通知方法及导购机器人系统 |
CN105468003A (zh) * | 2016-01-18 | 2016-04-06 | 深圳思科尼亚科技有限公司 | 全方位智能跟随高尔夫球车及其跟随方法 |
CN106096576B (zh) * | 2016-06-27 | 2019-05-07 | 陈包容 | 一种机器人的智能服务方法 |
CN106251173A (zh) * | 2016-07-22 | 2016-12-21 | 尚艳燕 | 一种基于平衡车的超市导购方法和平衡车 |
CN106297083B (zh) * | 2016-07-29 | 2019-03-15 | 广州市沃希信息科技有限公司 | 一种商场购物方法、购物服务器以及购物机器人 |
-
2017
- 2017-05-16 CN CN201710341099.5A patent/CN106956266A/zh active Pending
-
2018
- 2018-04-25 WO PCT/CN2018/084402 patent/WO2018210109A1/zh active Application Filing
- 2018-04-25 US US16/613,639 patent/US20200078943A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102289556A (zh) * | 2011-05-13 | 2011-12-21 | 郑正耀 | 一种超市购物助手机器人 |
US9393686B1 (en) * | 2013-03-15 | 2016-07-19 | Industrial Perception, Inc. | Moveable apparatuses having robotic manipulators and conveyors to facilitate object movement |
WO2016200439A1 (en) * | 2015-06-09 | 2016-12-15 | Integrated Construction Enterprises, Inc. | Construction board installation robot |
CN106056633A (zh) * | 2016-06-07 | 2016-10-26 | 速感科技(北京)有限公司 | 运动控制方法、装置及系统 |
CN106155065A (zh) * | 2016-09-28 | 2016-11-23 | 上海仙知机器人科技有限公司 | 一种机器人跟随方法及用于机器人跟随的设备 |
CN106956266A (zh) * | 2017-05-16 | 2017-07-18 | 北京京东尚科信息技术有限公司 | 机器人控制方法、装置和机器人 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10332402B2 (en) | 2017-10-12 | 2019-06-25 | Toyota Jidosha Kabushiki Kaisha | Movement assistance system and movement assistance method |
CN114401488A (zh) * | 2021-12-03 | 2022-04-26 | 杭州华橙软件技术有限公司 | 机器人运动路径上报方法、下载方法、装置和电子装置 |
CN114401488B (zh) * | 2021-12-03 | 2024-05-28 | 杭州华橙软件技术有限公司 | 机器人运动路径上报方法、下载方法、装置和电子装置 |
Also Published As
Publication number | Publication date |
---|---|
US20200078943A1 (en) | 2020-03-12 |
CN106956266A (zh) | 2017-07-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018210109A1 (zh) | 机器人控制方法、装置和机器人 | |
WO2018171285A1 (zh) | 机器人的控制方法和装置、机器人及控制系统 | |
JP6502491B2 (ja) | 顧客サービスロボットおよび関連するシステムおよび方法 | |
US7147154B2 (en) | Method and system for assisting a shopper in navigating through a store | |
JP6020326B2 (ja) | 経路探索装置、自走式作業装置、プログラム及び記録媒体 | |
US8924868B2 (en) | Moving an activity along terminals associated with a physical queue | |
WO2017065187A1 (ja) | 情報表示端末装置及び商品情報提供システム並びに商品販売促進方法 | |
JP6261197B2 (ja) | 表示制御装置、表示制御方法、及びプログラム | |
WO2019205760A1 (zh) | 一种智能设备,商品盘点方法、装置以及设备 | |
CN105550224A (zh) | 物品搜索方法、装置及系统 | |
AU2016262718A1 (en) | Prompting method and apparatus | |
JP7224488B2 (ja) | インタラクティブ方法、装置、デバイス、及び記憶媒体 | |
US20200234393A1 (en) | Accompanying moving object | |
CN105487863A (zh) | 基于场景的界面设置方法及装置 | |
US20170045949A1 (en) | Gesture evaluation system, method for evaluating gestures and vehicle | |
US20220083049A1 (en) | Accompanying mobile body | |
US11074040B2 (en) | Presenting location related information and implementing a task based on gaze, gesture, and voice detection | |
CN105301585A (zh) | 信息展示方法及装置 | |
JP2024015277A (ja) | 情報提供装置及びその制御プログラム | |
CN106546240A (zh) | 导航处理方法和装置 | |
US12014397B2 (en) | In-store computerized product promotion system with product prediction model that outputs a target product message based on products selected in a current shopping session | |
CN112989895A (zh) | 人机交互方法、系统及自移动设备 | |
WO2017164061A1 (ja) | 出力制御装置、情報出力システム、出力制御方法、およびプログラム | |
JP4767717B2 (ja) | ロボット | |
JP7116806B2 (ja) | 同行移動体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18801307 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 26/02/2020) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18801307 Country of ref document: EP Kind code of ref document: A1 |