WO2020151755A1 - Multi-robot collaborative service method, apparatus and system, and control device - Google Patents

Multi-robot collaborative service method, apparatus and system, and control device

Info

Publication number
WO2020151755A1
Authority
WO
WIPO (PCT)
Prior art keywords
service
robot
task
user
information
Prior art date
Application number
PCT/CN2020/073900
Other languages
English (en)
Chinese (zh)
Inventor
王雪松
Original Assignee
北京猎户星空科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京猎户星空科技有限公司
Publication of WO2020151755A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B25J 9/1682: Dual arm manipulator; Coordination of several manipulators
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00: Manipulators not otherwise provided for
    • B25J 11/008: Manipulators for service tasks

Definitions

  • This application relates to the field of artificial intelligence technology, and in particular to a multi-robot collaborative service method, device, control equipment and system.
  • The robot can lead the user to a location designated by the user, helping the user find the destination as soon as possible in an unfamiliar environment.
  • However, robots still have many problems when providing services. For example, when leading the way, the robot needs to keep following one user until that user reaches the destination, which makes a single task take longer and lowers efficiency. In addition, for safety, the robot generally moves slowly, which delays the user or makes it easy for the robot to lose track of the user. Moreover, along the way there are areas that the robot is not suited to enter or exit, such as elevators, stairs, and narrow spaces, so the robot cannot smoothly follow the user and complete the service task.
  • the embodiments of the present application provide a multi-robot collaborative service method, device, control device, and system to solve the problem of low execution efficiency when robots provide services in the prior art.
  • an embodiment of the present application provides a multi-robot collaborative service method, including:
  • acquiring the service task of the user detected by the robot; determining, according to the service task and the service area of the robot, the service operation of the robot for the user in its service area; and controlling the robot to perform the service operation in its service area.
  • the obtaining the service task of the user detected by the robot includes:
  • the method also includes:
  • synchronizing the task information of the user to other robots in the multi-robot, where the task information includes the biological information and the service task of the user.
  • the obtaining the service task of the user detected by the robot includes:
  • the method further includes:
  • a task execution status corresponding to the service task is generated, and the generated task execution status is synchronized to other robots in the multi-robot.
  • the method further includes:
  • the determining the service operation of the robot to the user in the service area of the robot according to the service task and the service area of the robot includes:
  • determining, according to the service task, the service area of the robot, and the task execution status corresponding to the service task, the service operation of the robot for the user in its service area.
  • the determining the service operation of the robot to the user in its service area according to the service task and the service area of the robot includes:
  • determining, according to the service task, the service area of the robot, and additional information, the service operation of the robot for the user in its service area.
  • the additional information includes at least one of the following: time information, weather information, and local property information.
  • the method further includes:
  • the method further includes:
  • the determining that the service task ends includes:
  • an embodiment of the present application provides a multi-robot collaborative service device, including:
  • the acquisition module is used to acquire the service tasks of the user detected by the robot;
  • the processing module is used to determine the service operation of the robot to the user in the service area of the robot according to the service task and the service area of the robot;
  • the control module is used to control the robot to perform service operations in its service area.
  • the acquisition module is specifically configured to: generate the user's service task according to the user's input information detected by the robot;
  • the device also includes a task creation module for:
  • the task information includes the biological information and service tasks of the user.
  • the obtaining module is specifically used for:
  • the device further includes an execution status determining module, configured to generate the task execution status corresponding to the service task according to the service operation, and synchronize the generated task execution status to other robots in the multi-robot.
  • the device further includes an execution status acquisition module, configured to acquire the task execution status corresponding to the service task from the synchronized task execution status;
  • the processing module is specifically configured to determine the service operation of the robot to the user in its service area according to the service task, the service area of the robot, and the task execution status corresponding to the service task.
  • the processing module is specifically configured to determine the service operation of the robot to the user in its service area according to the service task, the service area of the robot, and additional information, the additional information including at least one of the following: time information, weather information, and local property information.
  • the device further includes a task deletion module, configured to delete all information corresponding to the service task after determining that the service task has ended, and to synchronize a service task end message corresponding to the service task to the other robots in the multi-robot.
  • the device further includes a task deletion module, configured to delete all information corresponding to the service task after receiving the service task end message corresponding to the service task.
  • a task end judgment module for:
  • an embodiment of the present application provides a robot control device, including a transceiver, a memory, a processor, and a computer program stored in the memory and runnable on the processor, where the transceiver is configured to receive and send data under the control of the processor, and the steps of any of the above methods are implemented when the processor executes the program.
  • an embodiment of the present application provides a robot, including the robot control device shown in the third aspect.
  • an embodiment of the present application provides a multi-robot collaborative service system, including: a plurality of robots as shown in the fourth aspect, each robot corresponds to a service area, and the plurality of robots communicate through a network.
  • an embodiment of the present application provides a computer-readable storage medium having computer program instructions stored thereon, and when the program instructions are executed by a processor, the steps of any of the above methods are implemented.
  • the present application also provides a computer program product, the computer program product including a computer program stored on a computer-readable storage medium, the computer program including program instructions; when the program instructions are executed by a processor, the steps of any of the above multi-robot collaborative service methods are implemented.
  • The technical solution provided by the embodiments of this application synchronizes the user's service tasks to robots in different service areas. After any robot in a service area recognizes the user, it actively provides the user with the service operations corresponding to the service task; after the user enters the next service area, the robot in that area detects the user and continues to provide the service operations corresponding to the service task.
  • In this way, the robots in each service area provide continuous service to the user through relay. Since each robot only moves within and serves its corresponding service area, a single robot does not need to follow the same user throughout, which reduces the execution time of a single robot task, improves the service efficiency of the robots, expands the coverage of the service, and improves the user experience.
  • FIG. 1A is a schematic diagram of an application scenario of a multi-robot collaborative service method provided by an embodiment of the application;
  • FIG. 1B is a schematic diagram of another application scenario of the multi-robot collaborative service method provided by an embodiment of the application;
  • FIG. 2 is a schematic flowchart of a multi-robot collaborative service method provided by an embodiment of the application;
  • FIG. 3 is a schematic diagram of an application scenario in which multiple robots perform collaborative services, provided by an embodiment of the application;
  • FIG. 4 is a schematic structural diagram of a multi-robot collaborative service device provided by an embodiment of the application;
  • FIG. 5 is a schematic structural diagram of a robot control device provided by an embodiment of the application.
  • The inventor of the present application proposes that a larger area be divided into multiple service areas, with at least one robot arranged in each service area and each robot providing services to users within its corresponding service area, and that the user's service tasks be synchronized to the robots in all service areas.
  • The task information may include the user's biological information and the service task generated according to the services required by the user. After any robot in a service area recognizes the user based on the biological information, it actively provides the user with the service operations corresponding to the service task; after the user enters the next service area, the robot in that area recognizes the user and continues to provide the service operations corresponding to the service task until the user's service task is completed.
  • In this way, the robots in each service area provide users with continuous service through relay. Since each robot only provides services within its own service area, a single robot does not need to follow the same user throughout, which improves the service efficiency of the robots; moreover, as long as the service areas are divided reasonably, the robots do not need to enter difficult-to-pass areas such as elevators and stairs, ensuring the smooth completion of service tasks.
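The relay scheme above can be sketched as follows: the venue is split into named service areas, one robot per area, and a user's task record is synchronized to every robot so that whichever area the user enters next can continue the service. This is an illustrative sketch, not the application's implementation; all names (`ServiceTask`, `Robot`, `sync_task`, the area names) are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ServiceTask:
    user_id: str          # stands in for the user's biological information
    goal: str             # e.g. "find the restroom"

@dataclass
class Robot:
    area: str
    known_tasks: dict = field(default_factory=dict)

    def receive(self, task: ServiceTask) -> None:
        # Store the synchronized task record, keyed by the user it belongs to.
        self.known_tasks[task.user_id] = task

def sync_task(task: ServiceTask, robots: list) -> None:
    """Broadcast the task record to every robot in every service area."""
    for robot in robots:
        robot.receive(task)

robots = [Robot("lobby"), Robot("corridor"), Robot("elevator-hall")]
task = ServiceTask(user_id="user-31", goal="find the restroom")
sync_task(task, robots)

# Every robot, regardless of its area, now knows the task and can take over
# when the user appears in that area.
assert all(r.known_tasks["user-31"].goal == "find the restroom" for r in robots)
```

In a real deployment the broadcast would go through the server of FIG. 1A or the peer-to-peer network of FIG. 1B; the in-memory loop here only illustrates the data flow.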
  • FIG. 1A is a schematic diagram of an application scenario of the multi-robot collaborative service method provided by an embodiment of the application.
  • each robot 11 moves in its respective service area and provides services to users 10 in need of help who appear in that area.
  • the robot 11 and the server 12 communicate with each other through a network.
  • the network may be a local area network or a wide area network.
  • the robots 11 in each service area can synchronize the user's task information to the robots 11 in other service areas through the server 12.
  • After the robot 11 detects the user 10, it obtains the service task matching the biological information of the user 10 from the task information synchronized by the server 12, and then performs the corresponding service operations according to the service task in the task information and the service area where the robot 11 is located, so as to provide the user 10 with the corresponding services.
  • each robot 11 moves in its own service area and provides services to users 10 in need of help who appear in that area.
  • Each robot 11 communicates directly through a network, which may be a local area network, a wide area network, etc., and each robot 11 synchronizes the task information of the user 10 to the robots 11 in other service areas through the network.
  • After the robot 11 detects the user 10, it collects the biological information of the user 10, obtains the task information matching the user's biological information from the synchronized task information, and then performs the corresponding service operations according to the service task in the task information and the service area where the robot 11 is located, so as to provide the user 10 with the corresponding services.
  • an embodiment of the present application provides a multi-robot collaborative service method, including the following steps:
  • The robot continuously performs detection in its corresponding service area. Once a user is detected, it identifies the user and obtains the service task corresponding to the user. Specifically, the robot can detect whether there are users nearby through infrared sensors or cameras, or through sound sensors. After detecting the user, the robot collects the user's biological information in order to identify the user through the collected biological information.
  • the user's biological information includes but is not limited to at least one of the following information: face information, iris information, voiceprint information, gait information, and so on.
  • the user's biological information can be collected through equipment such as a face recognition device, an iris recognition device, a voiceprint recognition device, and a motion collection device integrated inside the robot.
  • When the user uses the robot for the first time, the robot has not obtained any information about the user.
  • the user's service task detected by the robot can be obtained in the following way: generating the user's service task according to the user input information detected by the robot.
  • the input information may be various types of information such as voice information or text information.
  • the user's biological information detected by the robot is acquired, and the user's task information is synchronized to other robots.
  • the task information includes the user's biological information and service task. That is, the robot that provides services to the user for the first time creates the user's task information and synchronizes it to the other robots.
  • For example, the user asks the robot, “Where is the restroom?” The robot then generates the user's corresponding service task, “find the restroom”, according to the information input by the user, collects the user's face image at the same time, and synchronizes the generated service task and the collected face image to other robots as the user's task information.
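The task-generation step in the example above can be sketched as a mapping from a detected utterance to a service task. The keyword table and function name below are assumptions for illustration only; a real robot would use speech recognition and natural-language understanding rather than substring matching.

```python
# Hypothetical mapping from destination keywords to service tasks.
DESTINATION_KEYWORDS = {
    "bathroom": "find the restroom",
    "restroom": "find the restroom",
    "store a": "find store A",
}

def generate_service_task(utterance: str):
    """Return the service task implied by the user's input, or None if the
    input does not request a service."""
    text = utterance.lower()
    for keyword, task in DESTINATION_KEYWORDS.items():
        if keyword in text:
            return task
    return None

assert generate_service_task("Where is the bathroom?") == "find the restroom"
assert generate_service_task("Nice weather today") is None
```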
  • The user can also create task information through the mobile terminal he uses and send it to the server, where the mobile terminal is installed with an application program that interacts with the server. For example, after the user opens the application, he can input “Where is the restroom?” by voice or text. The application then generates the user's corresponding service task, “find the restroom”, according to the information input by the user, collects the user's face image through the mobile terminal, and sends the generated service task and collected face image to the server as the user's task information. The server synchronizes the user's task information to the robots in all service areas, and after any robot in a service area recognizes the user, it provides the user with a route to the nearest restroom.
  • The robots in each service area can synchronize task information through the server. After the server receives the task information of a certain user, it immediately sends the task information to the robots in all service areas, so that after the user enters the service area corresponding to a robot, that robot can identify the user and provide the user with services corresponding to the service task in the task information.
  • The robots in each service area can also synchronize task information through P2P (peer-to-peer) transmission, for example, through Bluetooth or a MESH (wireless mesh network).
  • the robots in the other service areas can obtain the user's service task in the following way: acquiring the user's biological information detected by the robot, obtaining the target task information matching the user's biological information from the synchronized task information, and obtaining the user's service task from the target task information.
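The matching step above can be sketched as a nearest-neighbor lookup: a robot compares the collected biological information (here, a toy feature vector standing in for a face embedding) against the synchronized task records and returns the matching user's service task. The similarity measure, the threshold, and all names below are illustrative assumptions.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def find_service_task(observed_features, task_records, threshold=0.9):
    """Return the service task whose stored biological features best match
    the observed ones, or None if no record clears the threshold."""
    best_task, best_score = None, threshold
    for record in task_records:
        score = cosine_similarity(observed_features, record["features"])
        if score > best_score:
            best_task, best_score = record["task"], score
    return best_task

records = [
    {"features": [0.9, 0.1, 0.3], "task": "find the restroom"},
    {"features": [0.1, 0.8, 0.5], "task": "find room 807"},
]
assert find_service_task([0.88, 0.12, 0.31], records) == "find the restroom"
assert find_service_task([0.0, 0.0, 1.0], records) is None   # unknown person
```

Returning None for an unrecognized person corresponds to the first-contact case above, where the robot must create new task information instead of looking it up.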
  • S202: Determine, according to the service task and the service area of the robot, the service operation of the robot for the user in its service area.
  • S203: Control the robot to perform the service operations in its service area.
  • each robot only provides services to users in their respective service areas. Therefore, when performing the same service task of the same user, each robot can determine the service operation based on the location information of the service area where it is located.
  • Each robot can perform service operations in at least one of the following ways: displaying text, displaying pictures, playing videos, voice broadcasts, performing actions corresponding to service operations, or mobile navigation.
  • the user 31 first enters the service area corresponding to the robot 301, and the user 31 asks the robot 301: "How to get to the bathroom?"
  • The robot 301 determines the route to the nearest restroom according to its current location, and outputs the feedback message to the user through voice broadcast: “Walk forward 30 meters and turn left to reach the nearest restroom.”
  • The robot 301 can also lead the user 31 forward for a certain distance, but it will not leave its corresponding service area. At the same time, the robot 301 generates a service task of “find the restroom” and synchronizes the service task and the biological information of the user 31, as the task information of the user 31, to other robots (including the robot 302, the robot 303, and the robot 304 shown in FIG. 3).
  • the user 31 walks forward under the guidance of the robot 301.
  • the robot 301 can return to the designated point or wait in place, and wait for the next user to be detected and provide the next user with the corresponding service.
  • After detecting the user 31, the robot 302 collects the biological information of the user 31, obtains the service task of the user 31 from the task information synchronized by the robot 301 according to the biological information, and determines that the service task is “find the restroom”. The robot 302 then determines the route to the nearest restroom according to its current location and reminds the user 31 by voice broadcast: “Turn left and walk 10 meters to reach the nearest restroom.”
  • After detecting the user 31, the robot 304 collects the biological information of the user 31, determines that the service task of the user 31 is “find the restroom” according to the biological information, and confirms from the current location of the robot 304 that the user has reached the restroom. At this time, it can remind the user 31 by voice broadcast: “You have reached the restroom.”
  • The multi-robot collaborative service method of the embodiment of the present application synchronizes the task information of the user to the robots in different service areas. After identifying the user according to the biological information, the robot in any service area actively provides the user with the service operations corresponding to the service task; after the user enters the next service area, the robot there detects the user and continues to provide the service operations corresponding to the service task.
  • In this way, the robots in each service area provide users with continuous service. Since each robot only moves within and serves its corresponding service area, a single robot does not need to follow the same user throughout, which reduces the execution time of a single task, improves the service efficiency of the robots, and expands the coverage of robot services.
  • In addition, the multi-robot collaborative service method of the embodiment of the present application uses computer vision and other technologies to actively detect users and collect their biological information, so that the robot can actively provide services without requiring the user to walk up to the robot and request them, which improves the user experience.
  • The multi-robot collaborative service method of the embodiment of the present application, through the relay service mode of robots in each service area, solves the problem that robots cannot enter areas such as elevators and stairs or pass through narrow spaces, ensures the smooth completion of service tasks, and expands the coverage of robot services. For example, after a user enters a hotel lobby, he asks robot No. 1 how to get to room 807. Robot No. 1 generates a service task of “find room 807”, collects the user's biological information, and synchronizes the service task and the biological information to other robots in the hotel as the user's task information. After robot No. 1 leads the user to the elevator entrance, it calls the elevator to the eighth floor.
  • Robot No. 1 can then continue to provide services on the first floor.
  • After robot No. 2 in the eighth-floor corridor detects the user, it collects the user's biological information, obtains the user's service task from the task information synchronized by robot No. 1 according to the biological information, and determines that the service task is “find room 807”. Robot No. 2 then determines the route to room 807 based on its current location, informs the user of the walking route by voice broadcast, or leads the user to room 807.
  • the information synchronized between the robots also includes the task execution status.
  • the method of the embodiment of the present application further includes the following steps: generating task execution status corresponding to the service task according to the service operation, and synchronizing the generated task execution status to other robots.
  • the task execution status may be the progress of the service task determined according to the service operation that the robot has provided for the user.
  • the task execution status may be the areas that the user has passed through; when the service task can be divided into subtasks in multiple phases, the task execution status may be the execution status of each subtask.
  • After the robot determines the service operation to be performed, it generates the task execution status corresponding to the service task according to the determined service operation, and synchronizes the generated task execution status to the other robots.
  • When synchronizing, the robot can choose to overwrite the original task execution status, that is, save only the latest task execution status, to reduce the amount of data stored; the robot can also choose to keep the previous task execution statuses corresponding to the service task, so as to provide users with a better service experience based on how previous parts of the task were performed.
  • the specific method described above can be determined according to factors such as the actual application scenario and the type of the service task, which is not limited in this embodiment.
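The two synchronization policies above, keeping only the latest status to save storage versus appending each status to a history for later route-aware guidance, can be sketched as follows. The class and field names are illustrative assumptions, not from the application.

```python
class TaskStatusStore:
    """Per-robot store of synchronized task execution statuses."""

    def __init__(self, keep_history: bool):
        self.keep_history = keep_history
        self.statuses = {}   # task id -> latest status, or list of statuses

    def update(self, task_id: str, status: str) -> None:
        if self.keep_history:
            # Append: all previous statuses stay available.
            self.statuses.setdefault(task_id, []).append(status)
        else:
            # Overwrite: only the newest status is kept.
            self.statuses[task_id] = status

latest_only = TaskStatusStore(keep_history=False)
with_history = TaskStatusStore(keep_history=True)
for status in ("passed area of robot 301", "passed area of robot 302"):
    latest_only.update("find-restroom", status)
    with_history.update("find-restroom", status)

assert latest_only.statuses["find-restroom"] == "passed area of robot 302"
assert with_history.statuses["find-restroom"] == [
    "passed area of robot 301", "passed area of robot 302"]
```

As the passage notes, which policy to use depends on the application scenario and the type of service task; the history variant is what enables the wrong-turn detection described below for the navigation example.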
  • Before step S202, the method of the embodiment of the present application further includes the following step: acquiring the task execution status corresponding to the service task from the synchronized task execution statuses.
  • step S202 specifically includes: determining the service operation of the robot to the user in its service area according to the service task, the service area of the robot and the task execution status corresponding to the service task.
  • Specifically, the service operation of the robot for the user in its service area can be determined according to the service task, the robot's service area, and the latest task execution status corresponding to the service task; or according to the service task, the robot's service area, and the previous task execution statuses corresponding to the service task.
  • each robot can learn the latest execution progress of the service task, so as to better provide services to users.
  • Take the case where the service task is a navigation service as an example.
  • The user 31 first enters the service area corresponding to the robot 301 and asks the robot 301: “How to get to the restroom?” The robot 301 determines the route to the nearest restroom according to its current location and outputs the feedback message to the user through voice broadcast: “Walk forward 30 meters and turn left to reach the nearest restroom.”
  • At the same time, the robot 301 generates a service task of “find the restroom” and the corresponding task execution status. The task execution status can be: the user 31 has passed the service area to which the robot 301 belongs. The service task “find the restroom”, the biological information, and the task execution status of the user 31 are synchronized to other robots (including the robot 302, the robot 303, and the robot 304 shown in FIG. 3).
  • After detecting the user 31, the robot 302 collects the biological information of the user 31, obtains the service task of the user 31 from the task information synchronized by the robot 301 according to the biological information, and determines that the service task is “find the restroom”. The robot 302 then determines the route to the nearest restroom according to its current location and, combined with the task execution status synchronized by the robot 301, determines that the content of the voice broadcast is: “Turn left and walk ten meters to reach the nearest restroom.”
  • The robot 302 then updates the task execution status of “find the restroom” to: the user 31 has passed the service area to which the robot 302 belongs, and synchronizes this task execution status to the other robots.
  • the robot 303 detects the user 31, collects the biological information of the user 31, obtains the service task of the user 31 from the synchronized task information of the user 31 according to the biological information of the user 31, and determines The service task of the user 31 is "find the restroom".
  • The robot 303 determines the route to the nearest restroom according to its current location and obtains the task execution status corresponding to the task information: the user 31 has passed the service area to which the robot 302 belongs. Combined with this task execution status, it can determine that the content of the voice broadcast is: “You have gone the wrong way; please turn around and turn right, go straight for ten meters, and you will reach the nearest restroom.” If the robot 302 had not synchronized the task execution status corresponding to the user 31 to the robot 301 and the robot 303, the robot 303 might not be able to determine from which direction the user 31 entered its service area, and could not generate more accurate guidance such as “You have gone the wrong way; please turn around and turn right.” Therefore, when leading the user, the robot can record the user's walking route according to the task execution status and, based on that route, output more accurate guidance.
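The wrong-turn detection just described can be sketched as a comparison between the robot's own area and the area the planned route expects next, given the areas the user has already passed (from the synchronized task execution status). The route, area names, and announcement texts below are illustrative assumptions.

```python
# Hypothetical planned route to the restroom (which lies in area-304).
PLANNED_ROUTE = ["area-301", "area-302", "area-304"]

def guidance(current_area: str, visited_areas: list) -> str:
    """Message a robot in `current_area` announces, given the synchronized
    list of areas the user has already passed."""
    n = len(visited_areas)
    expected = PLANNED_ROUTE[n] if n < len(PLANNED_ROUTE) else None
    if current_area == expected:
        return "Continue straight to reach the nearest restroom."
    return "You have gone the wrong way; please turn around."

# On route: robot 302 sees the user after the user passed area-301.
assert guidance("area-302", ["area-301"]) == \
    "Continue straight to reach the nearest restroom."
# Off route: the user strays into robot 303's area after area-302.
assert guidance("area-303", ["area-301", "area-302"]) == \
    "You have gone the wrong way; please turn around."
```

Without the synchronized `visited_areas` list, the off-route robot could not tell which direction the user came from, which is exactly the failure mode the passage describes.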
  • When the service task can be divided into subtasks in multiple phases, the task execution status of each subtask can be synchronized.
  • For example, the business H that the user needs to handle can be divided into sub-business H1 and sub-business H2, and the user needs to go to different floors of the office building to handle each sub-business.
  • After the user requests service on the first floor, the robot on the first floor synchronizes the task information to the robots on the other floors; the initial task execution status of the service task, namely that no sub-business has been handled yet, can also be synchronized. The robot then guides the user to the second floor to handle sub-business H1 in accordance with the business order.
  • After the robot on the second floor detects the user, it collects the user's biological information and finds the user's service task from the synchronized task information according to the biological information.
  • The robot on the second floor guides the user to the room for handling sub-business H1. After the user enters the room, a new task execution status is generated: sub-business H1 has been handled. The new task execution status is synchronized to the other robots, and the robot on the second floor then guides the user to the fourth floor to handle sub-business H2.
  • the robot on the fourth floor detects the user, collects the user’s biological information, and finds the user’s service task from the synchronized task information based on the user’s biological information.
  • the robot on the fourth floor guides the user to the room for sub-business H2, and after the user enters the room, a new task execution status is generated: sub-business H2 has been processed. The newly generated task execution status is synchronized to the other robots.
  • when the robot on the fourth floor next detects the user, it prompts the user that the business H has been completed. Suppose instead that the user goes to the third floor after completing sub-business H1.
  • after the robot on the third floor detects the user, it collects the user's biological information and, according to that information, finds from the synchronized task information that the user's service task is to handle business H and that the latest task execution status is: sub-business H1 has been processed. The robot on the third floor can therefore guide the user to the fourth floor to handle sub-business H2.
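The multi-floor relay above (matching a detected user to synchronized task information by biological information, then picking the next unfinished sub-business) can be sketched as follows; the data structures are illustrative assumptions, and for brevity a shared object stands in for the network synchronization between robots:

```python
import dataclasses


@dataclasses.dataclass
class TaskInfo:
    user_biometric_id: str      # stand-in for face/voiceprint features
    service_task: str           # e.g. "handle business H"
    sub_business_order: list    # e.g. ["H1", "H2"], the business order
    done: set = dataclasses.field(default_factory=set)


class FloorRobot:
    def __init__(self, floor):
        self.floor = floor
        self.tasks = {}          # synchronized task information

    def sync(self, info):
        # In a real deployment this copy would arrive over the network.
        self.tasks[info.user_biometric_id] = info

    def next_sub_business(self, biometric_id):
        """Find the user's task by biological information and return the
        first sub-business that has not yet been processed."""
        info = self.tasks.get(biometric_id)
        if info is None:
            return None
        for sub in info.sub_business_order:
            if sub not in info.done:
                return sub
        return "all done"
```

Whichever floor the user appears on, the local robot resolves the same task record and continues from the latest synchronized execution status.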
  • additional information may be combined to determine the service operation of the robot to the user in its service area.
  • the service operation of the robot to the user within its service area can be determined according to the service task, the robot's service area and additional information; or, it can be determined according to the service task, the robot's service area, the task execution status and additional information.
  • the additional information includes at least one of the following information: time information, weather information, and local property information.
  • the user's service task is "find store A". If the task "find store A" has not ended after a certain period of time, the robot can output a prompt to the user after detecting the user: "Do you still need to go to store A?" If the user replies "No need", it means that the service task "find store A" is over; if the user replies "Need", the robot continues to perform the service task "find store A".
  • the robot can output a prompt to the user after detecting the user: "It's not raining anymore, you don't need to find an umbrella." If the user replies "OK, I don't need an umbrella", it means that the service task "find an umbrella" is over; if the user replies "I want to buy an umbrella", the robot continues to perform the service task "find an umbrella".
  • the robot learns from the local property information that the restroom on this floor is under maintenance, and can output a prompt to the user: "The restroom on this floor is under maintenance, you can go to the restroom on the second floor".
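The three kinds of additional information above (time, weather, local property information) can be combined with the service task roughly as follows; the function name, dictionary keys and the 30-minute threshold are illustrative assumptions mirroring the examples in the description:

```python
def service_prompt(service_task, additional_info):
    """Pick a prompt for the user by combining the service task with
    additional information; the rules are illustrative, not exhaustive."""
    # Weather information: the umbrella task becomes unnecessary.
    if (service_task == "find umbrella"
            and additional_info.get("weather") == "clear"):
        return "It's not raining anymore, you don't need to find an umbrella."
    # Local property information: redirect around a closed facility.
    if (service_task == "find restroom"
            and additional_info.get("restroom_status") == "under maintenance"):
        return ("The restroom on this floor is under maintenance, "
                "you can go to the restroom on the second floor.")
    # Time information: re-confirm a long-running task with the user.
    if (service_task == "find store A"
            and additional_info.get("elapsed_minutes", 0) > 30):
        return "Do you still need to go to store A?"
    return None
```

Returning `None` means no additional-information rule applies and the robot proceeds with the ordinary service operation.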
  • the method of the embodiment of the present application further includes the following steps: after determining that the service task has ended, deleting all the information corresponding to the service task, and synchronizing the service task end message corresponding to the service task to the other robots.
  • all information corresponding to a service task includes all information related to the service task, such as task information, task execution status, and so on.
  • the end of the service task can be determined in any of the following ways:
  • the user’s service task is “find the restroom”.
  • the robots around the restroom detect that the user has reached the restroom, the robot will output a voice similar to “you have reached the restroom” indicating the completion of the service.
  • the robot deletes all the information corresponding to the service task "find the restroom”, and synchronizes the service task end message corresponding to the service task to other robots.
  • the preset duration can be set according to the actual application scenario. Assuming that the preset duration is 1 hour: if, one hour after the service task is generated, it is detected that the service task has not been completed, then the service task is determined to have ended.
  • the registered service task is "find shop A", but the user leaves the mall after registering the service task.
  • assuming the preset time is 1 hour, then 1 hour after the service task "find shop A" is generated, it is detected that the service task "find shop A" has not been completed, and it is determined that the service task "find shop A" has ended.
  • similarly, if the user no longer wants to go to shop A after registering the service task "find shop A", then 1 hour after the service task "find shop A" is generated it is detected that the task has not been completed, and the service task "find shop A" is determined to have ended.
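The end-of-task rules described in these examples can be sketched as below; the function signature, the use of plain timestamps and the 3600-second default are assumptions for illustration:

```python
def is_task_ended(created_at, now, completed, end_instruction=False,
                  preset_duration=3600):
    """A service task ends when (a) it has been completed, (b) an end-task
    instruction was received, or (c) the time since the task was generated
    exceeds the preset duration (1 hour here; configurable per scenario)."""
    if completed or end_instruction:
        return True
    # Timeout check: the server or a robot may run this periodically.
    return (now - created_at) > preset_duration
```

Whichever party detects the end (robot or server) would then delete all information for the task and synchronize the end message to the other robots.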
  • the server can also detect whether the user's service task has timed out. When it detects that the user's service task has timed out, the server deletes all the information corresponding to the service task stored on the server, and synchronizes the service task end message corresponding to the service task to the robots in all service areas.
  • the end task instruction may be directly input by the user through the robot.
  • the user can input the end task instruction by voice.
  • the service task registered by the user is "find shop A”.
  • after the robot detects the user and determines that the user's service task is "find shop A", it reminds the user of the route to shop A.
  • the robot can determine whether the voice input by the user is an instruction to end a task through semantic recognition.
  • semantic recognition is an existing technology and will not be described in detail here.
  • the user can also end the task through the mobile terminal he uses. If the user created the task information through the mobile terminal, the user can find the task information directly on the mobile terminal and click the end-task button corresponding to the task information to input the end task instruction. The mobile terminal sends the end task instruction for the task information to the server, and the server synchronizes it to all robots in the area; each robot, after receiving the task end message corresponding to the task information sent by the server, deletes the task information stored in the robot. If the user did not create the task information through the mobile terminal, the user can send a task information request to the server through the mobile terminal. On receiving the request, the server feeds back all the task information corresponding to the user to the mobile terminal, and the mobile terminal displays the received task information. The user inputs the end task instruction by clicking the end-task button corresponding to the task information, the mobile terminal sends the end task instruction to the server, and the server synchronizes it to all robots in the area. After a robot receives the task end message corresponding to the task information sent by the server, it deletes the task information stored in the robot.
  • an embodiment of the present application also provides a multi-robot collaborative service device 40, wherein each robot in the multi-robot corresponds to a service area.
  • the multi-robot cooperative service device 40 includes: an acquisition module 401, a processing module 402, and a control module 403.
  • the obtaining module 401 is used to obtain the service task of the user detected by the robot.
  • the processing module 402 is used to determine the service operation of the robot to the user in the service area of the robot according to the service task and the service area of the robot.
  • the control module 403 is used to control the robot to perform service operations in its service area.
  • the acquiring module 401 is specifically configured to generate the user's service task according to the user's input information detected by the robot.
  • the multi-robot collaborative service device 40 of the embodiment of the present application further includes a task creation module, which is used to: obtain the biological information of the user detected by the robot; synchronize the user's task information to other robots in the multi-robot, and the task information includes The user's biological information and service tasks.
  • the acquiring module 401 is specifically configured to: acquire the biological information of the user detected by the robot; acquire target task information matching the user’s biological information from the synchronized task information, the task information including the user’s biological information and Service task; obtain the user's service task from the target task information.
  • the multi-robot cooperative service device 40 of the embodiment of the present application further includes an execution status determining module, which is used to generate task execution status corresponding to the service task according to the service operation, and synchronize the generated task execution status to the multi-robot.
  • the multi-robot cooperative service device 40 of the embodiment of the present application further includes an execution status obtaining module, configured to obtain the task execution status corresponding to the service task from the synchronized task execution status.
  • the processing module 402 is specifically configured to determine the service operation of the robot to the user in its service area according to the service task, the service area of the robot, and the task execution status corresponding to the service task.
  • the processing module 402 is specifically configured to: determine the service operation of the robot to the user in its service area according to the service task, the service area of the robot, and additional information.
  • the additional information includes at least one of the following: time information, weather information and local property information.
  • the multi-robot cooperative service device 40 of the embodiment of the present application further includes a task deletion module, which is used to delete all the information corresponding to the service task after determining that the service task has ended, and to synchronize the service task end message corresponding to the service task to the other robots in the multi-robot.
  • the task deletion module is further used to delete all information corresponding to the service task after receiving the service task end message corresponding to the service task.
  • the multi-robot cooperative service device 40 of the embodiment of the present application further includes a task end judgment module, which is used to: determine that the service task has ended if it is judged that the service task has been completed; or, determine that the service task has ended if the length of time between the current moment and the moment the service task was generated exceeds the preset duration; or, determine that the service task has ended if the end task instruction corresponding to the service task is received.
  • the multi-robot cooperative service device provided in the embodiment of the present application adopts the same inventive concept as the above-mentioned multi-robot cooperative service method, and can achieve the same beneficial effects, which will not be repeated here.
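As a rough illustration of the module structure described above (acquisition module 401, processing module 402, control module 403), the device could be sketched like this; all names and the placeholder bodies are assumptions, since the embodiment specifies behaviour rather than code:

```python
class MultiRobotServiceDevice:
    """Sketch of device 40 with its three modules as methods."""

    def __init__(self, service_area):
        self.service_area = service_area

    def acquire_service_task(self, user_input):
        # acquisition module 401: generate the service task from the
        # user input detected by the robot
        return {"task": user_input}

    def determine_service_operation(self, service_task,
                                    execution_status=None,
                                    additional_info=None):
        # processing module 402: combine the task, the robot's service
        # area, and optionally the execution status and additional
        # information into a concrete service operation
        return {"operation": "guide",
                "task": service_task["task"],
                "area": self.service_area}

    def control(self, operation):
        # control module 403: perform the operation inside the area
        return "performing {} in {}".format(operation["operation"],
                                            operation["area"])
```

The optional `execution_status` and `additional_info` parameters reflect the two determination variants described in the embodiments.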
  • the robot control device 50 may include a processor 501, a memory 502, and a transceiver 503.
  • the transceiver 503 is used to receive and send data under the control of the processor 501.
  • the memory 502 may include a read only memory (ROM) and a random access memory (RAM), and provides the processor with program instructions and data stored in the memory.
  • the memory may be used to store the program of the multi-robot cooperative service method.
  • the processor 501 may be a CPU (Central Processing Unit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or a CPLD (Complex Programmable Logic Device). The processor implements the multi-robot cooperative service method in any of the foregoing embodiments by calling and executing the program instructions stored in the memory.
  • an embodiment of the present application also provides a robot, including a robot control device 50 as shown in FIG. 5.
  • the robot in the embodiment of the present application may also include, but is not limited to, at least one of the following equipment: a face recognition device, an iris recognition device, a voiceprint recognition device, a motion collection device, etc., to collect the user's biological information.
  • the robot can also be a mobile robot to guide the user.
  • an embodiment of the present application also provides a multi-robot collaborative service system, which includes multiple robots.
  • each robot in the multi-robot collaborative service system corresponds to a service area.
  • the robots communicate through the network to synchronize relevant information corresponding to the service task.
  • each robot moves in its own service area and provides services for users appearing in their respective service area.
  • the robot in the multi-robot collaborative service system may be the robot including the robot control device 50 in the embodiment of the present application.
  • each robot in the multi-robot collaborative service system can synchronize related information corresponding to the service task through the server.
  • the server is used to receive the relevant information corresponding to the service task sent by any robot, and synchronize the relevant information corresponding to the service task to other robots.
  • Each robot and the server communicate with each other through a network, and the network can be a local area network, a wide area network, etc.
  • each robot in the multi-robot collaborative service system can also synchronize the relevant information corresponding to the service task through the P2P (peer-to-peer) transmission mode, such as Bluetooth or MESH (wireless mesh).
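The P2P alternative can be sketched as a simple flooding scheme; the class name and the recursive broadcast are illustrative assumptions (real Bluetooth or MESH transports would differ):

```python
class PeerRobot:
    """P2P synchronization sketch: robots forward task-related information
    directly to their peers, with no central server involved."""

    def __init__(self, name):
        self.name = name
        self.peers = []
        self.task_info = {}

    def connect(self, other):
        self.peers.append(other)
        other.peers.append(self)

    def broadcast(self, task_id, info, seen=None):
        # Flood the information through the mesh, tracking visited robots
        # so that cycles in the peer graph do not cause infinite loops.
        seen = seen if seen is not None else set()
        seen.add(self.name)
        self.task_info[task_id] = info
        for peer in self.peers:
            if peer.name not in seen:
                peer.broadcast(task_id, info, seen)
```

Any robot that creates or updates task information calls `broadcast`, and every reachable robot ends up with the same copy, matching the server-mediated variant in effect.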
  • the embodiment of the present application provides a computer-readable storage medium for storing computer program instructions used for the above-mentioned electronic device, which includes a program for executing the above-mentioned multi-robot cooperative service method.
  • the above-mentioned computer storage medium may be any available medium or data storage device that the computer can access, including but not limited to magnetic storage (such as floppy disks, hard disks, magnetic tape, magneto-optical disks (MO), etc.), optical storage (such as CD, DVD, BD, HVD, etc.), and semiconductor memory (such as ROM, EPROM, EEPROM, non-volatile memory (NAND FLASH), solid state drives (SSD), etc.).

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Disclosed is a multi-robot collaborative service method, comprising the steps of: obtaining a service task of a user (10) detected by robots (11); determining, according to the service task and the service areas of the robots (11), service operations of the robots (11) for the user (10) within their service areas; and controlling the robots (11) to perform the service operations within their service areas. With this service method, services are provided to the user by robots in different service areas in a relay mode, and a single robot does not need to follow a user for the whole journey, which shortens the time a robot spends on a single task, improves the robot's service efficiency and improves the user experience. Also disclosed are a multi-robot collaborative service apparatus, a robot control device, a robot, a multi-robot collaborative service system and a computer-readable storage medium.
PCT/CN2020/073900 2019-01-25 2020-01-22 Procédé, appareil et système de service coopératif à robots multiples, et dispositif de commande WO2020151755A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910074518.2A CN109676611B (zh) 2019-01-25 2019-01-25 多机器人协同服务方法、装置、控制设备及系统
CN201910074518.2 2019-01-25

Publications (1)

Publication Number Publication Date
WO2020151755A1 true WO2020151755A1 (fr) 2020-07-30

Family

ID=66194732

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/073900 WO2020151755A1 (fr) 2019-01-25 2020-01-22 Procédé, appareil et système de service coopératif à robots multiples, et dispositif de commande

Country Status (3)

Country Link
CN (1) CN109676611B (fr)
TW (1) TWI732438B (fr)
WO (1) WO2020151755A1 (fr)


Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109676611B (zh) * 2019-01-25 2021-05-25 北京猎户星空科技有限公司 多机器人协同服务方法、装置、控制设备及系统
CN110297725B (zh) * 2019-06-28 2023-11-21 北京金山安全软件有限公司 一种智能终端故障上报方法和装置
CN112149937A (zh) * 2019-06-28 2020-12-29 百度在线网络技术(北京)有限公司 服务信息提供方法、装置、设备、服务器和介质
JP6900058B2 (ja) * 2019-07-30 2021-07-07 株式会社リビングロボット パーソナルアシスタント制御システム
DE112020004141T5 (de) * 2019-09-02 2022-07-14 Honda Motor Co., Ltd. Steuerungsplattform, steuersystem, dienstbereitstellungssystem, dienstbereitstellungsverfahren und steuerverfahren
CN110765288B (zh) * 2019-09-04 2022-09-27 北京旷视科技有限公司 一种图像信息同步方法、装置、系统及存储介质
CN110937480B (zh) * 2019-12-12 2022-01-04 广州赛特智能科技有限公司 一种机器人自主搭乘电梯的方法及系统
CN110879556A (zh) * 2019-12-13 2020-03-13 华南智能机器人创新研究院 一种多机器人局域网内的多机器人协同控制方法及装置
CN111618876A (zh) * 2020-06-11 2020-09-04 北京云迹科技有限公司 管理房间的方法、设备以及服务机器人
CN111950431B (zh) * 2020-08-07 2024-03-26 北京猎户星空科技有限公司 一种对象查找方法及装置
CN111941431B (zh) * 2020-09-04 2022-03-08 上海木木聚枞机器人科技有限公司 一种医院物流机器人自动跟随方法、系统及存储介质
CN112735128B (zh) * 2020-12-25 2022-04-12 广东嘉腾机器人自动化有限公司 Agv运输系统的交管控制方法
CN113093763B (zh) * 2021-04-13 2023-04-07 塔米智能科技(北京)有限公司 一种移动机器人调度系统和方法
CN114355877B (zh) * 2021-11-25 2023-11-03 烟台杰瑞石油服务集团股份有限公司 一种多机器人作业区域的分配方法和装置
CN114407044A (zh) * 2022-02-25 2022-04-29 合肥言尚智能科技有限公司 一种导引机器人及其导引方法
CN114932553B (zh) * 2022-06-06 2024-04-02 乐聚(深圳)机器人技术有限公司 机器人组队方法、机器人及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100145514A1 (en) * 2008-12-08 2010-06-10 Electronics And Telecommunications Research Institute Apparatus and method for controlling multi-robot linked in virtual space
CN103278151A (zh) * 2013-02-28 2013-09-04 中国矿业大学 一种动态烟羽环境下多机器人协作搜索气味源方法
CN105141899A (zh) * 2015-08-10 2015-12-09 北京科技大学 一种养老服务机器人的交互方法及系统
CN107186728A (zh) * 2017-06-15 2017-09-22 重庆柚瓣家科技有限公司 智能养老服务机器人控制系统
CN109676611A (zh) * 2019-01-25 2019-04-26 北京猎户星空科技有限公司 多机器人协同服务方法、装置、控制设备及系统

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017118001A1 (fr) * 2016-01-04 2017-07-13 杭州亚美利嘉科技有限公司 Procédé et dispositif de retour de robots à partir d'un site
JP6738555B2 (ja) * 2016-05-23 2020-08-12 富士ゼロックス株式会社 ロボット制御システム
CN106228302A (zh) * 2016-07-21 2016-12-14 上海仙知机器人科技有限公司 一种用于在目标区域内进行任务调度的方法与设备
CN107728609A (zh) * 2016-08-10 2018-02-23 鸿富锦精密电子(天津)有限公司 智能运动控制系统及智能运动控制方法
TWI631483B (zh) * 2016-09-30 2018-08-01 國立臺灣科技大學 機器人合作系統
CN106774345B (zh) * 2017-02-07 2020-10-30 上海仙软信息科技有限公司 一种进行多机器人协作的方法与设备
CN108858207A (zh) * 2018-09-06 2018-11-23 顺德职业技术学院 一种基于远程控制的多机器人协同目标搜索方法及系统


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114012741A (zh) * 2021-12-14 2022-02-08 北京云迹科技有限公司 基于程序的机器人的控制方法及装置
CN114012741B (zh) * 2021-12-14 2023-05-30 北京云迹科技股份有限公司 基于程序的机器人的控制方法及装置
CN114571449A (zh) * 2022-02-21 2022-06-03 北京声智科技有限公司 数据处理方法、装置,智能机器人及计算机介质
CN115951785A (zh) * 2023-03-10 2023-04-11 广东工业大学 一种用于咨询服务的人机交互方法及系统
CN115951785B (zh) * 2023-03-10 2023-05-12 广东工业大学 一种用于咨询服务的人机交互方法及系统
CN118003343A (zh) * 2024-04-10 2024-05-10 招商积余数字科技(深圳)有限公司 物业工程机器人的物业工程管控方法及系统
CN118003343B (zh) * 2024-04-10 2024-06-07 招商积余数字科技(深圳)有限公司 物业工程机器人的物业工程管控方法及系统

Also Published As

Publication number Publication date
TWI732438B (zh) 2021-07-01
CN109676611A (zh) 2019-04-26
CN109676611B (zh) 2021-05-25
TW202045324A (zh) 2020-12-16

Similar Documents

Publication Publication Date Title
WO2020151755A1 (fr) Procédé, appareil et système de service coopératif à robots multiples, et dispositif de commande
KR102518973B1 (ko) 모바일 장치 상태 관리 및 위치 결정
EP2880858B1 (fr) Utilisation d'un avatar dans un système de visioconférence
JP6158150B2 (ja) 屋内尤度ヒートマップ
CN110723609B (zh) 电梯控制方法、装置、系统、计算机设备和存储介质
JP6849813B2 (ja) ナビゲーションデータを生成し、対象物を搬送する方法及びシステム
JP2005324278A (ja) ロボット制御装置
WO2021027964A1 (fr) Procédé et dispositif de commande de robot, support d'informations et dispositif électronique
TW201202656A (en) Scalable routing for mobile station navigation with location context identifier
CN105371848A (zh) 一种室内导航方法及用户终端
WO2022205357A1 (fr) Procédé de commande de conduite autonome, dispositif électronique, terminal mobile et véhicule
US10397750B2 (en) Method, controller, telepresence robot, and storage medium for controlling communications between first communication device and second communication devices
CN105424045A (zh) 一种室内跨层寻路的路网构建和寻路方法
CN113160607B (zh) 停车位导航方法、装置、电子设备、存储介质及产品
US12019440B2 (en) Control system for receiving an elevator call in conjunction with a request for an autonomous vehicle
US20230213941A1 (en) Telepresence robots having cognitive navigation capability
JP2006259963A (ja) 移動ロボットの経路生成装置
CN115687553A (zh) 咨询指路方法、装置、电子设备和计算机可读介质
CN115783915A (zh) 一种楼宇设备的控制方法、系统、设备及存储介质
EP3907679B1 (fr) Navigation et séquençage améliorés d'une flotte de robots
CN107194484A (zh) 一种预约排队的规划方法和系统
US20200312145A1 (en) Two-way personalized communications for individuals with impairment
CN106662450B (zh) 路线上的设施分配
JP2010120129A (ja) ロボット連携システム、ロボット連携方法及びプログラム
CN113220459A (zh) 一种任务处理方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20744689

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 17.11.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 20744689

Country of ref document: EP

Kind code of ref document: A1