WO2018174289A1 - Conversation control system, and robot control system


Info

Publication number
WO2018174289A1
Authority
WO
WIPO (PCT)
Prior art keywords
customer
dialogue
information
scenario
unit
Prior art date
Application number
PCT/JP2018/011914
Other languages
French (fr)
Japanese (ja)
Inventor
雄介 柴田
智彦 大内
麻莉子 矢作
浩平 小川
雄一郎 吉川
石黒 浩
Original Assignee
株式会社 ゼンショーホールディングス (Zensho Holdings Co., Ltd.)
国立大学法人大阪大学 (Osaka University)
Priority date
Filing date
Publication date
Application filed by 株式会社 ゼンショーホールディングス (Zensho Holdings Co., Ltd.) and 国立大学法人大阪大学 (Osaka University)
Publication of WO2018174289A1

Definitions

  • the present invention relates to a dialog control system and a robot control system.
  • if the dialogue is not suited to the customer's current state, for example before ordering, while waiting for food, or while eating, the conversation may become disjointed or make the customer uncomfortable, and may even reduce customer satisfaction.
  • the problem to be solved by the present invention is to provide a dialog control device and a robot control system capable of performing an appropriate dialog with a customer.
  • the dialogue control device according to one embodiment includes: a stay state determination unit that determines the stay state, within the restaurant, of a customer who has entered the restaurant; an information acquisition unit that acquires dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit; and a dialogue control unit that carries out a dialogue with the customer based on the dialogue scenario information acquired by the information acquisition unit.
  • the stay state determination unit may determine the stay state of the customer in the restaurant based on information from the store system.
  • the dialogue control device may further include a receiving unit that receives, from the store system, a signal indicating that the customer has ordered food or drink; when the receiving unit receives this signal, the stay state determination unit may determine that the customer is waiting for the ordered food or drink.
  • the stay state of the customer determined by the stay state determination unit may be a state that changes over time.
  • the stay state of the customer determined by the stay state determination unit may include at least one of: a state from when the customer enters the restaurant or is seated until the customer orders food and drink; a state in which the customer is waiting for the ordered food and drink to be served; a state until the customer finishes eating and drinking; and a state from when the customer finishes eating and drinking until the customer leaves the restaurant.
  • the dialogue control device may include a recommended menu providing unit that presents a recommended menu of food and drink to the customer when the customer has not yet ordered; in the dialogue with the customer about menu selection, the information acquisition unit may acquire different dialogue scenario information depending on whether the customer selects the recommended menu or selects another menu instead.
  • the dialogue control device may include a scenario storage unit that stores a plurality of different pieces of dialogue scenario information, and the information acquisition unit may acquire, from the scenario storage unit, dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit.
  • the information acquisition unit may acquire dialogue scenario information according to the stay state of the customer determined by the stay state determination unit via a communication network.
  • a robot control system according to one embodiment includes a robot that operates according to an instruction signal, and an operation terminal that generates the instruction signal and transmits it to the robot. The operation terminal has: a stay state determination unit that determines, based on information from a store system, the stay state within the restaurant of a customer who has entered the restaurant; an information acquisition unit that acquires dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit; and a dialogue control unit that carries out a dialogue with the customer based on the dialogue scenario information acquired by the information acquisition unit.
  • the operation terminal may include a receiving unit that receives the customer's order information from the store system; when the receiving unit receives the order information, the stay state determination unit may determine that the customer is waiting for the ordered food or drink.
  • in addition to the order information, the store system may wirelessly transmit stay information including at least one of: first information indicating that the customer has entered the store; second information indicating that the customer has been seated; third information indicating that the ordered food and drink have been served to the customer; and fourth information indicating that the customer has finished the served food and drink. The receiving unit of the operation terminal receives the order information and the stay information transmitted by the store system, and the stay state determination unit may determine the stay state of the customer based on the received order information and stay information.
  • FIG. 1 is a block diagram showing a robot control system 1 according to an embodiment of the present invention.
  • the robot control system 1 includes a robot 2, an operation terminal 3 that is an example of a robot control device, a handy terminal 4, and a store system 5 such as a POS (Point of Sales) system.
  • the robot control system 1 is, for example, a system for a restaurant customer (hereinafter referred to as a user) to interact with the robot 2 via the operation terminal 3 that operates the robot 2.
  • the robot 2 in FIG. 1 is a machine having a human-like appearance and a dialogue function, that is, a humanoid.
  • the robot 2 may have a dissimilar appearance to humans such as animals and characters. Further, the robot 2 may be a virtual robot based on an image displayed on the display unit 35.
  • the robot 2 includes a robot drive unit 21 and a robot control unit 22 that is an example of a drive control unit.
  • the robot 2 may be driven by electric power supplied from a commercial power source or may be driven by a battery.
  • the robot drive unit 21 includes a voice output device that outputs the speech voice of the robot 2. By driving the robot drive unit 21 as necessary, the robot 2 can speak to perform a dialogue with the user.
  • a drive control signal for controlling the drive of the robot drive unit 21 is input from the robot control unit 22 to the robot drive unit 21.
  • the robot drive unit 21 is driven according to the drive control signal.
  • the robot drive unit 21 may include an actuator that drives a portion of the robot 2 having a degree of freedom, a lighting device that lights the eyeball unit of the robot 2, and the like.
  • the robot control unit 22 receives a robot control command, which is an example of a command signal, from the operation terminal 3.
  • the robot control unit 22 generates the above-described drive control signal based on the received robot control command, and outputs the generated drive control signal to the robot drive unit 21. That is, the robot 2 can operate according to the robot control command.
  • the command for causing the robot 2 to speak includes data indicating the content of the utterance (scenario data described later).
  • the robot control unit 22 includes, for example, at least one memory storing software such as an application program and an operating system that operates the application program, and a CPU (Central Processing Unit) that executes the software stored in the memory.
  • the drive control signal may be generated by the CPU executing software stored in the memory.
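  • As a rough illustration of the command handling just described, the following Python sketch shows how a robot control unit might convert a received robot control command into drive control signals. The command format, the driver interfaces, and all class and method names are assumptions made for illustration; the patent does not specify a concrete protocol.

```python
# Hypothetical sketch of the robot-side command handling described above.
from dataclasses import dataclass


@dataclass
class RobotControlCommand:
    kind: str      # e.g. "speak", "turn", "light_eyes"
    payload: dict  # e.g. {"text": "Hello"} or {"yaw_deg": 30}


class RobotControlUnit:
    """Converts received robot control commands into drive control signals."""

    def __init__(self, speech_driver, actuator_driver, light_driver):
        self.speech_driver = speech_driver      # voice output device
        self.actuator_driver = actuator_driver  # joints with degrees of freedom
        self.light_driver = light_driver        # eye lighting device

    def handle_command(self, cmd: RobotControlCommand) -> None:
        # Generate a drive control signal appropriate to the command type
        # and pass it to the corresponding part of the robot drive unit.
        if cmd.kind == "speak":
            self.speech_driver.say(cmd.payload["text"])
        elif cmd.kind == "turn":
            self.actuator_driver.set_yaw(cmd.payload["yaw_deg"])
        elif cmd.kind == "light_eyes":
            self.light_driver.set_on(cmd.payload.get("on", True))
        else:
            raise ValueError(f"unknown command kind: {cmd.kind}")
```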
  • the operation terminal 3 is carried by the user and is, for example, a tablet terminal having a touch function.
  • the operation terminal 3 may be a smartphone, a desktop display type terminal, or the like.
  • the operation terminal 3 includes an orientation sensor 31 that is an example of a sensor, and an operation generation unit 32 that is an example of a command signal generation unit and a transmission unit.
  • the operation terminal 3 is driven by electric power supplied from the built-in battery.
  • the orientation sensor 31 outputs an orientation detection signal indicating the detected orientation of the operation terminal 3 to the motion generation unit 32.
  • based on the orientation detection signal from the orientation sensor 31, the motion generation unit 32 generates a robot control command for controlling the motion of the robot 2, for example controlling its posture so that it faces the direction in which the operation terminal 3 is located.
  • the motion generation unit 32 transmits the generated robot control command to the robot control unit 22 via wireless communication such as Wi-Fi, for example.
  • the robot control unit 22 receives the robot control command from the motion generation unit 32 and outputs a drive control signal corresponding to the received command to the robot drive unit 21, thereby controlling the operation of the robot 2.
  • the operation of the robot 2 includes an interactive operation in which a conversation with the user is performed using voice.
  • the operation terminal 3 includes, in addition to the motion generation unit 32 described above, a scenario DB (database) 33, a scenario control unit 34, a display unit 35, an input unit 36 such as a touch panel, and an audio output unit 37.
  • the scenario DB 33 stores a plurality of pieces of dialogue scenario information for carrying out a dialogue between the user and the robot 2.
  • the dialogue scenario information includes a plurality of dialogue scenarios.
  • the dialogue scenario is a story of dialogue exchanged between the user and the robot 2.
  • a customer who enters a restaurant passes through a plurality of stay states between entering and leaving the store. Accordingly, a separate dialogue scenario is stored in the scenario DB 33 for each of these stay states. Further, when the customer selects a specific option from among a plurality of options while the dialogue scenario for a certain stay state is in progress, the dialogue may switch to another dialogue scenario.
  • Each dialogue scenario is composed of robot-side scenario data that the robot 2 uses for dialogue (ie, utterance) and user-side scenario data that the user uses for dialogue (ie, selection on the operation terminal 3).
  • the user-side scenario data is data that collects user-side options and the like for the utterances of the robot 2 in each dialogue scenario.
  • the user-side scenario data is used for displaying options and the like on the display unit 35 of the operation terminal 3 in accordance with the progress of the conversation scenario. The user can select a specific option from the displayed options using the input unit 36.
  • the dialogue scenario data has a tree structure in which the robot side scenario data and the user side scenario data are alternately coupled as nodes.
  • a predetermined series of nodes from the highest level to the lowest level may be managed as a basic scenario used for typical dialogue, for example, and the other node groups may be managed as correction scenarios that modify the basic scenario.
  • dialogue scenario data for restaurants may be divided into a plurality of scenario groups according to status data (stay information) indicating a user's stay status at the restaurant.
  • status data in the example of FIG. 1 is information indicating that the user has performed the following actions: sitting, ordering, food and drink provision, meal termination, and accounting.
  • the dialogue scenario data is provided from the scenario group classified by these status data.
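  • As a rough illustration of the scenario structure described above, the following Python sketch models a tree whose nodes alternate between robot-side utterances and user-side options, with scenario groups keyed by the customer's stay status. All class names, field names, and status keys are hypothetical.

```python
# Minimal sketch, assuming a simple in-memory representation of the scenario DB.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class UserOptionNode:
    label: str                                      # text shown on the operation terminal
    next_robot_node: Optional["RobotUtteranceNode"] = None


@dataclass
class RobotUtteranceNode:
    utterance: str                                  # robot-side scenario data (what the robot says)
    options: List[UserOptionNode] = field(default_factory=list)  # user-side scenario data


# A scenario group bundles the dialogue scenarios suitable for one stay status.
ScenarioGroup = List[RobotUtteranceNode]

scenario_db: Dict[str, ScenarioGroup] = {
    "before_order": [
        RobotUtteranceNode(
            "What will you order?",
            [UserOptionNode("View the menu"),
             UserOptionNode("Tell me a recommended menu!")],
        )
    ],
    "waiting_for_food": [
        RobotUtteranceNode("Let's talk until the food comes!")
    ],
}
```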
  • the scenario control unit 34 controls the scenario based on the data output from the store system 5.
  • FIG. 2 is a block diagram of the scenario control unit 34.
  • the scenario control unit 34 includes a reception unit 341, a stay state determination unit 342, an information acquisition unit 343, and a dialogue control unit 345.
  • the scenario control unit 34 may include a recommended menu providing unit 344.
  • the recommended menu providing unit 344 is not an essential configuration.
  • the receiving unit 341 receives information including status data output from the store system 5.
  • the status data can be input, for example, at the handy terminal 4 for ordering carried by the store staff of the restaurant.
  • the status data is transmitted from the handy terminal 4 to the store system 5 and then transmitted from the store system 5 to the receiving unit 341 of the scenario control unit 34. Further, the status data may be transmitted from the handy terminal 4 directly to the operation terminal 3 via the receiving unit 341 without passing through the store system 5.
  • examples of the status data include, in addition to order information indicating that the user has ordered food and drink, a first state indicating that the user has entered the store, a second state indicating that the user has been seated, a third state indicating that the ordered food and drink have been served to the user, and a fourth state indicating that the user has finished eating and drinking.
  • the status data is not limited to these; it may also include a leaving state indicating that the user has left the seat, a conversation state in which users are talking with each other, or a state in which the user has ordered dessert after the meal.
  • the status data is information indicating the staying state of the user that changes with the passage of time.
  • the stay state determination unit 342 determines the stay state of the user based on the status data among the information received by the reception unit 341.
  • when the stay state determination unit 342 receives, via the receiving unit 341, a second-state signal indicating that the user has been seated, for example, it determines that the user is in the stay state of being seated but not yet having ordered.
  • similarly, when the order information is received, the stay state determination unit 342 determines that the user is in the stay state of waiting for the ordered food and drink. Like the status data described above, the stay state is information indicating the user's state that changes over time.
  • the relationship between the status data and the staying state may be a configuration other than those described above.
  • the store staff may input the stay state itself as the status data. That is, instead of inputting the user's action as a trigger, the staff may input the state directly. Specifically, rather than entering the act of seating or ordering, the staff may directly enter the state of being seated, or of having ordered and waiting for food and drink.
  • in that case, the stay state determination unit 342 treats the received status data itself as the stay state. This difference in input is basically a matter of design; a rough sketch of both styles follows below.
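  • The following Python sketch illustrates the determination just described: status data may be either an event (for example "ordered") or a stay state entered directly by the staff, and both are mapped to a stay state. The event and state names are illustrative only, not taken from the patent.

```python
# Hypothetical mapping from status data to stay states.
EVENT_TO_STATE = {
    "entered_store": "before_order",    # first information
    "seated": "before_order",           # second information
    "ordered": "waiting_for_food",      # order information
    "food_served": "eating",            # third information
    "finished_eating": "after_meal",    # fourth information
}

KNOWN_STATES = {"before_order", "waiting_for_food", "eating", "after_meal"}


def determine_stay_state(status_data: str, current_state: str) -> str:
    """Return the new stay state given a piece of status data."""
    if status_data in KNOWN_STATES:
        # The staff entered the stay state itself rather than an event.
        return status_data
    # Otherwise treat the status data as an event that triggers a transition.
    return EVENT_TO_STATE.get(status_data, current_state)


print(determine_stay_state("ordered", "before_order"))  # -> waiting_for_food
print(determine_stay_state("eating", "before_order"))   # -> eating (state entered directly)
```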
  • the information acquisition unit 343 acquires, from the scenario DB 33, dialogue scenario information corresponding to the stay state determined by the stay state determination unit 342.
  • the dialogue scenario information refers to information on a scenario group that changes depending on the stay state, or individual scenario data included in the scenario group. There may be no scenario group, and the dialogue scenario information may be composed of data of a plurality of dialogue scenarios.
  • the scenario group is a bundle of scenario data suitable for the staying state of the user, and includes various scenario data according to the staying state.
  • the scenario group includes, for example, scenario data that presents a menu appropriate to the time of day, scenario data in which the content and manner of the conversation change depending on the number of adults and children, and other scenario data corresponding to various situations.
  • a plurality of dialogue scenario data that can be selected according to the operation of the operation terminal 3 may be provided even in the same staying state.
  • the information acquisition unit 343 may directly acquire the scenario data to be used for the dialogue according to the user's stay state, or it may first acquire the scenario group and then select and output the scenario data to be used for the dialogue from that group.
  • either the information acquisition unit 343 or the dialogue control unit 345 may select the scenario data used for the dialogue from the scenario group, as in the sketch below.
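  • A minimal sketch of this acquisition step, building on the earlier data-structure sketch: the scenario group matching the stay state is looked up, and one scenario is chosen from it. The selection policy (preferring a scenario that mentions a recommendation when a recommended menu exists, otherwise taking the first scenario) is an assumption for illustration only.

```python
from typing import Optional


def acquire_scenario(scenario_db: dict, stay_state: str,
                     recommended_menu: Optional[str] = None):
    """Return scenario data for the given stay state from the scenario DB."""
    group = scenario_db.get(stay_state, [])
    if not group:
        return None
    if recommended_menu is not None:
        # If there is a recommended menu, prefer a scenario that recommends it.
        for scenario in group:
            if "recommend" in scenario.utterance.lower():
                return scenario
    # Otherwise fall back to the first scenario in the group.
    return group[0]
```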
  • the recommended menu providing unit 344 transmits recommended menu information related to the recommended menu to the information acquisition unit 343 when there is a recommended menu in the menu.
  • the information acquisition unit 343 acquires dialogue scenario information related to the recommended menu from the scenario DB 33.
  • the recommended menu providing unit 344 may output the recommended menu based on the dialogue scenario information regarding the recommended menu acquired by the information acquisition unit 343 when the user is in the stay state of being seated but not yet having ordered.
  • the information acquisition unit 343 checks with the recommended menu providing unit 344 for the presence of a recommended menu when acquiring the dialogue scenario information. If there is a recommended menu, scenario data recommending that menu is acquired from the scenario DB 33 and output to the dialogue control unit 345. When there are a plurality of recommended menus, the plurality of recommended menus may be sent to the information acquisition unit 343, and information indicating that other recommended menus exist in addition to the first one may also be sent to the information acquisition unit 343.
  • the dialogue control unit 345 outputs information including scenario data used for the dialogue acquired according to the staying status of the user to the action generation unit 32. Further, the dialogue control unit 345 may output data used for the dialogue to the display unit 35 of the operation terminal 3 if necessary.
  • the data used for the dialogue is basically information on the user side scenario data used for responding to the robot side scenario data.
  • the motion generation unit 32, having received the scenario data used for the dialogue, converts the dialogue data into a voice signal and transmits it to the robot control unit 22, so that the robot 2 produces speech. At this time, a signal for opening and closing or lighting the robot 2's mouth may also be sent to the robot control unit 22. Alternatively, instead of transmitting the voice data from the operation terminal 3 to the robot control unit 22 each time, the voice data may be stored in the robot 2 in advance, and an identification signal associated with each piece of voice data may be transmitted to make the robot 2 speak.
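  • A hedged sketch of the two transmission styles just mentioned: either raw synthesized audio is sent to the robot, or only an utterance identifier for audio already stored on the robot. The message format, field names, and the `synthesize` callback are invented for illustration.

```python
# Hypothetical message construction on the operation terminal side.
import json


def make_speak_message(text: str, stored_utterance_ids: dict, synthesize) -> bytes:
    """Build the message the operation terminal would send to the robot control unit."""
    if text in stored_utterance_ids:
        # The audio is pre-stored on the robot: send only its identifier.
        msg = {"type": "play_stored", "utterance_id": stored_utterance_ids[text]}
    else:
        # Otherwise synthesize the audio on the terminal and send the raw data.
        msg = {"type": "play_raw", "audio_b64": synthesize(text)}
    return json.dumps(msg).encode("utf-8")
```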
  • the voice output from the robot 2 may be an utterance that includes a meaningful word, or may be an artificial sound that does not include a word.
  • the data output by the dialogue control unit 345 is not limited to the user-side scenario data; when the user selects a menu, an image or video of the food or drink may be output to the display unit 35. That is, not only words and text used for the dialogue, but also data usable for display, such as images and videos, may be output to the display unit 35 as appropriate.
  • output is not limited to the display; sound may be output from the operation terminal 3, or the operation terminal 3 may be vibrated as necessary, in which case sound data, vibration data, and the like are output. Further, when the robot 2 speaks, the spoken content may also be output to the display unit 35 as text. The user, in turn, makes a selection at the input unit 36 from the options displayed on the display unit 35 based on the user-side scenario data.
  • the dialogue control unit 345 outputs the robot side scenario data for responding to the data selected by the user to the display unit 35 and the motion generation unit 32.
  • the user's selection may be output as audio information from the operation terminal 3 via the audio output unit 37.
  • based on a user input from the input unit 36 indicating that another recommended menu should be presented, the dialogue control unit 345 may output a signal to the information acquisition unit 343 requesting that a different recommended menu be presented.
  • alternatively, a new recommended menu may be notified to the information acquisition unit 343 via the recommended menu providing unit 344.
  • upon receiving the instruction for the new recommended menu, the information acquisition unit 343 may transmit to the dialogue control unit 345 a dialogue scenario related to the new recommended menu that was acquired in advance from the scenario DB 33 or the recommended menu providing unit 344.
  • alternatively, the information acquisition unit 343 may acquire the dialogue scenario related to the new recommended menu from the scenario DB 33 or the recommended menu providing unit 344 at the time it receives the notification that a new recommended menu is to be output.
  • the recommended menu can be set not only by the store or by the store's chain; it may also be, for example, a menu item with high sales or a popular menu item.
  • the input unit 36 is connected to the dialogue control unit 345, but the configuration is not limited to this; the input unit 36 may also be connected to the recommended menu providing unit 344.
  • in that case, the recommended menu providing unit 344 outputs a signal regarding the other recommended menus to the information acquisition unit 343 based on the input, without going through the dialogue control unit 345.
  • the scenario control unit 34 acquires appropriate data from the various scenario data stored in the scenario DB 33 based on the user's stay state input to the handy terminal 4 by the store staff, and outputs it to the motion generation unit 32 and the display unit 35. Further, in accordance with a dialogue scenario selection signal input from the input unit 36, the scenario control unit 34 reads from the scenario DB 33 new robot-side scenario data that responds to the dialogue scenario selected by that signal.
  • the scenario control unit 34 outputs the read new robot side scenario data to the motion generation unit 32 for speech. By repeating this process, it is possible to perform a dialogue between the user and the robot 2.
  • the dialogue between the user and the robot 2 can be performed using the dialogue scenario data.
  • FIG. 3 is a diagram schematically showing the processing from the operation of the handy terminal 4 by the store staff to the selection of the scenario data together with the time series.
  • solid arrows indicate data and signal input / output, and broken arrows indicate the passage of time.
  • the process up to the scenario data selection is executed by the store system 5, the scenario DB 33, and the scenario control unit 34.
  • first information indicating that the user has entered the store, or second information indicating that the user has been seated, is input to the handy terminal 4 as the stay information A1 by the store staff's operation.
  • based on this, the stay state determination unit 342 determines that the user's current stay state is stay state P1, the state from entering the restaurant or being seated until placing an order.
  • the information acquisition unit 343 acquires scenario data belonging to scenario group 1, which corresponds to this stay state P1, from the scenario DB 33, and a dialogue with the user is carried out.
  • the handy terminal 4 may transmit, as a key, the number of the table at which the user is seated, for example. This makes it possible to associate the status data entered on the handy terminal 4 with the user.
  • when the store staff transmits the user's order to the store system 5 via the handy terminal 4, order information indicating that the order has been placed is transmitted as the stay information A2.
  • the stay state determination unit 342 then determines that the user has shifted from the pre-order stay state P1 to stay state P2, in which the order has been placed and the user is waiting for the ordered food and drink to be served.
  • the information acquisition unit 343 acquires scenario data belonging to scenario group 2, which corresponds to this stay state P2, from the scenario DB 33, and a dialogue with the user is carried out.
  • when the ordered food and drink are served, the state shifts to stay state P3, the state in which the user is eating and drinking until the meal is finished, and a dialogue is carried out using scenario data belonging to scenario group 3.
  • when the user finishes eating and drinking, the state transitions to stay state P4, the state from completion of the meal until the user leaves the store, and a dialogue is carried out using scenario data belonging to scenario group 4.
  • in this way, the stay state changes over time, and a dialogue using the corresponding scenario data is performed, as sketched below.
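  • An illustrative sketch of the progression of FIG. 3: stay information A1 to A4 from the handy terminal moves the customer through stay states P1 to P4, each tied to a scenario group. The dictionary keys are shorthand invented here, not identifiers from the patent.

```python
# Hypothetical mapping from stay information to stay state and scenario group.
STAY_INFO_TO_STATE = {
    "A1_entered_or_seated": ("P1", "scenario_group_1"),  # before ordering
    "A2_ordered":           ("P2", "scenario_group_2"),  # waiting for food
    "A3_food_served":       ("P3", "scenario_group_3"),  # eating and drinking
    "A4_finished_eating":   ("P4", "scenario_group_4"),  # until leaving the store
}


def on_stay_info(stay_info: str) -> None:
    state, group = STAY_INFO_TO_STATE[stay_info]
    print(f"stay state -> {state}, dialogue uses {group}")


for info in ("A1_entered_or_seated", "A2_ordered",
             "A3_food_served", "A4_finished_eating"):
    on_stay_info(info)
```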
  • like the stay information described above, the stay state is not limited to the states mentioned above.
  • for example, when dessert is ordered, the stay states may include a state of waiting for the dessert to be served and, after the ordered dessert is served, a state in which the dessert is being eaten.
  • the scenario group is not limited to the above, and may include a scenario group according to the dessert standby state, a scenario group according to the dessert eating and drinking state, and the like.
  • alternatively, in such a case, the stay state may simply return to stay state P2.
  • the input of status data is not limited to that input via the handy terminal 4.
  • status data indicating that the checkout has been made may be transmitted to the operation terminal 3 from the POS terminal (store system 5) at the time of checkout.
  • upon receiving the status data indicating that checkout has been completed, the operation terminal 3 may return the stay state to its initial state.
  • FIG. 4 is a flowchart showing the flow of scenario data acquisition processing and dialogue processing in the operation terminal 3 according to the present embodiment. An example of operation will be described with reference to FIGS.
  • as the initial stay state, a null value may be entered, the state may be left unset, or a stay state of not yet entered or not yet seated may be set. In the following description, it is assumed that a stay state of not entered or not seated is set as the initial state.
  • the stay state determination unit 342 acquires status data (stay information) from the store system 5 via the reception unit 341 (step S10).
  • for example, the stay state determination unit 342 receives the status data A1 of FIG. 3, indicating that the user has entered the store.
  • the stay state determination unit 342 determines the stay state of the user based on the received status data (step S11). For example, when the status data A1 is received, the stay state determination unit 342 determines that the stay state is the stay state P1 that is the state before the user places an order. The stay state determination unit 342 outputs the obtained stay state P1 to the information acquisition unit 343.
  • the information acquisition unit 343 determines whether or not the user's stay state has changed (step S12). For example, when the state changes from the not-entered or not-seated state to stay state P1, the information acquisition unit 343 determines that the state has changed to a new stay state (step S12: YES), and the process moves to the scenario acquisition step.
  • the information acquisition unit 343 that has determined that the stay state has been changed selects a scenario group (step S13). For example, in the stay state P1, the information acquisition unit 343 selects the scenario group 1 as a group for acquiring a scenario.
  • the information acquisition unit 343 selects scenario data to be used for the dialogue from the scenarios belonging to scenario group 1 (step S14). For example, when the state transitions from the not-seated state to stay state P1, the user is likely to have just arrived or just been guided to the seat, so a greeting scenario suited to the time of day, such as "Good morning" or "Hello", and a menu selection scenario may be selected.
  • the recommended menu information may be provided from the recommended menu providing unit 344 at the timing of selecting this scenario.
  • the information acquisition unit 343 selects a scenario when there is a recommended menu, and outputs a scenario based on the recommended menu provided from the recommended menu providing unit 344 to the dialogue control unit 345.
  • the dialogue control unit 345 then executes the dialogue process with the user via the robot 2 (step S15). If the stay state has not changed (step S12: NO), this process is performed without executing steps S13 and S14.
  • FIG. 5 is a diagram illustrating an example of an interaction between the robot 2 and the user, and is a diagram illustrating an example when the recommended menu providing unit 344 provides recommended menu information.
  • the dialogue is performed using the robot 2 and the operation terminal 3.
  • the operation terminal 3 is, for example, a tablet-type portable terminal that includes a display unit 35 and a touch-panel input unit 36 integrated into the display unit 35. The robot 2 utters, for example, "What will you order?" based on the robot-side scenario data selected by the information acquisition unit 343.
  • on the display unit 35 of the operation terminal 3, user-side scenario data including the options "view menu" and "tell me a recommended menu!" is displayed as possible answers to the above robot-side scenario data.
  • by selecting one of the displayed options, the user carries out a dialogue with the robot 2.
  • the selected content may be uttered by automatic voice. Thereby, the feeling of interaction with the robot 2 can be further enhanced.
  • the dialogue control unit 345 determines whether or not the scenario used for the subsequent dialogue should be changed based on the user's answer (step S16). For example, in FIG. 5, assume that a scenario recommending a recommended menu has been selected as the scenario data. If the user then selects the option "view menu", the scenario needs to be changed. In such a case, the dialogue control unit 345 determines that the scenario needs to be changed (step S16: YES), the process returns to step S14, and the information acquisition unit 343 is notified so that it selects, from scenario group 1, the scenario that shows the menu.
  • if the scenario does not need to be changed (step S16: NO), the dialogue between the user and the robot 2 is continued.
  • next, the stay state determination unit 342 acquires the user's stay information (step S17). This step exists because the store staff may input the user's stay information from the handy terminal 4 during the dialogue, and the system needs to handle that situation.
  • the operation terminal 3 may be configured so that the stay information can be acquired during the dialogue in this way.
  • the dialogue control unit 345 then determines whether or not to end the dialogue with the robot 2 (step S18).
  • if it is determined that the dialogue should end (step S18: YES), the dialogue control unit 345 ends the dialogue.
  • at that time, the robot 2 may utter, for example, "Let's talk again".
  • if the dialogue is not to end (step S18: NO), the process returns to step S11, and the dialogue is continued according to the procedure described above.
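  • A condensed, hypothetical rendering of the flow of FIG. 4 (steps S10 to S18) in Python follows. The helper callables stand in for the units described above and would be supplied by an actual implementation; this is a sketch of the control flow, not the patent's own code.

```python
def dialogue_loop(receive_status, determine_state, select_group, select_scenario,
                  run_dialogue, answer_changes_scenario, should_end):
    stay_state = "not_seated"                 # assumed initial state
    scenario = None
    status = receive_status()                 # S10: status data from the store system
    while True:
        new_state = determine_state(status, stay_state)        # S11: determine stay state
        if new_state != stay_state:                            # S12: did the stay state change?
            stay_state = new_state
            group = select_group(stay_state)                   # S13: select scenario group
            scenario = select_scenario(group)                  # S14: select scenario data
        answer = run_dialogue(scenario)                        # S15: dialogue via the robot
        while answer_changes_scenario(answer):                 # S16: scenario change requested?
            scenario = select_scenario(select_group(stay_state))  # back to S14
            answer = run_dialogue(scenario)                    # S15 again
        status = receive_status()                              # S17: stay info during dialogue
        if should_end(answer):                                 # S18: end the dialogue?
            break                                              # otherwise loop back to S11
```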
  • FIG. 6 is a diagram illustrating a display example of the display unit 35 when “Tell me a recommended menu!” In FIG. 5 is selected by the user.
  • the text "Italian cheese hamburger" provided by the recommended menu providing unit 344 is displayed as today's recommended menu, together with an image or video.
  • other recommended menus may be displayed together.
  • the robot 2 may speak an explanation of the Italian cheese hamburger, such as the contents of the menu item and the reason for recommending it.
  • when the user selects the menu to order the Italian cheese hamburger, that is, when "select this menu" is chosen, the scenario is not changed (step S16: NO), and the dialogue continues.
  • FIG. 7 is a diagram showing an example of dialogue when the recommended menu is selected.
  • in response to the user's selection, the robot 2 utters a message such as "Thank you for choosing the recommended menu!".
  • options such as "of course!" and "change after all" are displayed.
  • the robot 2 then utters, for example, "The recommended menu is ... a good choice!", followed by "Call the store staff and let's place an order!".
  • FIG. 8 is a diagram illustrating an example when another scenario is selected.
  • the robot 2 speaks, for example, "So you chose a different menu."
  • options such as “I'm sorry” and “I'll change after all” are displayed.
  • the robot 2 may utter a message such as "Would you like to see another recommended menu?" and recommend another menu item to the user.
  • the dialogue control unit 345 determines whether to change the scenario according to the option selected by the user, and the information acquisition unit 343 acquires new scenario data from the scenario DB 33 based on the determination result.
  • the information acquiring unit 343 can also exchange information with the recommended menu providing unit 344.
  • the information acquisition unit 343 may acquire new recommended menu information from the recommended menu providing unit 344 and acquire another scenario corresponding to that recommended menu from the scenario DB 33.
  • since the stay information is not updated between FIG. 6 and FIG. 8 described above, the stay state is not updated by the stay state determination unit 342 even if the stay information is acquired in step S17 after the user answers, and no new scenario group is selected.
  • the case where the user orders food and drink, and order information indicating that the order has been placed via the handy terminal 4 is received by the receiving unit 341 of the scenario control unit 34, will be described below.
  • in step S17, the stay information A2 of FIG. 3, indicating that an order has been placed, is transmitted from the handy terminal 4 to the stay state determination unit 342 via the store system 5 and the receiving unit 341. If it is not determined that the dialogue has ended (step S18: NO), the stay state determination unit 342 again determines the stay state (step S11). Based on having received the stay information A2, the stay state determination unit 342 determines that the current stay state has transitioned to stay state P2, in which the user has completed the order and is waiting for the ordered food and drink to be served.
  • notified of the transition to stay state P2 by the stay state determination unit 342, the information acquisition unit 343 selects scenario group 2, corresponding to stay state P2, as the scenario group from which to acquire scenario data (step S13). Subsequently, the information acquisition unit 343 selects and acquires dialogue scenario information belonging to scenario group 2 from the scenario DB 33 (step S14). Then, based on the dialogue scenario information acquired by the information acquisition unit 343, the dialogue control unit 345 executes a dialogue with the user via the robot 2 (step S15).
  • the stay state determination unit 342 may simply notify that the current stay state is stay state P2; in this case, the information acquisition unit 343 determines that the stay state has changed and selects scenario group 2.
  • FIG. 9 is a diagram showing an example of dialogue when scenario data belonging to scenario group 2 is selected by the information acquisition unit 343. As shown in FIG. 9, the robot 2 performs a dialogue suitable for a state where the order is finished and the user is waiting to provide food and drink, such as “Let's talk until the food comes!”.
  • as described above, according to the present embodiment, a customer who visits a restaurant can carry out an appropriate dialogue with the robot 2 installed in the restaurant via the operation terminal 3. That is, in the present embodiment, the customer's stay state can be grasped based on information input by the store staff via the handy terminal 4. By acquiring dialogue scenario information matched to the customer's stay state, which changes over a relatively short time, a dialogue suited to the customer's state, for example the state before ordering or the state of eating and drinking, can be carried out, increasing customer satisfaction.
  • the stay state changes when the store staff inputs from the handy terminal 4, but the present invention is not limited to this.
  • for example, the state may automatically transition to the stay state of eating and drinking.
  • similarly, after a predetermined time such as 30 or 45 minutes has elapsed, the state may automatically transition to the stay state in which the user has finished eating.
  • when an order is received, the store staff necessarily operate the handy terminal 4, so they do not forget to update the stay state; other stay information, however, requires the store staff to notice the situation and operate the handy terminal 4, so the stay information may not always be updated via the handy terminal 4. In such cases, automatically changing the stay state after a predetermined time has elapsed is effective.
  • FIG. 10 is a block diagram illustrating functions of the scenario control unit 34 according to the first modification.
  • the scenario control unit 34 includes a reception unit 341, a stay state determination unit 342, an information acquisition unit 343, and a dialogue control unit 345.
  • in this first modification, the recommended menu providing unit 344 is not provided. As described above, the recommended menu providing unit 344 is not an essential component of the scenario control unit 34.
  • in this case, the information acquisition unit 343 acquires scenario data in step S14 of FIG. 4 without obtaining recommended menu information.
  • the user can interact with the robot 2 through the scenario control unit 34 according to the staying state of the user.
  • scenario data including a recommended menu may be stored in the scenario DB 33.
  • the store may appropriately select the structure shown in FIG. 2 and the structure shown in FIG. 10 according to the situation of the scenario DB 33 and the scenario control unit 34.
  • FIG. 11 is a block diagram illustrating an example of the robot control system 1 according to the second modification.
  • the scenario DB 33 is installed in the operation terminal 3, but is not limited thereto.
  • the scenario DB 33 may be outside the operation terminal 3.
  • the scenario control unit 34 is connected to the scenario DB 33 via the network 6.
  • the scenario DB 33 may be provided in a predetermined server outside the operation terminal 3, or a database (including a distributed database) existing on a so-called cloud connected via the network 6 instead of a specific server. It may be.
  • the network 6 and the operation terminal 3 or the scenario DB 33 are connected via wired or wireless communication.
  • by having the scenario DB 33 outside the operation terminal 3, it is possible to save storage capacity on the operation terminal 3 and to centrally manage the scenario data.
  • when the scenario DB 33 exists on the cloud, it is also possible to allow a certain number of editors to edit or add scenario data, so that richer dialogue content can be selected.
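  • A hedged sketch of retrieving dialogue scenario information over a network when the scenario DB 33 lives on an external server or the cloud. The endpoint URL, path, and response format are invented for illustration; the patent does not define a concrete API.

```python
# Minimal sketch, assuming a hypothetical REST-style scenario DB server.
import json
import urllib.request


def fetch_scenario_group(base_url: str, stay_state: str) -> list:
    """Fetch the scenario group for a stay state from a remote scenario DB."""
    url = f"{base_url}/scenario_groups/{stay_state}"  # hypothetical endpoint
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Example usage (assuming such a server existed):
# group = fetch_scenario_group("http://scenario-db.example.com", "waiting_for_food")
```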
  • FIG. 12 is a diagram illustrating a state of the display unit 35 in the conversation according to the third modification.
  • in the embodiment described above, the robot 2 is a physically present component, but this modification is an example in which the robot 2 does not physically exist.
  • a robot serving as a substitute for the robot 2 may be drawn on the display unit 35 of the operation terminal 3.
  • the user can interact with the virtual robot displayed in the operation terminal 3.
  • the user may be able to select a favorite robot as a partner to interact with.
  • the dialogue with the virtual robot may be presented using speech balloons, characters, or the like.
  • an image of the table at which the user is seated, captured by a camera mounted on the operation terminal 3, may be displayed on the display unit 35 with the robot 2 superimposed so that the robot 2 appears to be present on the table.
  • An AR marker may be provided at a predetermined position on or around the table, or a markerless AR may be used.
  • a plurality of robot candidates to be displayed as AR may be set in advance so that the user can select a favorite robot as a conversation partner.
  • in this modification, the physical robot 2 is not essential, and the user can enjoy a dialogue matched to the stay state using only the operation terminal 3. This reduces cost and keeps more space available on the user's table.
  • the voice of the robot may be uttered from the operation terminal 3.
  • when the displayed robot can be selected by the user, the speech may be produced with a different pitch, voice quality, tone, and the like depending on the selected robot.
  • FIG. 13 is a diagram illustrating a configuration of the robot control system 1 according to the fourth modification.
  • the motion generation unit 32, the scenario control unit 34, and the scenario DB 33 may be provided in the robot 2 instead of the operation terminal 3. That is, the main control function may be provided not in the operation terminal 3 but in the robot 2.
  • the operations of the operation generation unit 32 and the scenario control unit 34 are not significantly different from the operations in the above-described embodiment, and in this modification as well, the dialogue process is executed according to the flowchart shown in FIG.
  • with this configuration, the operation terminal 3 can be simplified, and the amount of data transmitted from the operation terminal 3 to the robot 2 can be reduced, so that the dialogue with the robot 2 can be carried out without problems even when the performance of the communication line between the operation terminal 3 and the robot 2 is low.
  • the motion generation unit 32 may be provided outside the operation terminal 3 and the robot 2.
  • for example, the motion generation unit 32 may be built into the store system 5, or may be provided in a communication device separate from the store system 5.
  • the dialogue is described as a state in which the user or the robot 2 outputs some information, but is not limited thereto.
  • for example, a silent scenario may be prepared in advance, and the silent scenario may be output.
  • the robot control system 1 according to all the embodiments and modifications described above can be suitably applied to the food and beverage service as described above, but may be applied to various services other than the food and beverage service.
  • the robot control system 1 described above is realized concretely by hardware, namely a computer having a CPU, memory, and the like, executing software processing in accordance with a recorded program.
  • alternatively, each unit may be configured as an analog or digital circuit that executes its function, or the functions may be realized by a combination of a computer and such circuits.

Abstract

Provided are: a conversation control device capable of carrying on a proper conversation with a customer; and a robot control system. This conversation control device is provided with a staying state determination unit, an information acquisition unit, and a conversation control unit. The staying state determination unit determines the staying state, within the restaurant, of a customer visiting the restaurant. The information acquisition unit acquires conversation scenario information according to the staying state of the customer determined by the staying state determination unit. The conversation control unit carries on a conversation with the customer on the basis of the conversation scenario information acquired by the information acquisition unit.

Description

Dialog control system and robot control system
 The present invention relates to a dialog control system and a robot control system.
 In recent years, there have been cases where humanoid robots are used as guides for visitors in restaurants and other public facilities (see Japanese Patent Laid-Open No. 2015-66623). Technology for communicating (interacting) with such a robot using a dedicated operation terminal has been studied; with this technology, a user can enjoy a dialogue with the robot via the operation terminal and obtain guidance information.
 For example, when a robot is installed in a restaurant, a service is conceivable in which a robot installed at a table seat interacts with the customer directly or indirectly via a tablet or the like. In this case, customer satisfaction can be improved by presenting a recommended menu or by chatting.
 However, if the dialogue is not suited to the customer's current state, for example before ordering, before the meal is served, or during the meal, the conversation may become disjointed or make the customer uncomfortable, and may instead reduce customer satisfaction.
 Therefore, the problem to be solved by the present invention is to provide a dialog control device and a robot control system capable of carrying out an appropriate dialogue with a customer.
 A dialogue control device according to one embodiment includes: a stay state determination unit that determines the stay state, within a restaurant, of a customer who has entered the restaurant; an information acquisition unit that acquires dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit; and a dialogue control unit that carries out a dialogue with the customer based on the dialogue scenario information acquired by the information acquisition unit.
 The stay state determination unit may determine the stay state of the customer in the restaurant based on information from a store system.
 The dialogue control device may further include a receiving unit that receives, from the store system, a signal indicating that the customer has ordered food or drink; when the receiving unit receives this signal, the stay state determination unit may determine that the customer is waiting for the ordered food or drink.
 The stay state of the customer determined by the stay state determination unit may be a state that changes with the passage of time.
 The stay state of the customer determined by the stay state determination unit may include at least one of: a state from when the customer enters the restaurant or is seated until the customer orders food and drink; a state in which the customer is waiting for the ordered food and drink to be served; a state until the customer finishes eating and drinking; and a state from when the customer finishes eating and drinking until the customer leaves the restaurant.
 The dialogue control device may include a recommended menu providing unit that presents a recommended menu of food and drink to the customer when the customer has not yet ordered food and drink; in the dialogue with the customer about menu selection, the information acquisition unit may acquire different dialogue scenario information depending on whether the customer selects the recommended menu or selects another menu instead.
 The dialogue control device may include a scenario storage unit that stores a plurality of different pieces of dialogue scenario information, and the information acquisition unit may acquire, from the scenario storage unit, dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit.
 The information acquisition unit may acquire the dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit via a communication network.
 A robot control system according to one embodiment includes a robot that operates according to an instruction signal, and an operation terminal that generates the instruction signal and transmits it to the robot. The operation terminal has: a stay state determination unit that determines, based on information from a store system, the stay state within a restaurant of a customer who has entered the restaurant; an information acquisition unit that acquires dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit; and a dialogue control unit that carries out a dialogue with the customer based on the dialogue scenario information acquired by the information acquisition unit.
 The operation terminal may include a receiving unit that receives the customer's order information from the store system; when the receiving unit receives the order information, the stay state determination unit may determine that the customer is waiting for the ordered food or drink.
 In addition to the order information, the store system may wirelessly transmit stay information including at least one of: first information indicating that the customer has entered the store; second information indicating that the customer has been seated; third information indicating that the ordered food and drink have been served to the customer; and fourth information indicating that the customer has finished the served food and drink. The receiving unit of the operation terminal receives the order information and the stay information transmitted by the store system, and the stay state determination unit may determine the stay state of the customer based on the received order information and stay information.
 According to the present invention, it is possible to provide a dialogue control device and a robot control system capable of carrying out an appropriate dialogue with a customer.
FIG. 1 is a block diagram showing an example of a robot control system according to one embodiment.
FIG. 2 is a block diagram showing an example of a scenario control unit according to one embodiment.
FIG. 3 is a diagram showing scenario control processing in dialogue control according to one embodiment.
FIG. 4 is a flowchart showing the flow of dialogue control processing according to one embodiment.
FIG. 5 is a diagram showing an example of a dialogue according to one embodiment.
FIG. 6 is a diagram showing an example of a dialogue for providing a recommended menu according to one embodiment.
FIG. 7 is a diagram showing an example of a dialogue when the recommended menu is selected, according to one embodiment.
FIG. 8 is a diagram showing an example of a dialogue when the recommended menu is not selected, according to one embodiment.
FIG. 9 is a diagram showing an example of a dialogue after ordering, according to one embodiment.
FIG. 10 is a block diagram showing another example of the scenario control unit according to one embodiment.
FIG. 11 is a block diagram showing an example of a robot control system according to a modification.
FIG. 12 is a diagram showing an example of a dialogue according to a modification.
FIG. 13 is a block diagram showing an example of a robot control system according to a modification.
 Hereinafter, the configuration and operation of the dialogue control system and the robot control system according to embodiments of the present invention will be described in detail with reference to the drawings. The embodiments shown below are examples of embodiments of the present invention, and the present invention should not be construed as being limited to these embodiments. In the drawings referred to in this description, identical parts or parts having similar functions are denoted by the same or similar reference signs, and repeated description thereof may be omitted. In addition, the dimensional ratios in the drawings may differ from the actual ratios for convenience of explanation, and some components may be omitted from the drawings.
 FIG. 1 is a block diagram showing a robot control system 1 according to an embodiment of the present invention. As shown in FIG. 1, the robot control system 1 includes a robot 2, an operation terminal 3 that is an example of a robot control device, a handy terminal 4, and a store system 5 such as a POS (Point of Sales) system. The robot control system 1 is, for example, a system that allows a customer of a restaurant (hereinafter referred to as a user) to interact with the robot 2 via the operation terminal 3 that operates the robot 2.
 図1におけるロボット2は、人間に似た外観及び対話機能を有する機械、すなわちヒューマノイドである。なお、ロボット2は、動物やキャラクタなどの人間と非類似の外観を有していてもよい。また、ロボット2は、表示部35に表示された画像による仮想的なロボットであってもよい。 1 is a machine having a human-like appearance and dialogue function, that is, a humanoid. Note that the robot 2 may have a dissimilar appearance to humans such as animals and characters. Further, the robot 2 may be a virtual robot based on an image displayed on the display unit 35.
(Robot 2)
As shown in FIG. 1, the robot 2 includes a robot drive unit 21 and a robot control unit 22, which is an example of a drive control unit. The robot 2 may be driven by electric power supplied from a commercial power source, or it may be driven by a battery.
The robot drive unit 21 includes a voice output device that outputs the speech of the robot 2. By driving the robot drive unit 21 as necessary, the robot 2 can be made to speak in order to carry out a dialogue with the user. A drive control signal for controlling the robot drive unit 21 is input to the robot drive unit 21 from the robot control unit 22, and the robot drive unit 21 is driven according to this drive control signal. The robot drive unit 21 may additionally include actuators that drive parts of the robot 2 having degrees of freedom, a lighting device that lights up the eyes of the robot 2, and the like.
The robot control unit 22 receives a robot control command, which is an example of a command signal, from the operation terminal 3. The robot control unit 22 generates the above-described drive control signal based on the received robot control command and outputs the generated drive control signal to the robot drive unit 21. That is, the robot 2 can operate in accordance with the robot control command. A command for making the robot 2 speak includes data indicating the content of the utterance (scenario data, described later).
The robot control unit 22 may include, for example, at least one memory that stores software such as an application program and the operating system that runs it, and a CPU (Central Processing Unit) that executes the software stored in the memory; the drive control signal may be generated by the CPU executing the software stored in the memory.
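The command-to-drive-signal flow described above can be pictured with the following minimal Python sketch. It is only an illustration under assumed names (RobotController, DriveSignal and the command fields are not taken from the specification): a "speak" command carrying scenario data is turned into a drive control signal for the voice output device.

```python
# Illustrative sketch only: class and field names are assumptions, not part of the specification.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DriveSignal:
    speak_text: Optional[str] = None  # text passed to the voice output device
    mouth_open: bool = False          # optional mouth movement while speaking

class RobotController:
    """Rough stand-in for robot control unit 22: converts a received
    robot control command into a drive control signal for drive unit 21."""
    def handle_command(self, command: dict) -> DriveSignal:
        if command.get("type") == "speak":
            # A speak command carries the utterance content (scenario data).
            return DriveSignal(speak_text=command["text"], mouth_open=True)
        return DriveSignal()

# Example: the operation terminal sends a speak command.
signal = RobotController().handle_command({"type": "speak", "text": "Welcome!"})
print(signal)
```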
(Operation terminal 3)
The operation terminal 3 is carried by the user and is, for example, a tablet terminal with a touch function. Alternatively, the operation terminal 3 may be a smartphone, a tabletop display-type terminal, or the like. As shown in FIG. 1, the operation terminal 3 includes an orientation sensor 31, which is an example of a sensor, and a motion generation unit 32, which is an example of a command signal generation unit and a transmission unit. The operation terminal 3 is driven by electric power supplied from a built-in battery.
The orientation sensor 31 outputs an orientation detection signal indicating the detected orientation of the operation terminal 3 to the motion generation unit 32. Based on the orientation detection signal from the orientation sensor 31, the motion generation unit 32 generates, for example, a robot control command for controlling the posture of the robot 2 so that the robot 2 faces the direction in which the operation terminal 3 is located.
The motion generation unit 32 generates a robot control command for controlling the motion of the robot 2 based on the output of the orientation sensor 31, and transmits the generated robot control command to the robot control unit 22 via wireless communication such as Wi-Fi. The robot control unit 22 receives the robot control command from the motion generation unit 32 and controls the motion of the robot 2 by outputting to the robot drive unit 21 a drive control signal corresponding to the received robot control command.
The motion of the robot 2 also includes a dialogue motion in which the robot converses with the user using voice. As a configuration for carrying out the dialogue between the user and the robot 2, the operation terminal 3 further includes, in addition to the motion generation unit 32 described above, a scenario DB (database) 33, a scenario control unit 34, a display unit 35, an input unit 36 such as a touch panel, and a voice output unit 37.
The scenario DB 33 stores a plurality of pieces of dialogue scenario information for carrying out dialogues between the user and the robot 2. The dialogue scenario information includes a plurality of dialogue scenarios. A dialogue scenario is a story of the dialogue exchanged between the user and the robot 2. A customer who enters a restaurant passes through a plurality of stay states between entering and leaving the store, so the scenario DB 33 stores a separate dialogue scenario for each of these states. In addition, while a dialogue scenario executed in a certain stay state is in progress, the dialogue may switch to another dialogue scenario when the customer selects a particular option from a plurality of options on the operation terminal 3. Each dialogue scenario consists of robot-side scenario data that the robot 2 uses for the dialogue (that is, for utterances) and user-side scenario data that the user uses for the dialogue (that is, for selections on the operation terminal 3). The user-side scenario data is a collection of the user-side options and the like for the utterances of the robot 2 in each dialogue scenario, and it is used to display those options and the like on the display unit 35 of the operation terminal 3 as the dialogue scenario progresses. The user can select a particular one of the displayed options with the input unit 36.
Each option of the dialogue scenario shown in the user-side scenario data is associated with new (lower-level) robot-side scenario data that differs from option to option. That is, the dialogue scenario data has a tree structure in which robot-side scenario data and user-side scenario data are alternately connected as nodes. Within the tree structure, a predetermined series of nodes running from the highest level to the lowest level may be managed as a basic scenario used for a typical dialogue, for example, while the other nodes may be managed as correction scenarios that modify the basic scenario.
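As a rough illustration of this tree structure (not code from the specification, and with assumed names and strings), robot-side utterance nodes and user-side option sets can alternate as follows; each user option points to the next robot-side node.

```python
# Illustrative sketch of the alternating tree structure; names and strings are assumptions.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class RobotNode:
    utterance: str                                                 # robot-side scenario data
    options: Dict[str, "RobotNode"] = field(default_factory=dict)  # user-side scenario data

# A small fragment of a "before ordering" scenario.
root = RobotNode(
    "What would you like to order?",
    options={
        "View the menu": RobotNode("Here is the menu."),
        "Tell me the recommended menu!": RobotNode(
            "Today's recommendation is the hamburger steak with Italian cheese!"),
    },
)

print(list(root.options))  # the choices shown on the operation terminal
```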
For example, dialogue scenario data for restaurants may be divided into a plurality of scenario groups according to status data (stay information) indicating the user's stay state in the restaurant. The status data in the example of FIG. 1 is information indicating that the user has been seated, has ordered, has been served food and drink, has finished the meal, or has paid. The dialogue scenario data is provided from the scenario groups classified by this status data.
The scenario control unit 34 controls the scenario based on the data output from the store system 5. FIG. 2 is a block diagram of the scenario control unit 34. As shown in FIG. 2, the scenario control unit 34 includes a reception unit 341, a stay state determination unit 342, an information acquisition unit 343, and a dialogue control unit 345. The scenario control unit 34 may also include a recommended menu providing unit 344; the recommended menu providing unit 344 is thus not an essential component.
The reception unit 341 receives information including the status data output from the store system 5. The status data can be entered, for example, on the handy terminal 4 for taking orders carried by the store staff of the restaurant. The status data is transmitted from the handy terminal 4 to the store system 5 and then from the store system 5 to the reception unit 341 of the scenario control unit 34. Alternatively, the status data may be transmitted directly from the handy terminal 4 to the operation terminal 3 via the reception unit 341 without passing through the store system 5.
Examples of the status data include, in addition to order information indicating that the user has ordered food and drink, a first state indicating that the user has entered the store, a second state indicating that the user has been seated, a third state indicating that the ordered food and drink have been served to the user, and a fourth state indicating that the user has finished eating and drinking the served food and drink. The status data is not limited to these and may also include a state indicating that the user has left the seat, a state in which the users are conversing with each other, a state in which the user has ordered dessert after the meal, and so on. The status data is thus information indicating the stay state of the user, which changes over time.
The stay state determination unit 342 determines the user's stay state based on the status data included in the information received by the reception unit 341. For example, when the stay state determination unit 342 receives, via the reception unit 341, a signal of the second state indicating that the user has been seated, it determines that the user is in the stay state after being seated and before ordering. When order information is received via the reception unit 341, the stay state determination unit 342 determines that the user is in the stay state after ordering and waiting for the food and drink to be served. Like the status data described above, the stay state is information indicating a state of the user that changes over time.
The relationship between the status data and the stay state may be configured differently from the above. For example, the store staff may enter the stay state itself as the status data; that is, instead of entering the user's action as a trigger, the state itself may be entered. Specifically, rather than entering an action such as "was seated" or "ordered", the store staff may directly enter a state such as "is seated" or "has ordered and is waiting for food and drink". In this case, the stay state determination unit 342 treats the received status data itself as the stay state. The difference between these forms of input is essentially a matter of design.
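The mapping from status data to stay state can be sketched as follows. This is only one possible reading, with illustrative enum names, and it assumes the action-as-trigger style of input described above.

```python
# Minimal sketch of stay-state determination; enum names are illustrative.
from enum import Enum, auto

class Status(Enum):
    ENTERED = auto()          # first state: the user entered the store
    SEATED = auto()           # second state: the user was seated
    ORDERED = auto()          # order information
    FOOD_SERVED = auto()      # third state: the ordered food was served
    FINISHED_EATING = auto()  # fourth state: the user finished eating

class StayState(Enum):
    P1_BEFORE_ORDER = auto()
    P2_WAITING_FOR_FOOD = auto()
    P3_EATING = auto()
    P4_BEFORE_LEAVING = auto()

STATUS_TO_STAY_STATE = {
    Status.ENTERED: StayState.P1_BEFORE_ORDER,
    Status.SEATED: StayState.P1_BEFORE_ORDER,
    Status.ORDERED: StayState.P2_WAITING_FOR_FOOD,
    Status.FOOD_SERVED: StayState.P3_EATING,
    Status.FINISHED_EATING: StayState.P4_BEFORE_LEAVING,
}

def determine_stay_state(status: Status) -> StayState:
    """Stand-in for stay state determination unit 342."""
    return STATUS_TO_STAY_STATE[status]

print(determine_stay_state(Status.ORDERED))  # StayState.P2_WAITING_FOR_FOOD
```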
The information acquisition unit 343 acquires, from the scenario DB 33, dialogue scenario information corresponding to the stay state determined by the stay state determination unit 342. The dialogue scenario information refers to information on a scenario group that changes depending on the stay state, or to the individual pieces of scenario data included in that scenario group. There may be no scenario group at all, and the dialogue scenario information may instead consist of the data of a plurality of dialogue scenarios.
A scenario group is a bundle of scenario data suited to the user's stay state and includes various kinds of scenario data matched to that state. For example, when the user's stay state is the state before ordering, the scenario group includes scenario data that presents a menu matched to the time of day, scenario data in which the content of the dialogue and the manner of speaking change depending on the number of adults and children, and scenario data for various other situations. A plurality of pieces of dialogue scenario data that can be selected according to operations on the operation terminal 3 may also be provided for the same stay state.
The information acquisition unit 343 may acquire the scenario data used for the dialogue according to the user's stay state, or the information acquisition unit 343 may acquire a scenario group and then select and output the scenario data used for the dialogue from that scenario group. The scenario data used for the dialogue may be selected from the scenario group either by the information acquisition unit 343 or by the dialogue control unit 345.
When there is a recommended item on the menu, the recommended menu providing unit 344 transmits recommended menu information about that item to the information acquisition unit 343. The information acquisition unit 343 acquires dialogue scenario information related to the recommended menu from the scenario DB 33.
For example, when the user is in the stay state after being seated and before ordering, the recommended menu providing unit 344 may output the recommended menu based on the dialogue scenario information about the recommended menu acquired by the information acquisition unit 343.
As another method, when acquiring the dialogue scenario information, the information acquisition unit 343 checks with the recommended menu providing unit 344 whether there is a recommended menu. If there is a recommended menu, it acquires, from the scenario DB 33, scenario data that recommends that menu, and the acquired scenario data is output to the dialogue control unit 345. When there are a plurality of recommended menus, the plurality of recommended menus may be transmitted to the information acquisition unit 343, or information indicating that there are other recommended menus in addition to the first recommended menu may be transmitted to the information acquisition unit 343.
The dialogue control unit 345 outputs, to the motion generation unit 32, information including the scenario data to be used for the dialogue acquired according to the user's stay state. In addition, the dialogue control unit 345 may, if necessary, output data used for the dialogue to the display unit 35 of the operation terminal 3. The data used for the dialogue is basically the information of the user-side scenario data used to respond to the robot-side scenario data.
The motion generation unit 32, having received the scenario data to be used for the dialogue, transmits a signal obtained by converting the dialogue data into voice to the robot control unit 22, thereby causing the robot 2 to produce voice. At this time, a signal for opening and closing or lighting up the mouth of the robot 2 may also be transmitted to the robot control unit 22 so that the mouth of the robot 2 opens and closes or lights up. Instead of transmitting the voice data from the operation terminal 3 to the robot control unit 22 each time as described above, the voice data may be stored in the robot 2 and an identification signal associated with each piece of voice data may be transmitted, causing the robot 2 to speak. The voice output from the robot 2 may be an utterance containing meaningful words, or it may be an imitation sound containing no words.
The data output by the dialogue control unit 345 is not limited to user-side scenario data; when the user selects a menu item, images or videos of food and drink may be output to the display unit 35. That is, not only information about the words and text used in the dialogue, but also data that can be used for display, such as images and videos, may be output to the display unit 35 as appropriate.
Furthermore, without being limited to this, voice may be output from the operation terminal 3 or the operation terminal 3 may be made to vibrate as necessary, in which case the corresponding voice data, vibration data, and the like may be output. When the robot 2 produces voice, the voice produced by the robot 2 may also be output to the display unit 35 as text. Meanwhile, based on the user-side scenario data displayed on the display unit 35, the user makes an arbitrary selection with the input unit 36 from among the options displayed on the display unit 35.
The dialogue control unit 345 then outputs, to the display unit 35 and the motion generation unit 32, robot-side scenario data for responding to the data selected by the user. At this time, the user's selection may be output as voice from the operation terminal 3 via the voice output unit 37.
After the recommended menu has been displayed on the display unit 35, and when there are a plurality of recommended menus, the dialogue control unit 345 may output a signal to the information acquisition unit 343 so as to present a different recommended menu, based on an input from the user via the input unit 36 requesting that another recommended menu be presented.
In this case, as indicated by the dotted line, the new recommended menu may be notified to the information acquisition unit 343 via the recommended menu providing unit 344. Upon receiving the instruction for the new recommended menu, the information acquisition unit 343 may transmit, to the dialogue control unit 345, the dialogue scenario for the new recommended menu that it has acquired in advance from the scenario DB 33 or the recommended menu providing unit 344.
As another example, the information acquisition unit 343 may acquire the dialogue scenario for the new recommended menu from the scenario DB 33 or the recommended menu providing unit 344 at the time when it is notified that a new recommended menu is to be output. Furthermore, the recommended menu can be set not only by the store or the store's chain; it may also be, for example, a menu item with high sales or a popular menu item.
In FIG. 2, the input unit 36 is connected to the dialogue control unit 345, but this is not restrictive, and it may also be connected to the recommended menu providing unit 344. In this case, when the user wishes to be presented with another recommended menu, the recommended menu providing unit 344 outputs a signal about the other recommended menu to the information acquisition unit 343 based on that input, without going through the dialogue control unit 345.
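The exchange between the information acquisition unit and the recommended menu providing unit can be pictured roughly as below. The function and menu names are assumptions; the sketch only shows the branch "if a recommendation exists, fetch the scenario that recommends it, otherwise fall back to a plain menu scenario".

```python
# Rough sketch of the recommended-menu branch; all names and strings are illustrative.
RECOMMENDED_MENUS = ["hamburger steak with Italian cheese", "Japanese Black Wagyu sukiyaki"]

SCENARIO_DB = {
    "recommend": "Today's recommendation is {menu}. Shall I tell you more?",
    "plain_menu": "Here is the menu. What would you like?",
}

def provide_recommended_menu(index: int = 0):
    """Stand-in for recommended menu providing unit 344."""
    return RECOMMENDED_MENUS[index] if index < len(RECOMMENDED_MENUS) else None

def acquire_scenario(index: int = 0) -> str:
    """Stand-in for information acquisition unit 343."""
    menu = provide_recommended_menu(index)
    if menu is None:
        return SCENARIO_DB["plain_menu"]
    return SCENARIO_DB["recommend"].format(menu=menu)

print(acquire_scenario(0))   # first recommendation
print(acquire_scenario(1))   # another recommendation, e.g. after "show me another one"
print(acquire_scenario(2))   # no recommendation left -> plain menu scenario
```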
In this way, the scenario control unit 34 acquires suitable scenario data from the various kinds of scenario data stored in the scenario DB 33 based on the user's stay state entered on the handy terminal 4 by the store staff, and outputs it to the motion generation unit 32 and the display unit 35. Furthermore, in response to a dialogue scenario selection signal input from the input unit 36, the scenario control unit 34 reads out from the scenario DB 33 new robot-side scenario data that responds to the dialogue scenario selected by that signal.
The scenario control unit 34 then outputs the newly read robot-side scenario data to the motion generation unit 32 for utterance. By repeating this process, a dialogue between the user and the robot 2 can be carried out.
With the above configuration, the dialogue between the user and the robot 2 can be carried out using the dialogue scenario data.
(Stay state)
FIG. 3 is a diagram schematically showing, together with a time series, the processing from the operation of the handy terminal 4 by the store staff to the selection of scenario data. In FIG. 3, the solid arrows represent the input and output of data and signals, and the broken arrows represent the passage of time. As explained above and as shown in FIG. 3, the processing up to the selection of scenario data is executed by the store system 5, the scenario DB 33, and the scenario control unit 34.
The transitions of the status data and the scenario groups will now be described with reference to FIG. 3. First, through an operation by the store staff, first information indicating that the user has entered the store, or second information indicating that the user has been seated, is entered on the handy terminal 4 as stay information A1. When this stay information A1 is transmitted to the scenario control unit 34 via the store system 5, the stay state determination unit 342 determines that the current stay state of the user is stay state P1, that is, the state from when the user enters the restaurant or is seated until the user places an order. The information acquisition unit 343 then acquires, from the scenario DB 33, scenario data belonging to scenario group 1 corresponding to this stay state P1, and the dialogue with the user is carried out. When the store staff enters the status data, the handy terminal 4 may transmit, for example, the number of the table at which the user is seated as a key; in this way, the status data entered from the handy terminal 4 can be associated with the user.
Next, after some time has passed, the store staff transmits the user's order to the store system 5 via the handy terminal 4; that is, order information indicating that the user has ordered is transmitted as stay information A2. The stay state determination unit 342 then determines that the user's stay state has shifted from the pre-order stay state P1 to stay state P2, in which the user has placed an order and is waiting for the ordered food and drink to be served. The information acquisition unit 343 acquires, from the scenario DB 33, scenario data belonging to scenario group 2 corresponding to this stay state P2, and the dialogue with the user is carried out.
Similarly, when stay information A3 indicating that the food and drink have been served is transmitted, the state shifts to stay state P3, the state until the user finishes eating and drinking, and the dialogue is carried out with scenario data belonging to scenario group 3. When stay information A4 indicating that eating and drinking have finished is transmitted, the state shifts to stay state P4, the state from when the user finishes eating and drinking until the user leaves the store, and the dialogue is carried out with scenario data belonging to scenario group 4. In this way, as time passes, the store staff enter the user's status data as appropriate, the stay state changes accordingly, and a dialogue is carried out using the corresponding scenario data.
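The time series of FIG. 3 can be condensed into the following toy walk-through. Plain dictionaries stand in for the scenario DB, a scripted list of events stands in for the store system, and none of the strings come from the specification.

```python
# Toy walk-through of FIG. 3: stay information A1-A4 drives stay states P1-P4,
# each of which selects a different scenario group. All names are illustrative.
STAY_INFO_TO_STATE = {
    "A1_seated": "P1",        # entered / seated  -> before ordering
    "A2_ordered": "P2",       # ordered           -> waiting for food
    "A3_food_served": "P3",   # food served       -> eating
    "A4_finished": "P4",      # finished eating   -> before leaving
}

SCENARIO_GROUPS = {
    "P1": "What would you like to order?",
    "P2": "Let's chat until the food arrives!",
    "P3": "How is your meal?",
    "P4": "Thank you! See you again!",
}

stay_state = None
for stay_info in ["A1_seated", "A2_ordered", "A3_food_served", "A4_finished"]:
    new_state = STAY_INFO_TO_STATE[stay_info]   # stay state determination
    if new_state != stay_state:                 # state transition detected
        stay_state = new_state
        print(f"{stay_info} -> {stay_state}: robot says '{SCENARIO_GROUPS[stay_state]}'")
```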
Like the stay information described above, the stay states are not limited to those described above. For example, when a dessert has been ordered, the stay states may include a waiting state in which the user is waiting for the dessert to be served and, after the ordered dessert has been served, a state in which the user is eating the dessert. The scenario groups are likewise not limited to the above and may include a scenario group for the dessert-waiting state, a scenario group for the dessert-eating state, and so on. As another example, when an additional order is received, the state may return to stay state P2.
The status data does not have to be entered via the handy terminal 4. For example, when the user pays and leaves the store, status data indicating that payment has been made may be transmitted to the operation terminal 3 from the POS terminal (store system 5) used for the payment. The operation terminal 3 that has received the status data indicating that payment has been made may then return the stay state to the initial state.
(Operation example)
Next, a main operation example of the robot control system 1 of FIG. 1 will be described. FIG. 4 is a flowchart showing the flow of the scenario data acquisition processing and the dialogue processing in the operation terminal 3 according to the present embodiment. An operation example will be described with reference to FIGS. 3 and 4. As the initial stay state, for example, a null value may be stored so that the state remains unentered, or a stay state of "not yet entered the store" or "not yet seated" may be set. In the following description, it is assumed that a stay state of "not yet entered" or "not yet seated" is set as the initial state.
First, the stay state determination unit 342 acquires status data (stay information) from the store system 5 via the reception unit 341 (step S10). For example, the stay state determination unit 342 receives, via the reception unit 341, the status data A1 in FIG. 3 indicating that the user has entered the store.
Next, the stay state determination unit 342 determines the user's stay state based on the received status data (step S11). For example, when the status data A1 is received, the stay state determination unit 342 determines that the stay state is stay state P1, the state before the user places an order. The stay state determination unit 342 outputs the obtained stay state P1 to the information acquisition unit 343.
Next, the information acquisition unit 343 determines whether the user's stay state has changed (step S12). For example, when the state has changed from the "not yet entered" or "not yet seated" state to stay state P1, the information acquisition unit 343 determines that a transition to a new stay state has occurred (step S12: YES) and proceeds to the scenario acquisition step.
Having determined that the stay state has changed, the information acquisition unit 343 selects a scenario group (step S13). For example, for stay state P1, the information acquisition unit 343 selects scenario group 1 as the group from which to acquire a scenario.
Next, the information acquisition unit 343 selects, from the scenarios belonging to scenario group 1, the scenario data to be used for the dialogue (step S14). For example, when the state has changed from the not-yet-seated state to stay state P1, the user has most likely just entered the store or just been shown to a seat, so a scenario that gives a greeting appropriate to the time of day, such as "Good morning" or "Hello", and a scenario for menu selection may be selected.
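As one concrete way of picking a time-appropriate greeting inside scenario group 1, purely for illustration and with assumed time boundaries:

```python
# Illustrative selection of a greeting scenario by time of day; the boundaries are assumptions.
from datetime import datetime

def greeting_scenario(now: datetime) -> str:
    hour = now.hour
    if 5 <= hour < 11:
        return "Good morning! What would you like to order?"
    if 11 <= hour < 17:
        return "Hello! What would you like to order?"
    return "Good evening! What would you like to order?"

print(greeting_scenario(datetime.now()))
```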
Recommended menu information may be provided by the recommended menu providing unit 344 at the time this scenario is selected. In this case, the information acquisition unit 343 selects a scenario for the case in which there is a recommended menu, and outputs a scenario based on the recommended menu provided by the recommended menu providing unit 344 to the dialogue control unit 345.
Next, the dialogue control unit 345 executes the dialogue processing with the user via the robot 2 (step S15). When the stay state has not changed (step S12: NO), this processing is performed without the processing of steps S13 and S14 described above.
FIG. 5 is a diagram showing an example of a dialogue between the robot 2 and the user, in the case where the recommended menu providing unit 344 has provided recommended menu information. As shown in FIG. 5, the dialogue is carried out using the robot 2 and the operation terminal 3. The operation terminal 3 is, for example, a tablet-type portable terminal including the display unit 35 and the touch-panel input unit 36 integrated into the display unit 35. Based on the robot-side scenario data selected by the information acquisition unit 343, the robot 2 produces an utterance such as "What would you like to order?"
Then, at the timing for replying to this utterance of the robot 2, the display unit 35 of the operation terminal 3 displays user-side scenario data including the options "View the menu" and "Tell me the recommended menu!" as answers to the robot-side scenario data. By selecting one of the options via the input unit 36, the user carries out the dialogue with the robot 2. When a selection is made with the input unit 36, the selected content may be spoken by an automated voice, which further enhances the sense of conversing with the robot 2.
The dialogue control unit 345 determines, based on the user's answer, whether the scenario to be used for the subsequent dialogue is to be changed (step S16). For example, suppose that in FIG. 5 a scenario recommending the recommended menu has been selected as the scenario data. If the user then selects the option "View the menu", the scenario needs to be changed. In such a case, the dialogue control unit 345 determines that a scenario change is necessary (step S16: YES), returns to step S14, and notifies the information acquisition unit 343 to select, from scenario group 1, a scenario that shows the menu.
On the other hand, when the user selects the option "Tell me the recommended menu!", no scenario change is necessary (step S16: NO), and the dialogue between the user and the robot 2 continues.
When the user carries out the dialogue based on the user-side scenario data, the stay state determination unit 342 acquires the user's stay information (step S17). This is processing for handling the situation in which the store staff enter the user's stay information with the handy terminal 4 during the dialogue; the operation terminal 3 may be configured so that stay information can be acquired during the dialogue in this way.
Next, the dialogue control unit 345 determines whether to end the dialogue with the robot 2 (step S18). When the dialogue ends as a result of the option chosen by the user (step S18: YES), the dialogue control unit 345 ends the dialogue. At this time, the robot 2 may be made to say, for example, "Let's talk again." On the other hand, when the dialogue continues as a result of the option chosen by the user (step S18: NO), the processing returns to step S11 and the dialogue continues according to the procedure described above.
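One way to read steps S15 through S18 as code is sketched below. The scenario table and option handling are assumptions, not the claimed implementation; the point is only that the user's choice either keeps the current scenario, switches to another scenario within the same group (step S16: YES), or ends the dialogue (step S18: YES).

```python
# Rough sketch of one dialogue turn (steps S15-S18); all names and strings are illustrative.
SCENARIO_GROUP_1 = {
    "recommend": {
        "utterance": "What would you like to order?",
        "options": {
            "Tell me the recommended menu!": ("stay", None),   # S16: NO, continue
            "View the menu": ("switch", "plain_menu"),         # S16: YES, new scenario
            "I'm done talking": ("end", None),                 # S18: YES, end dialogue
        },
    },
    "plain_menu": {
        "utterance": "Here is the menu.",
        "options": {"Thanks!": ("end", None)},
    },
}

def dialogue_turn(scenario_key: str, user_choice: str):
    scenario = SCENARIO_GROUP_1[scenario_key]
    print("robot:", scenario["utterance"])          # S15: the robot speaks
    action, next_key = scenario["options"][user_choice]
    if action == "switch":                          # S16: change scenario
        return next_key, False
    if action == "end":                             # S18: end the dialogue
        print("robot: Let's talk again!")
        return scenario_key, True
    return scenario_key, False                      # continue with the same scenario

key, done = dialogue_turn("recommend", "View the menu")
print("next scenario:", key, "dialogue ended:", done)
```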
The processing procedure that follows the description of FIG. 5 above will now be described in detail through the exchange between the robot 2 and the user.
FIG. 6 is a diagram showing a display example of the display unit 35 when "Tell me the recommended menu!" is selected by the user in FIG. 5. As shown in FIG. 6, the text "hamburger steak with Italian cheese", provided by the recommended menu providing unit 344 as today's recommendation, is displayed together with an image or video. Other recommended menus may be displayed along with it. Furthermore, the robot 2 may speak an explanation of the hamburger steak with Italian cheese, such as the contents of the dish and the reason it is recommended. If the user then selects the hamburger steak with Italian cheese as the item to order, that is, selects "Select this menu item", the dialogue continues without a scenario change (step S16: NO).
FIG. 7 is a diagram showing an example of the dialogue when the recommended menu is selected. First, in response to the user's selection, the robot 2 speaks a message such as "You chose the recommended menu. Thank you!" The display unit 35 then displays options such as "Of course!" and "Actually, I'll change it" as answers to that utterance. When "Of course!" is selected by the user, the robot 2 speaks, for example, "The recommended menu is ..., good choice!" and then speaks a message prompting the user to order, such as "Let's call a member of the store staff and place the order!"
On the other hand, when "View other menu items" is selected in FIG. 6, a signal indicating that the scenario is to be changed is sent from the dialogue control unit 345 to the information acquisition unit 343 (step S16: YES). Having received the signal, the information acquisition unit 343 returns to step S14 and selects another scenario. FIG. 8 is a diagram showing an example in which another scenario is selected. As shown in FIG. 8, the robot 2 speaks, for example, "So you went with a different menu item." The display unit 35 displays options such as "Sorry." and "Actually, I'll change it" as answers to that utterance. When "Sorry." is selected by the user, the robot 2 may speak a message such as "Would you like to see another recommended menu?" and recommend another item to the user.
In this way, the dialogue control unit 345 determines whether to change the scenario according to the option chosen by the user, and the information acquisition unit 343 acquires new scenario data from the scenario DB 33 based on that determination. When acquiring new scenario data, if there is another recommended menu to be displayed, the information acquisition unit 343 can also exchange information with the recommended menu providing unit 344. Note that also when another recommended item in FIG. 6, such as "Japanese Black Wagyu sukiyaki", is selected, the information acquisition unit 343 may acquire new recommended menu information from the recommended menu providing unit 344 and acquire another scenario for that recommended menu from the scenario DB 33.
During the exchanges of FIGS. 6 to 8 described above, the stay information is not updated, so even if stay information is acquired in step S17 after the user's answer, the stay state is not updated by the stay state determination unit 342 and no scenario group selection is performed. On the other hand, the following describes the case in which, for example, after the state of FIG. 7, the user orders food and drink and order information indicating that the user has placed an order via the handy terminal 4 is received via the reception unit 341 of the scenario control unit 34.
In this case, in step S17, the stay information A2 in FIG. 3 indicating that an order has been placed is transmitted from the handy terminal 4 to the stay state determination unit 342 via the store system 5 and the reception unit 341. If it is not determined that the dialogue is to end (step S18: NO), the stay state determination unit 342 next detects the stay state (step S11). Based on having received the stay information A2, the stay state determination unit 342 determines that the current stay state has transitioned to stay state P2, in which the user has completed the order and is waiting for the ordered food and drink to be served.
Next, the information acquisition unit 343, having been notified by the stay state determination unit 342 of the transition to stay state P2, selects scenario group 2 corresponding to stay state P2 as the scenario group from which to acquire scenario data (step S13). The information acquisition unit 343 then selects and acquires, from the scenario DB 33, dialogue scenario information belonging to scenario group 2 (step S14). Based on the dialogue scenario information acquired by the information acquisition unit 343, the dialogue control unit 345 executes the dialogue with the user via the robot 2 (step S15). Note that the stay state determination unit 342 may simply notify that the current stay state is stay state P2; in this case, the information acquisition unit 343 may determine that the stay state has transitioned and select scenario group 2.
FIG. 9 is a diagram showing an example of the dialogue when scenario data belonging to scenario group 2 is selected by the information acquisition unit 343. As shown in FIG. 9, the robot 2 carries out a dialogue suited to the state in which the order has been completed and the user is waiting for the food and drink to be served, such as "Let's chat until the food arrives!"
As described above, according to the present embodiment, a customer who visits a restaurant can have an appropriate dialogue with the robot 2 installed in the restaurant via the operation terminal 3. That is, in the present embodiment, the customer's stay state can be grasped based on the information entered by the store staff via the handy terminal 4. Then, by acquiring dialogue scenario information corresponding to the customer's stay state, which changes over a relatively short time, a dialogue suited to the customer's state, for example the state before ordering or the state of eating and drinking, can be carried out, and customer satisfaction can be increased.
In the above description, the stay state transitions when the store staff enter data from the handy terminal 4, but this is not restrictive. For example, after the user's order has been confirmed, the state may transition to the eating-and-drinking stay state once the average waiting time until the food and drink ordered by the user are served has elapsed. Also, for example, after the serving of the food and drink has been completed, the state may automatically transition to the stay state in which the user has finished eating and drinking after a predetermined time such as 30 or 45 minutes.
In particular, when an order is received, the store staff operate the handy terminal 4 and therefore do not forget to update the stay state, but the other stay information requires the store staff to check the situation and operate the handy terminal 4, so the stay information may sometimes not be updated via the handy terminal 4. In such cases, it is effective to have the stay state transition automatically after a predetermined time has elapsed.
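A timer-based fallback of the kind described here could look roughly like the following. The waiting times are arbitrary examples and the function names are assumptions; the idea is simply that, if no stay information arrives within a set time, the state advances on its own.

```python
# Illustrative timeout-driven transition; durations and names are assumptions.
from datetime import datetime, timedelta

AUTO_ADVANCE = {
    # current state: (next state, advance automatically after this much time)
    "P2_waiting_for_food": ("P3_eating", timedelta(minutes=15)),   # assumed average serving time
    "P3_eating": ("P4_before_leaving", timedelta(minutes=45)),     # assumed meal duration
}

def maybe_advance(state: str, entered_at: datetime, now: datetime) -> str:
    """Advance the stay state automatically if no update arrived in time."""
    if state in AUTO_ADVANCE:
        next_state, timeout = AUTO_ADVANCE[state]
        if now - entered_at >= timeout:
            return next_state
    return state

entered = datetime(2018, 3, 23, 12, 0)
print(maybe_advance("P2_waiting_for_food", entered, datetime(2018, 3, 23, 12, 20)))  # P3_eating
```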
(First modification)
FIG. 10 is a block diagram showing the functions of the scenario control unit 34 according to a first modification. As shown in FIG. 10, the scenario control unit 34 includes the reception unit 341, the stay state determination unit 342, the information acquisition unit 343, and the dialogue control unit 345, but, unlike the embodiment described above, does not have the recommended menu providing unit 344. As described above, the recommended menu providing unit 344 is not an essential component of the scenario control unit 34.
In this case, in step S14 in FIG. 4, the information acquisition unit 343 acquires the scenario data without obtaining recommended menu information. By performing the subsequent steps in the same way as in the embodiment described above, the user can, via the scenario control unit 34, have a dialogue with the robot 2 suited to the user's stay state. If a recommended menu is still to be suggested in this configuration, scenario data that includes the recommended menu may be stored in the scenario DB 33.
According to this modification, the structure of the scenario control unit can be made simpler than in the embodiment described above. The store may select the structure shown in FIG. 2 or the structure shown in FIG. 10 as appropriate, depending on the situation of the scenario DB 33 and the scenario control unit 34.
(Second modification)
FIG. 11 is a block diagram showing an example of the robot control system 1 according to a second modification. In the embodiment described above, the scenario DB 33 is installed in the operation terminal 3, but this is not restrictive. For example, as shown in FIG. 11, the scenario DB 33 may be located outside the operation terminal 3, and the scenario control unit 34 is then connected to the scenario DB 33 via a network 6.
The scenario DB 33 may be provided on a predetermined server outside the operation terminal 3, or it may be a database (including a distributed database) that exists not on a specific server but on a so-called cloud connected via the network 6. The network 6 and the operation terminal 3 or the scenario DB 33 are connected via wired or wireless communication.
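Retrieving scenario data from such an external scenario DB could, for instance, go through an HTTP request as in the sketch below. The endpoint URL, query parameter, and JSON response shape are purely hypothetical assumptions; the specification only requires that the scenario control unit 34 reach the scenario DB 33 over the network 6.

```python
# Hypothetical sketch of fetching scenario data over the network; the URL and
# response format are assumptions and do not describe any real service.
import json
import urllib.request

SCENARIO_DB_URL = "https://scenario-db.example.com/scenarios"  # hypothetical endpoint

def fetch_scenario_group(stay_state: str) -> list:
    """Ask the external scenario DB for the scenario group of a stay state."""
    url = f"{SCENARIO_DB_URL}?stay_state={stay_state}"
    with urllib.request.urlopen(url) as response:   # over the wired or wireless network 6
        return json.loads(response.read().decode("utf-8"))

# Example call (would require the hypothetical server to exist):
# scenarios = fetch_scenario_group("P1_before_order")
```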
According to this modification, by having the scenario DB 33 outside the operation terminal 3 rather than inside it, the storage capacity of the operation terminal 3 can be preserved and the scenario data can be managed centrally. Furthermore, by having the scenario DB 33 exist on the cloud and setting it up so that a number of designated editors can edit or add scenario data, a wider variety of dialogue content can be selected.
(Third modification)
FIG. 12 is a diagram showing the display unit 35 during a dialogue according to a third modification. In the embodiment described above, the robot 2 was an essential component, but this modification is an example in which the robot 2 does not physically exist.
For example, as shown in FIG. 12, a robot that takes the place of the robot 2 may be drawn on the display unit 35 of the operation terminal 3. In this way, the user can have a dialogue with a virtual robot displayed in the operation terminal 3. In this case, a plurality of candidate robots to be displayed may be set in advance so that the user can select a preferred robot as the dialogue partner.
The display on the display unit 35 is not limited to this. For example, the robot itself need not be displayed on the display unit 35; that is, the robot-side scenario data may be displayed as text. As a more specific example, the dialogue with the virtual robot may be carried out using speech-bubble text or the like.
In another example, using AR (Augmented Reality) technology, the table at which the user is seated may be shown on the display unit 35 by a camera mounted on the operation terminal 3, so that, via the display unit 35, the robot 2 appears to be present on the table. An AR marker may be provided at a predetermined position on or around the table, or markerless AR may be used. In this case, as in the example above, a plurality of candidate robots to be displayed as AR may be set in advance so that the user can select a preferred robot as the dialogue partner.
According to this modification, the robot 2 is not essential, and the user can enjoy a dialogue according to the stay state using the operation terminal 3. This reduces cost and also allows the table used by the user to be used more freely. In these cases, the voice of the robot may be produced from the operation terminal 3. When the user can select the robot to be displayed, the robot may speak with a voice pitch, voice quality, tone, and so on changed according to the selected robot.
 (第4変形例)
 図13は、第4変形例に係るロボット制御システム1の構成を示す図である。この図13に示すように、動作生成部32、シナリオ制御部34、及び、シナリオDB 33は、操作端末3ではなく、ロボット2内に備えられているものとしてもよい。すなわち、主要な制御機能を操作端末3ではなく、ロボット2に備えるようにしてもよい。動作生成部32、及び、シナリオ制御部34の動作は、上述した実施形態における動作と大きく異なることはなく、本変形例においても、図4に示すフローチャートにしたがって対話の処理が実行される。
(Fourth modification)
FIG. 13 is a diagram illustrating a configuration of the robot control system 1 according to the fourth modification. As illustrated in FIG. 13, the motion generation unit 32, the scenario control unit 34, and the scenario DB 33 may be provided in the robot 2 instead of the operation terminal 3. That is, the main control function may be provided not in the operation terminal 3 but in the robot 2. The operations of the operation generation unit 32 and the scenario control unit 34 are not significantly different from the operations in the above-described embodiment, and in this modification as well, the dialogue process is executed according to the flowchart shown in FIG.
 According to this modification, the configuration of the operation terminal 3 can be simplified and the amount of communication transmitted from the operation terminal 3 to the robot 2 can be reduced, so that a dialogue with the robot 2 can be carried out without difficulty even when the performance of the communication line between the operation terminal 3 and the robot 2 is low. The motion generation unit 32 may also be provided outside the operation terminal 3 and the robot 2; for example, it may be built into the store system 5 or provided in a communication device separate from the store system 5.
 In all of the embodiments and modifications described above, a dialogue has been described as a state in which the user or the robot 2 is outputting some kind of information, but the dialogue is not limited to this. For example, while the customer is eating there is little need to converse with the robot 2, so in such a case a silent scenario may be prepared in advance and output.
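 A sketch of what a silent scenario could amount to in practice, assuming a simple stay-state enumeration and scenario table of this illustration's own making: the entry for the eating state is simply empty, so the dialogue control emits no utterance.

```python
from enum import Enum, auto

class StayState(Enum):
    BEFORE_ORDER = auto()
    WAITING_FOR_FOOD = auto()
    EATING = auto()
    AFTER_MEAL = auto()

SCENARIOS = {
    StayState.BEFORE_ORDER: ["Welcome! Today's recommendation is the grilled salmon set."],
    StayState.WAITING_FOR_FOOD: ["Your meal will arrive shortly."],
    StayState.EATING: [],  # silent scenario: nothing is spoken during the meal
    StayState.AFTER_MEAL: ["Thank you for coming. We hope to see you again."],
}

def next_utterances(state: StayState) -> list:
    return SCENARIOS.get(state, [])

for line in next_utterances(StayState.EATING):
    print(line)  # prints nothing: the robot stays quiet while the customer eats
```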
 The robot control system 1 according to all of the embodiments and modifications described above can be suitably applied to food and drink services, as already described, but it may also be applied to various services other than food and drink services.
 The robot control system 1 according to all of the embodiments and modifications described above may be one in which software processing is executed on hardware by a program recorded in a computer having a CPU, a memory, and the like; each unit may be configured by an analog circuit or a digital circuit that executes its function; or the functions may be realized by a combination of a computer and circuits.
 The aspects of the present invention are not limited to the individual embodiments described above but include various modifications that those skilled in the art can conceive, and the effects of the present invention are not limited to those described above. That is, various additions, changes, and partial deletions are possible without departing from the conceptual idea and spirit of the present invention derived from the contents defined in the claims and their equivalents.
1 robot control system, 2 robot, 21 robot drive unit, 22 robot control unit, 3 operation terminal, 31 direction sensor, 32 motion generation unit, 33 scenario DB, 34 scenario control unit, 341 reception unit, 342 stay state determination unit, 343 information acquisition unit, 344 recommended menu providing unit, 345 dialogue control unit, 35 display unit, 36 input unit, 37 voice output unit, 4 handy terminal, 5 store system, 6 communication network

Claims (11)

  1.  A dialogue control device comprising:
     a stay state determination unit that determines a stay state, within a restaurant, of a customer who has entered the restaurant;
     an information acquisition unit that acquires dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit; and
     a dialogue control unit that conducts a dialogue with the customer based on the dialogue scenario information acquired by the information acquisition unit.
  2.  The dialogue control device according to claim 1, wherein the stay state determination unit determines the stay state of the customer in the restaurant based on information from a store system.
  3.  The dialogue control device according to claim 2, further comprising a reception unit that receives, from the store system, a signal indicating that the customer has ordered food and drink,
     wherein the stay state determination unit determines, when the signal is received by the reception unit, that the customer is waiting for the ordered food and drink.
  4.  The dialogue control device according to claim 1 or 2, wherein the stay state of the customer determined by the stay state determination unit is a state that changes with the passage of time.
  5.  The dialogue control device according to claim 4, wherein the stay state of the customer determined by the stay state determination unit includes at least one of:
     a state from when the customer enters or is seated in the restaurant until the customer orders food and drink;
     a state of waiting for the food and drink ordered by the customer to be provided;
     a state until the customer completes eating and drinking of the food and drink; and
     a state from when the customer completes eating and drinking of the food and drink until the customer leaves the restaurant.
  6.  The dialogue control device according to any one of claims 1 to 5, further comprising a recommended menu providing unit that provides a recommended menu of food and drink to the customer when the customer has not ordered food and drink,
     wherein, in the dialogue with the customer regarding menu selection of food and drink, the information acquisition unit acquires different dialogue scenario information depending on whether the customer selects the recommended menu or selects another menu without selecting the recommended menu.
  7.  The dialogue control device according to any one of claims 1 to 6, further comprising a scenario storage unit that stores a plurality of different pieces of dialogue scenario information,
     wherein the information acquisition unit acquires, from the scenario storage unit, dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit.
  8.  The dialogue control device according to any one of claims 1 to 6, wherein the information acquisition unit acquires, via a communication network, dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit.
  9.  A robot control system comprising:
     a robot that performs an operation in accordance with an instruction signal; and
     an operation terminal that generates the instruction signal and transmits the instruction signal to the robot,
     wherein the operation terminal includes:
     a stay state determination unit that determines, based on information from a store system, a stay state, within a restaurant, of a customer who has entered the restaurant;
     an information acquisition unit that acquires dialogue scenario information corresponding to the stay state of the customer determined by the stay state determination unit; and
     a dialogue control unit that conducts a dialogue with the customer based on the dialogue scenario information acquired by the information acquisition unit.
  10.  The robot control system according to claim 9, wherein the operation terminal includes a reception unit that receives order information of the customer from the store system, and
     the stay state determination unit determines, when the order information is received by the reception unit, that the customer is waiting for the ordered food and drink.
  11.  The robot control system according to claim 10, wherein
     the store system wirelessly transmits, in addition to the order information, stay information including at least one of first information indicating that the customer has entered the restaurant, second information indicating that the customer has been seated, third information indicating that the ordered food and drink have been provided to the customer, and fourth information indicating that the customer has completed eating and drinking the provided food and drink,
     the reception unit of the operation terminal receives the order information and the stay information transmitted by the store system, and
     the stay state determination unit determines the stay state of the customer based on the order information and the stay information received by the reception unit.
PCT/JP2018/011914 2017-03-24 2018-03-23 Conversation control system, and robot control system WO2018174289A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-059967 2017-03-24
JP2017059967A JP2018161704A (en) 2017-03-24 2017-03-24 Dialogue control system, and robot control system

Publications (1)

Publication Number Publication Date
WO2018174289A1 true WO2018174289A1 (en) 2018-09-27

Family

ID=63584539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/011914 WO2018174289A1 (en) 2017-03-24 2018-03-23 Conversation control system, and robot control system

Country Status (2)

Country Link
JP (1) JP2018161704A (en)
WO (1) WO2018174289A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111104494A (en) * 2018-10-25 2020-05-05 丰田自动车株式会社 Dialogue device and control program for dialogue device
CN111390924A (en) * 2020-04-03 2020-07-10 上海明略人工智能(集团)有限公司 Meal delivery robot control method and device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6666400B1 (en) 2018-09-10 2020-03-13 Telexistence株式会社 Robot control device, robot control method and robot control system
US20230030633A1 (en) * 2021-07-28 2023-02-02 Bear Robotics, Inc. Method, system, and non-transitory computer-readable recording medium for controlling a serving robot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008149427A (en) * 2006-12-19 2008-07-03 Mitsubishi Heavy Ind Ltd Method of acquiring information necessary for service of moving object by robot, and the object movement service system by the robot
JP2010064154A (en) * 2008-09-08 2010-03-25 Nec Corp Robot control system, remote management device, robot, robot control and remote management methods, and robot control and remote management programs
EP3144876A1 (en) * 2015-09-17 2017-03-22 Mastercard International Incorporated Systems and methods for interactive order and payment processing for restaurants

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008149427A (en) * 2006-12-19 2008-07-03 Mitsubishi Heavy Ind Ltd Method of acquiring information necessary for service of moving object by robot, and the object movement service system by the robot
JP2010064154A (en) * 2008-09-08 2010-03-25 Nec Corp Robot control system, remote management device, robot, robot control and remote management methods, and robot control and remote management programs
EP3144876A1 (en) * 2015-09-17 2017-03-22 Mastercard International Incorporated Systems and methods for interactive order and payment processing for restaurants

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Pepper Programming for Creators and Developers", SOFTBANK ROBORICS CORP., 28 September 2015 (2015-09-28), pages 185 - 201, 226-254 *
SUGAWARA YU: "Implementation and Evaluation of visiting service and the seat guide in Robot Café using PRINTEPS", THE 30TH ANNUAL CONFERENCE OF THE JAPANESE SOCIETY FOR ARTIFICIAL INTELLIGENCE, 6 June 2016 (2016-06-06), pages 1 - 4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111104494A (en) * 2018-10-25 2020-05-05 丰田自动车株式会社 Dialogue device and control program for dialogue device
CN111390924A (en) * 2020-04-03 2020-07-10 上海明略人工智能(集团)有限公司 Meal delivery robot control method and device

Also Published As

Publication number Publication date
JP2018161704A (en) 2018-10-18

Similar Documents

Publication Publication Date Title
WO2018174289A1 (en) Conversation control system, and robot control system
Pearl Designing voice user interfaces: Principles of conversational experiences
US20220020360A1 (en) System and method for dialogue management
US20190206402A1 (en) System and Method for Artificial Intelligence Driven Automated Companion
CN102292766B (en) Method and apparatus for providing compound models for speech recognition adaptation
US20140036022A1 (en) Providing a conversational video experience
WO2017200072A1 (en) Dialog method, dialog system, dialog device, and program
CN105284107A (en) Device, system, and method, and computer-readable medium for providing interactive advertising
US11267121B2 (en) Conversation output system, conversation output method, and non-transitory recording medium
WO2015077398A1 (en) Adaptive virtual intelligent agent
US20190205390A1 (en) System and Method for Learning Preferences in Dialogue Personalization
JPH11511859A (en) Educational and entertainment device with dynamic configuration and operation
CN108337380A (en) Adjust automatically user interface is for hands-free interaction
JP7309889B2 (en) Dynamic text message processing for endpoint communication channel selection
JP7207425B2 (en) Dialog device, dialog system and dialog program
CN109471440A (en) Robot control method, device, smart machine and storage medium
US10255266B2 (en) Relay apparatus, display apparatus, and communication system
JP2018084998A (en) Customer service system and customer service method
CN111062728B (en) Queuing optimization method and device for manual online consultation
WO2018174287A1 (en) Store management system
WO2018174285A1 (en) Conversation control device and conversation system
WO2018174290A1 (en) Conversation control system, and robot control system
JP2001249924A (en) Automatic interactive explanation device, automatic interactive explanation method and recording medium having execution program of the method recorded thereon
JP2018161709A (en) Dialogue control system and dialogue control device
JP2018161707A (en) Robot control system and robot control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18771513

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18771513

Country of ref document: EP

Kind code of ref document: A1