WO2021229692A1 - Avatar control program, avatar control method and information processing device

Avatar control program, avatar control method and information processing device

Info

Publication number
WO2021229692A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
avatar
information
processing terminal
user
Prior art date
Application number
PCT/JP2020/019006
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
慎二 生川
和宏 中村
Original Assignee
Fujitsu Limited
Priority date
Filing date
Publication date
Application filed by Fujitsu Limited
Priority to PCT/JP2020/019006 priority Critical patent/WO2021229692A1/ja
Priority to JP2022522145A priority patent/JP7371770B2/ja
Publication of WO2021229692A1 publication Critical patent/WO2021229692A1/ja

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services

Definitions

  • the present invention relates to an avatar control program, an avatar control method, and an information processing device.
  • A chatbot is a computer system that uses artificial intelligence to interact with humans and exchange messages. Further, in a conversation with a chatbot, the conversation may be switched from the chatbot response to a manned response when a specific condition is satisfied.
  • As prior art, there is a technique that acquires information about the user, estimates the user's state from the acquired information, generates information for making a character move and/or speak based on the estimation result, and outputs the generated information.
  • There is also a technique in which user attribute information acquired by an information processing terminal is received together with an instruction to start a conversation with a chatbot, and a conversation with the chatbot is started on the information processing terminal using script information determined based on the user attribute information.
  • the present invention aims to control the output of an avatar according to the external environment.
  • According to one aspect, an avatar control program is provided that outputs, to a second information processing terminal, an avatar that moves in response to an operation of a first user on a first information processing terminal; that, when a second user is detected at the second information processing terminal, outputs information prompting an operation corresponding to the avatar to the first information processing terminal; and that, when the operation corresponding to the output information is accepted at the first information processing terminal, controls the output of the avatar based on the accepted operation and the external environment information detected by the second information processing terminal.
  • FIG. 1 is an explanatory diagram showing a system configuration example of the conversation control system 100.
  • FIG. 2 is a block diagram showing a hardware configuration example of the information processing apparatus 101.
  • FIG. 3 is a block diagram showing a hardware configuration example of the information processing terminals 102 and 103.
  • FIG. 4 is an explanatory diagram showing an example of the stored contents of the option management table 120.
  • FIG. 5 is an explanatory diagram showing an example of the stored contents of the avatar management table 130.
  • FIG. 6 is an explanatory diagram showing an example of the stored contents of the script table 140.
  • FIG. 7 is a block diagram showing a functional configuration example of the information processing apparatus 101.
  • FIG. 8 is an explanatory diagram showing an embodiment of the conversation control system 100.
  • FIG. 9 is an explanatory diagram (No. 1) showing a screen example of a talk screen displayed on the second information processing terminal 103.
  • FIG. 10 is an explanatory diagram showing a screen example of an operator support screen displayed on the first information processing terminal 102.
  • FIG. 11 is an explanatory diagram (No. 2) showing a screen example of a talk screen displayed on the second information processing terminal 103.
  • FIG. 12 is a flowchart (No. 1) showing an example of the avatar control processing procedure of the information processing apparatus 101.
  • FIG. 13 is a flowchart (No. 2) showing an example of the avatar control processing procedure of the information processing apparatus 101.
  • FIG. 14 is a flowchart showing an example of a specific processing procedure of the avatar output change processing.
  • the conversation control system 100 is applied to services such as store guidance and career counseling in a company by, for example, automatic response by a chatbot or manned response by an operator.
  • FIG. 1 is an explanatory diagram showing a system configuration example of the conversation control system 100.
  • the conversation control system 100 includes an information processing device 101, a first information processing terminal 102, and a second information processing terminal 103.
  • the information processing device 101, the first information processing terminal 102, and the second information processing terminal 103 are connected via a wired or wireless network 110.
  • the network 110 is, for example, a LAN (Local Area Network), a WAN (Wide Area Network), the Internet, or the like.
  • the conversation control system 100 includes, for example, a plurality of first information processing terminals 102 and a plurality of second information processing terminals 103.
  • the information processing device 101 is a computer that controls the output of the avatar.
  • the avatar is a character that moves in response to an operation of the first user in the first information processing terminal 102, and is displayed as, for example, an operator's alter ego in a virtual space on a network.
  • the information processing device 101 is, for example, a server.
  • the information processing device 101 has, for example, an option management table 120, an avatar management table 130, and a script table 140.
  • the stored contents of the option management table 120, the avatar management table 130, and the script table 140 will be described later with reference to FIGS. 4 to 6.
  • the first information processing terminal 102 is a computer used by the first user, and can operate the avatar displayed on the second information processing terminal 103.
  • the first user is, for example, an operator who responds to a user who uses the second information processing terminal 103.
  • the first user can have a conversation with the second user through the avatar displayed on the second information processing terminal 103.
  • the first information processing terminal 102 is, for example, a PC (Personal Computer), a tablet PC, or the like.
  • the second information processing terminal 103 is a computer used by the second user.
  • the second information processing terminal 103 is, for example, a digital board installed in a store, a facility, or the like and used by an unspecified number of people. Further, the second information processing terminal 103 may be a smartphone, a tablet PC, or the like rented out at a store or facility, or owned by an individual user.
  • the facility is, for example, a commercial facility, a corporate facility, a public facility, or the like.
  • the second information processing terminal 103 can be used, for example, when the second user has a conversation with the chatbot.
  • the second user can search for a desired answer or use a service while interacting with the chatbot or the operator, for example, by inputting messages or selecting options.
  • the answer is, for example, an FAQ (Frequently Asked Questions) entry.
  • an FAQ is a collection of expected questions and their answers prepared in advance.
  • a question may ask how to solve some problem, or it may be a conversational utterance. Answers include those indicating how to solve the problem being asked and responses to conversational utterances.
  • the information processing apparatus 101 has, for example, a FAQ master, a chat log DB (Database), and the like.
  • the FAQ master stores the FAQ.
  • the chat log DB stores the chat log.
  • the chat log is a conversation history related to a conversation with a second user.
  • the chat log is stored in the chat log DB in association with the room ID, for example.
  • the room ID can identify a series of conversations with chatbots and operators.
  • the avatar makes movements such as waving, nodding, and laughing according to the operator's operations.
  • the movement of the avatar can be changed according to the content of the response to the user.
  • the preferred movement of the avatar may differ depending on the time, place, weather, and so on.
  • In the present embodiment, an avatar control method is described in which the movement of the avatar is controlled in consideration of not only the content of the response to the user but also the external environment such as time, place, and weather, realizing a wide variety of avatar movements.
  • FIG. 2 is a block diagram showing a hardware configuration example of the information processing apparatus 101.
  • the information processing apparatus 101 includes a CPU (Central Processing Unit) 201, a memory 202, a disk drive 203, a disk 204, a communication I/F (Interface) 205, a portable recording medium I/F 206, and a portable recording medium 207. Each component is connected by a bus 200.
  • the CPU 201 controls the entire information processing device 101.
  • the CPU 201 may have a plurality of cores.
  • the memory 202 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a flash ROM, and the like. Specifically, for example, the flash ROM stores the OS (Operating System) program, the ROM stores application programs, and the RAM is used as the work area of the CPU 201.
  • the program stored in the memory 202 is loaded into the CPU 201 to cause the CPU 201 to execute the coded process.
  • the disk drive 203 controls data read / write to the disk 204 according to the control of the CPU 201.
  • the disk 204 stores the data written under the control of the disk drive 203. Examples of the disk 204 include a magnetic disk and an optical disk.
  • the communication I / F 205 is connected to the network 110 through a communication line, and is connected to an external computer (for example, the first information processing terminal 102 and the second information processing terminal 103 shown in FIG. 1) via the network 110.
  • the communication I / F 205 controls the interface between the network 110 and the inside of the device, and controls the input / output of data from an external computer.
  • a modem, a LAN adapter, or the like can be adopted for the communication I / F 205.
  • the portable recording medium I / F 206 controls data read / write to the portable recording medium 207 according to the control of the CPU 201.
  • the portable recording medium 207 stores data written under the control of the portable recording medium I / F 206.
  • Examples of the portable recording medium 207 include a CD (Compact Disc)-ROM, a DVD (Digital Versatile Disc), and a USB (Universal Serial Bus) memory.
  • the information processing device 101 may have, for example, an SSD (Solid State Drive), an input device, a display, or the like, in addition to the above-mentioned components. Further, the information processing apparatus 101 may not have, for example, a disk drive 203, a disk 204, a portable recording medium I / F 206, and a portable recording medium 207 among the above-mentioned components.
  • FIG. 3 is a block diagram showing a hardware configuration example of the information processing terminals 102 and 103.
  • the information processing terminals 102 and 103 each include a CPU 301, a memory 302, a display 303, an input device 304, a communication I/F 305, a camera 306, a speaker 307, a microphone 308, and a GPS (Global Positioning System) unit 309. Each component is connected by a bus 300.
  • the CPU 301 controls the entire information processing terminals 102 and 103.
  • the CPU 301 may have a plurality of cores.
  • the memory 302 is a storage unit having, for example, a ROM, a RAM, a flash ROM, and the like. Specifically, for example, a flash ROM or ROM stores various programs, and RAM is used as a work area of CPU 301.
  • the program stored in the memory 302 is loaded into the CPU 301 to cause the CPU 301 to execute the coded process.
  • the display 303 is a display device that displays data such as a cursor, an icon, a toolbox, a document, an image, and functional information.
  • As the display 303, for example, a liquid crystal display, an organic EL (Electroluminescence) display, or the like can be adopted.
  • the input device 304 has keys for inputting characters, numbers, various instructions, etc., and inputs data.
  • the input device 304 may be a touch panel type input pad, a numeric keypad, or the like, or may be a keyboard, a mouse, or the like.
  • the communication I/F 305 is connected to the network 110 through a communication line and is connected, via the network 110, to external computers (for example, the information processing apparatus 101, the first information processing terminal 102, and the second information processing terminal 103 shown in FIG. 1). The communication I/F 305 controls the interface between the network 110 and the inside of the own device, and controls the input and output of data to and from external devices.
  • the camera 306 is a photographing device that captures an image (still image or moving image) and outputs image data.
  • the speaker 307 converts an electric signal into voice and outputs the voice.
  • the microphone 308 receives voice and converts it into an electrical signal.
  • the GPS unit 309 receives radio waves from GPS satellites and outputs the position information of the information processing terminals 102 and 103.
  • the position information of the information processing terminals 102 and 103 is information for specifying one point on the earth such as latitude and longitude.
  • As the GPS satellite, for example, a satellite of the quasi-zenith satellite system may be used.
  • the information processing terminals 102 and 103 may further include, for example, an HDD (Hard Disk Drive), an SSD, a short-range wireless communication I/F, a portable recording medium I/F, a portable recording medium, and the like.
  • the second information processing terminal 103 does not have to have the GPS unit 309.
  • the various tables 120, 130, and 140 are realized by, for example, a storage device such as the memory 202 and the disk 204 shown in FIG. 2.
  • FIG. 4 is an explanatory diagram showing an example of the stored contents of the option management table 120.
  • the option management table 120 has fields for conversation content, option 1, option 2, option 3, option 4, and option 5; by setting information in each field, option management information (for example, option management information 400-1 to 400-3) is stored as a record.
  • the conversation content is information that identifies the content of the conversation with the operator through the avatar and the conversation with the chatbot.
  • the conversation content indicates selected options, input information, and the like in a conversation with an operator through an avatar or a conversation with a chatbot.
  • Options 1 to 5 are options corresponding to the avatar and are prepared in association with the conversation content. Options 1 to 5 are presented, for example, to an operator who can operate the avatar. That is, the options presented to the operator are switched according to the content of the conversation with the operator through the avatar and the conversation with the chatbot.
  • the option management information 400-1 indicates options ch1 to ch4 corresponding to the conversation content C1.
  • Here, a case where the maximum number of options corresponding to a conversation content is five is described as an example, but the present invention is not limited to this; the maximum number of options may be four or less, or six or more.
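  • The following is an illustrative sketch (not part of the patent text) of how the option management table 120 could be realized, here as a simple Python mapping from conversation content to the options presented to the operator. Only the C1 row follows the option management information 400-1 described above; the other rows and all identifiers are assumptions.

```python
# Illustrative sketch of the option management table 120 (Python 3.9+).
# Only the "C1" row reflects the example in the description
# (option management information 400-1); the other rows are assumed.
OPTION_MANAGEMENT_TABLE = {
    "C1": ["ch1", "ch2", "ch3", "ch4"],  # options for conversation content C1
    "C2": ["ch1", "ch5"],                # assumed example row
    "C3": ["ch2", "ch3", "ch4", "ch5"],  # assumed example row
}

def lookup_options(conversation_content: str) -> list[str]:
    """Return the avatar-related options associated with a conversation content.

    The second output control unit would present these options to the
    operator on the first information processing terminal 102.
    """
    return OPTION_MANAGEMENT_TABLE.get(conversation_content, [])

print(lookup_options("C1"))  # ['ch1', 'ch2', 'ch3', 'ch4']
```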
  • FIG. 5 is an explanatory diagram showing an example of the stored contents of the avatar management table 130.
  • the avatar management table 130 has fields for operator ID, option, face data, external environment, and body data; by setting information in each field, avatar management information (for example, avatar management information 500-1 and 500-2) is stored as a record.
  • the options correspond to the options that are selectively displayed on the first information processing terminal 102 (for example, options 1 to 5 shown in FIG. 4).
  • the face data is information that identifies the appearance of the face portion of the avatar.
  • face data is prepared in association with the options.
  • the external environment represents the external environment in the second information processing terminal 103 on which the avatar is displayed.
  • the external environment represents time (time, time zone, date, etc.), place, season, weather, temperature, humidity, and the like.
  • Body data is information that identifies the appearance of the body part of the avatar.
  • body data is prepared in association with the external environment.
  • the operator ID is an identifier that uniquely identifies the operator corresponding to the option.
  • the operator ID may be an identifier that uniquely identifies the first information processing terminal 102 used by the operator.
  • the avatar management information 500-1 indicates the face data fc1 and the operator OP1 corresponding to the option ch1 and the body data bd1 corresponding to the external environment E1.
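  • As an illustrative sketch (not part of the patent text), the avatar management table 130 could be modeled as records in which face data follows the selected option and body data follows the external environment. The two records below mirror the avatar management information 500-1 and 500-2 described above; all Python identifiers are assumptions.

```python
from dataclasses import dataclass

@dataclass
class AvatarManagementRecord:
    operator_id: str   # operator corresponding to the option (e.g. OP1)
    option: str        # option displayed on the first information processing terminal
    face_data: str     # identifies the appearance of the avatar's face portion
    external_env: str  # external environment at the second information processing terminal
    body_data: str     # identifies the appearance of the avatar's body portion

# Mirrors avatar management information 500-1 and 500-2.
AVATAR_MANAGEMENT_TABLE = [
    AvatarManagementRecord("OP1", "ch1", "fc1", "E1", "bd1"),
    AvatarManagementRecord("OP2", "ch1", "fc1", "E2", "bd2"),
]

def find_avatar_record(option: str, external_env: str) -> AvatarManagementRecord | None:
    """Look up the record whose option and external environment both match."""
    for record in AVATAR_MANAGEMENT_TABLE:
        if record.option == option and record.external_env == external_env:
            return record
    return None
```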
  • FIG. 6 is an explanatory diagram showing an example of the stored contents of the script table 140.
  • the script table 140 has fields for user attribute and script; by setting information in each field, script management information (for example, script management information 600-1 to 600-3) is stored as a record.
  • the user attribute is information representing the characteristics of the user.
  • user attributes represent, for example, age group, age, gender, language used, clothing, presence or absence of a companion, and the like.
  • the script is information (scenario) that defines the conversation flow of the chatbot, and is prepared in association with the user attribute.
  • the script is the information that defines the chat process.
  • the talk process is a process for realizing the operation (behavior) of the chatbot.
  • the talk process includes, for example, a process of speaking, a process of displaying options, a process of accepting an option selected by a user operation, a process of performing a procedure according to the selected option or input information, a process of ending the talk process, and so on.
  • a script corresponding to the user attribute is prepared in advance.
  • the FAQ that is often selected tends to change depending on the age.
  • users in their thirties and forties tend to choose FAQs for career advancement compared to users in other age groups.
  • the FAQ that is often selected tends to change depending on the language. For example, when taking a course, people who speak a language other than Japanese (for example, foreigners) tend to choose FAQs related to Japanese courses compared to people who speak Japanese.
  • the script management information 600-1 indicates the script SP1 corresponding to the user attribute UA1.
  • the FAQ that is often selected may change depending on the time and place. Therefore, in the script table 140, a script corresponding to the combination of the user attribute and the external environment may be prepared in advance.
  • FIG. 7 is a block diagram showing a functional configuration example of the information processing apparatus 101.
  • the information processing apparatus 101 includes a first output control unit 701, a second output control unit 702, a communication unit 703, a conversation control unit 704, and a storage unit 710.
  • the functions of the first output control unit 701 to the conversation control unit 704 are realized by causing the CPU 201 to execute programs stored in a storage device such as the memory 202, the disk 204, or the portable recording medium 207 shown in FIG. 2, or by the communication I/F 205.
  • the processing result of each functional unit is stored in a storage device such as a memory 202 or a disk 204, for example.
  • the storage unit 710 is realized by, for example, a storage device such as the memory 202 or the disk 204. Specifically, for example, the storage unit 710 stores the option management table 120 shown in FIG. 4, the avatar management table 130 shown in FIG. 5, the script table 140 shown in FIG. 6, the FAQ master, the chat log DB, and the like.
  • the first output control unit 701 outputs an avatar that moves in response to an operation of the first user in the first information processing terminal 102 to the second information processing terminal 103.
  • the first user is, for example, an operator who handles a user who uses the second information processing terminal 103.
  • An avatar is, for example, a character that is displayed as an alter ego of an operator.
  • the first output control unit 701 outputs the avatar to the second information processing terminal 103, for example, in response to the activation of the second information processing terminal 103. Further, the first output control unit 701 may output the avatar to the second information processing terminal 103 in response to an activation operation performed using the input device 304 (see FIG. 3) of the second information processing terminal 103.
  • the first output control unit 701 outputs an avatar message or an option to the second information processing terminal 103 in response to an operation of the first user in the first information processing terminal 102, for example.
  • the message or option of the avatar may be displayed on the display 303 (see FIG. 3), or may be output by voice from the speaker 307 (see FIG. 3).
  • For example, when the first user inputs a message in Japanese, the first output control unit 701 may translate the input message into the foreign language selected on the second information processing terminal 103 and output it to the second information processing terminal 103. Further, when the second user inputs a message in a foreign language, the first output control unit 701 may translate the input message into Japanese and output it to the first information processing terminal 102.
  • the first output control unit 701 may change the facial expression of the avatar according to the movement of the face of the operator (speaker) detected by the motion sensor, for example. It should be noted that what kind of avatar is output when the second information processing terminal 103 is started is set in advance, for example.
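  • The translation behavior described above might look like the following sketch; the translate() stub stands in for whatever machine translation service is actually used, and all names are assumptions rather than the patent's API.

```python
def translate(text: str, source_lang: str, target_lang: str) -> str:
    """Placeholder for a machine translation service call."""
    if source_lang == target_lang:
        return text
    return f"[{target_lang}] {text}"  # stub: a real system would translate here

def relay_operator_message(message: str, terminal_language: str) -> str:
    """Translate the operator's Japanese input into the language selected on
    the second information processing terminal 103 before outputting it."""
    return translate(message, source_lang="ja", target_lang=terminal_language)

def relay_user_message(message: str, terminal_language: str) -> str:
    """Translate the second user's foreign-language input into Japanese
    before outputting it to the first information processing terminal 102."""
    return translate(message, source_lang=terminal_language, target_lang="ja")
```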
  • when the second output control unit 702 detects the second user at the second information processing terminal 103, it outputs information prompting an operation corresponding to the avatar to the first information processing terminal 102.
  • the second user is a user who uses the second information processing terminal 103.
  • when the second information processing terminal 103 is, for example, a digital board installed in a commercial facility or a store, the second user is, for example, a customer.
  • when the second information processing terminal 103 is a digital board installed in a corporate facility, the second user is, for example, an employee. Further, for example, when the second information processing terminal 103 is a digital board installed in a public facility, the second user is, for example, a tourist.
  • the second user is detected, for example, in the second information processing terminal 103.
  • the second information processing terminal 103 detects a second user when a person is detected from the image information taken by the camera 306 (see FIG. 3) of the own terminal. Further, the second information processing terminal 103 may detect a second user when a person is detected by a temperature sensor (not shown).
  • Further, the second information processing terminal 103 may detect the second user when a conversation start button or the like is pressed by a user operation using the input device 304 (see FIG. 3) of the own terminal.
  • when the second information processing terminal 103 detects the second user, it transmits information indicating that the second user has been detected to the information processing apparatus 101.
  • the information indicating that the second user has been detected may be transmitted from the second information processing terminal 103 to the information processing apparatus 101, for example, as a chat start instruction.
  • the chat start instruction is for starting a conversation with the operator through the avatar. Further, the chat start instruction may be for starting a conversation in the chatbot.
  • the chat start instruction from the second information processing terminal 103 is received by the communication unit 703.
  • the second output control unit 702 determines that, for example, when the communication unit 703 receives the chat start instruction, the second user in the second information processing terminal 103 is detected. Thereby, in the information processing apparatus 101, it can be determined that the second user in the second information processing terminal 103 has been detected.
  • the second user detection process may be performed by the information processing apparatus 101.
  • Specifically, for example, the information processing apparatus 101 may receive the image information taken by the camera 306 from the second information processing terminal 103 and detect the second user by detecting a person in the received image information.
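  • As a sketch of the information flow (the message format is an assumption; the description only specifies what is conveyed), the chat start instruction sent from the second information processing terminal 103 to the information processing apparatus 101 might carry the detection event together with the attribute information and external environment information discussed in the following paragraphs:

```python
import json
import time

def build_chat_start_instruction(user_attributes: dict, external_environment: dict) -> str:
    """Assemble a chat start instruction for the information processing
    apparatus 101. All field names are illustrative assumptions."""
    instruction = {
        "type": "chat_start",                          # a second user was detected
        "detected_at": time.time(),                    # detection timestamp
        "user_attributes": user_attributes,            # e.g. estimated via camera/microphone
        "external_environment": external_environment,  # e.g. sensors, GPS, weather server
    }
    return json.dumps(instruction, ensure_ascii=False)

payload = build_chat_start_instruction(
    {"age_group": "30s", "gender": "male", "language": "ja"},
    {"time_zone": "daytime", "place": "corporate facility"},
)
```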
  • the operation corresponding to the avatar is, for example, an operation of specifying the movement of the avatar.
  • the movement of the avatar includes, for example, waving, nodding, bowing, and rejoicing.
  • the operation corresponding to the avatar may be an operation of designating an operator who responds to the user through the avatar.
  • the operation corresponding to the avatar may be an operation for designating the avatar.
  • the information prompting the operation corresponding to the avatar is, for example, an option for specifying what kind of movement the avatar should be made, or an option for specifying an operator who responds to the user through the avatar.
  • the information prompting the operation corresponding to the avatar may include a message prompting the user to select one of the options.
  • the information prompting the operation corresponding to the avatar output to the first information processing terminal 102 may be predetermined information.
  • options for specifying basic avatar actions such as waving, nodding, and rejoicing may be predetermined.
  • the second output control unit 702 may determine the information to be output based on the attribute information of the second user.
  • the information to be output is information that prompts an operation corresponding to the avatar output to the first information processing terminal 102.
  • the attribute information of the second user is information for specifying the attribute of the second user.
  • the attribute information of the second user is information that can identify, for example, the age group, age, gender, language used, clothing, presence or absence of a wheelchair, presence or absence of a companion, and the like of the second user.
  • the attribute information of the second user is acquired from the second information processing terminal 103.
  • For example, the communication unit 703 may receive, together with the chat start instruction from the second information processing terminal 103, the attribute information of the second user acquired by the second information processing terminal 103.
  • the second output control unit 702 specifies the user attribute based on the received attribute information of the second user.
  • Next, the second output control unit 702 refers to the storage unit 710, which stores information indicating the correspondence between user attributes and information prompting operations corresponding to the avatar, and determines the information prompting the operation corresponding to the specified user attribute as the information to be output.
  • the second output control unit 702 outputs the determined information of the output target to the first information processing terminal 102.
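  • A minimal sketch of this determination, assuming the correspondence information in the storage unit 710 can be modeled as a dictionary (the attribute keys, messages, and option IDs are illustrative, not from the patent):

```python
# Correspondence between user attributes and the information prompting an
# operation corresponding to the avatar (held in the storage unit 710).
PROMPT_BY_USER_ATTRIBUTE = {
    "UA1": {"message": "Select an operator to respond.", "options": ["op_a", "op_b"]},
    "UA2": {"message": "Select an avatar movement.", "options": ["wave", "nod"]},
}

# Predetermined fallback covering basic avatar movements.
DEFAULT_PROMPT = {"message": "Select an avatar movement.",
                  "options": ["wave", "nod", "rejoice"]}

def determine_prompt(user_attribute: str) -> dict:
    """Determine the information to output to the first information
    processing terminal 102 for the specified user attribute."""
    return PROMPT_BY_USER_ATTRIBUTE.get(user_attribute, DEFAULT_PROMPT)
```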
  • the attribute information of the second user may be input by, for example, a user operation using the input device 304.
  • In this case, the second information processing terminal 103 transmits the input attribute information of the second user to the information processing apparatus 101 together with the chat start instruction.
  • the second information processing terminal 103 may analyze the image information taken by the camera 306 to acquire attribute information such as the age, gender, clothing, presence or absence of a wheelchair, and presence or absence of a companion of the second user. Further, the second information processing terminal 103 may analyze the voice information received by the microphone 308 (see FIG. 3) to acquire attribute information such as the gender and language of the second user.
  • the communication unit 703 may receive the image information taken by the camera 306 and the voice information received by the microphone 308 as the attribute information of the second user.
  • the information processing apparatus 101 may acquire information representing a user's characteristics from, for example, received image information or voice information. That is, the process of analyzing the image information and the voice information may not be performed by the second information processing terminal 103, but may be performed by the information processing apparatus 101.
  • any existing technology may be used as a technology for acquiring information representing a user's characteristics from image information and audio information.
  • For example, the second information processing terminal 103 or the information processing apparatus 101 may use a method based on machine learning, such as deep learning, to extract information such as age, gender, clothing, language, presence or absence of a wheelchair, and presence or absence of a companion from the feature quantities of images and sounds.
  • the second output control unit 702 may determine the information to be output based on the information received in the conversation through the avatar in the second information processing terminal 103.
  • the information received in the conversation is information representing the content of the conversation between the first user and the second user through the avatar.
  • the information received in the conversation includes, for example, information indicating options selected by operations of the second user using the input device 304 (see FIG. 3) of the second information processing terminal 103, input information, and the like. That is, the second output control unit 702 determines the information to be output in consideration of the content of the manned conversation through the avatar at the second information processing terminal 103.
  • the conversation control unit 704 may start a conversation with the chatbot on the second information processing terminal 103. Specifically, for example, when the conversation control unit 704 receives a chat start instruction for starting a conversation with the chatbot, it determines the talk script to be applied to the conversation with the chatbot on the second information processing terminal 103.
  • the conversation control unit 704 specifies the user attribute based on the attribute information of the second user received from the second information processing terminal 103 together with the chat start instruction. Next, the conversation control unit 704 determines a script corresponding to the specified user attribute as a talk script with reference to the script table 140 shown in FIG.
  • the conversation control unit 704 starts a conversation with the chatbot on the second information processing terminal 103 using the determined talk script. Thereby, it is possible to control the conversation in the chatbot according to the characteristics of the second user who uses the second information processing terminal 103.
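  • A minimal sketch of this script determination, following the example above in which script SP1 corresponds to user attribute UA1, and the variation noted earlier in which a script may be prepared for a combination of user attribute and external environment; the remaining rows and the fallback logic are assumptions:

```python
# Script table 140: user attribute (optionally combined with the external
# environment) -> talk script. "UA1" -> "SP1" follows the description;
# the other rows and the fallback behavior are assumptions.
SCRIPT_TABLE = {
    "UA1": "SP1",
    "UA2": "SP2",
    ("30s x male", "place: corporate facility"): "SP_career",
}

def determine_talk_script(user_attribute: str, external_env: str | None = None) -> str | None:
    """Prefer a script prepared for the (attribute, environment) pair and
    fall back to the attribute-only script."""
    if external_env is not None:
        combined = SCRIPT_TABLE.get((user_attribute, external_env))
        if combined is not None:
            return combined
    return SCRIPT_TABLE.get(user_attribute)

print(determine_talk_script("UA1"))  # SP1
```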
  • the second output control unit 702 may determine the information to be output based on the information received in the conversation in the chatbot in the second information processing terminal 103. That is, the second output control unit 702 determines the information to be output in consideration of the conversation content in the chatbot in the second information processing terminal 103.
  • the second output control unit 702 specifies the conversation content based on the information received in the conversation through the avatar or the conversation in the chatbot. Then, the second output control unit 702 determines the option corresponding to the specified conversation content as the information to be output by referring to the option management table 120 shown in FIG.
  • the conversation content specified from the information received in the conversation through the avatar or the conversation in the chatbot is referred to as "conversation content C1".
  • the conversation content C1 is, for example, a combination of options selected in a conversation through an avatar or a conversation in a chatbot.
  • the second output control unit 702 determines, for example, the options ch1 to ch4 corresponding to the specified conversation content C1 as the information to be output by referring to the option management table 120.
  • the options ch1 to ch4 are, for example, options for designating the movement of the avatar, options for designating an operator who responds to the user through the avatar, and the like.
  • the information received in the conversation through the avatar or the conversation in the chatbot is specified, for example, from the chat log DB (not shown). Further, an output example of information for prompting an operation corresponding to the avatar, which is output to the first information processing terminal 102, will be described later with reference to FIG.
  • when the operation corresponding to the output information is accepted at the first information processing terminal 102, the first output control unit 701 controls the output of the avatar based on the accepted operation and the external environment information detected by the second information processing terminal 103.
  • the external environment information is information representing the external environment in the second information processing terminal 103 on which the avatar is displayed.
  • external environmental information represents time, place, season, weather, temperature, humidity, etc.
  • the time is, for example, a time, a time zone, a date, and the like.
  • the external environment information is acquired from the second information processing terminal 103.
  • Specifically, for example, the communication unit 703 may receive, together with the chat start instruction from the second information processing terminal 103, the external environment information detected by the second information processing terminal 103.
  • the second information processing terminal 103 detects external environmental information representing time, season, weather, temperature, humidity, etc., for example, by using an environment sensor (not shown) or by inquiring to an external server. Further, the second information processing terminal 103 may detect the position information indicating the current position of the own terminal as the external environment information indicating the location by the GPS unit 309 (see FIG. 3).
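  • On the terminal side, collecting the external environment information might look like the following sketch; the sensor, server, and GPS reads are stubs standing in for the environment sensor, the external server inquiry, and the GPS unit 309 mentioned above, and all concrete values shown are assumptions:

```python
def read_environment_sensor() -> dict:
    """Stub for an on-board environment sensor."""
    return {"temperature_c": 28.5, "humidity_pct": 60}

def query_weather_server() -> dict:
    """Stub for an inquiry to an external server."""
    return {"weather": "sunny", "season": "summer"}

def read_gps_unit() -> dict:
    """Stub for the GPS unit 309 (current position of the own terminal)."""
    return {"lat": 35.68, "lon": 139.76}

def collect_external_environment() -> dict:
    """Assemble the external environment information that the second
    information processing terminal 103 reports to the apparatus 101."""
    environment = {}
    environment.update(read_environment_sensor())
    environment.update(query_weather_server())
    environment["position"] = read_gps_unit()  # the "place" component
    return environment
```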
  • the first output control unit 701 determines the movement of the avatar based on the received operation and the external environment information detected by the second information processing terminal 103. Then, the first output control unit 701 changes the output of the avatar based on the determined movement.
  • For example, suppose the first information processing terminal 102 accepts an operation of selecting the option designating the avatar movement "waving" through an operation of the first user using the input device 304.
  • the first output control unit 701 accepts an operation of selecting an option for designating the movement of the avatar, "waving", from the first information processing terminal 102.
  • Then, the first output control unit 701 changes the output of the avatar so that the avatar waves in a manner corresponding to the external environment specified from the external environment information detected by the second information processing terminal 103. How the movement of the avatar is adjusted according to the external environment is, for example, set in advance.
  • the first output control unit 701 refers to the storage unit 710 and determines the content of the avatar's movement of "waving" according to the external environment.
  • the storage unit 710 stores information indicating the correspondence between the content of the movement and the external environment for each movement of the avatar.
  • the external environment specified from the external environment information detected by the second information processing terminal 103 is defined as "time zone: daytime”.
  • In this case, the first output control unit 701 may change the output of the avatar so that the avatar waves its hand with a brighter and larger motion than in the nighttime zone, for example.
  • the external environment specified from the external environment information detected by the second information processing terminal 103 is defined as "temperature: below freezing point".
  • the first output control unit 701 may change the output of the avatar so as to wave the hand as if trembling in the cold, for example.
  • the external environment specified from the external environment information detected by the second information processing terminal 103 is defined as "location: hotel".
  • In this case, the first output control unit 701 changes the output of the avatar so that, for example, the avatar waves its hand the way a hotel staff member would when seeing off a guest.
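  • These adjustments could be driven by a preset rule table, as in the sketch below; the rules follow the three examples just described, while the identifiers and the matching strategy are assumptions:

```python
# Preset correspondence between the external environment and how the
# "waving" movement should be rendered (cf. the storage unit 710).
WAVE_VARIANTS = {
    ("time_zone", "daytime"): "wave_bright_large",        # brighter, larger than at night
    ("temperature", "below_freezing"): "wave_shivering",  # as if trembling in the cold
    ("place", "hotel"): "wave_send_off",                  # like hotel staff seeing off a guest
}

def adjust_movement(movement: str, external_env: dict) -> str:
    """Return an environment-specific variant of the requested movement,
    or the plain movement when no preset rule matches."""
    if movement != "wave":
        return movement  # only "waving" rules are modeled in this sketch
    for key, value in external_env.items():
        variant = WAVE_VARIANTS.get((key, value))
        if variant is not None:
            return variant  # first matching rule wins in this sketch
    return movement

print(adjust_movement("wave", {"temperature": "below_freezing"}))  # wave_shivering
```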
  • the first output control unit 701 may change the output of the face portion of the avatar based on the received operation.
  • the first information processing terminal 102 accepts an operation of selecting an option for designating an operator OP by an operation of the first user using the input device 304.
  • the first output control unit 701 receives an operation of selecting an option for designating the operator OP from the first information processing terminal 102. Then, the first output control unit 701 changes the face portion of the avatar to the face of the avatar corresponding to the designated operator OP. This makes it possible to change the face of the avatar according to the operator who responds to the user.
  • Further, the first output control unit 701 may switch the first information processing terminal 102 to which the second information processing terminal 103 is connected when the first user is switched based on the received operation. For example, when the first user switches from operator OP1 to operator OP2, the first output control unit 701 switches the first information processing terminal 102 to which the second information processing terminal 103 is connected to the first information processing terminal 102 used by operator OP2.
  • the first output control unit 701 may change the output of the body part of the avatar based on the external environment information detected by the second information processing terminal 103.
  • the external environment specified from the external environment information detected by the second information processing terminal 103 is defined as "season: summer”.
  • the first output control unit 701 changes, for example, the body part of the avatar to a body wearing summer clothes.
  • the external environment specified from the external environment information detected by the second information processing terminal 103 is defined as "location: hotel".
  • the first output control unit 701 changes, for example, the body part of the avatar to a body wearing a hotel uniform.
  • the clothes of the avatar can be changed according to the place where the second information processing terminal 103 is installed and the season.
  • More specifically, for example, the first output control unit 701 specifies the avatar management information corresponding to the received operation with reference to the avatar management table 130 shown in FIG. 5. As an example, it is assumed that an operation selecting option ch1 is accepted. Further, the external environment specified from the external environment information detected by the second information processing terminal 103 is referred to as "E2".
  • the first output control unit 701 refers to the avatar management table 130 and specifies the avatar management information 500-2 corresponding to the option ch1 and corresponding to the external environment E2. Then, the first output control unit 701 changes the face portion of the avatar based on the face data fc1 with reference to the avatar management information 500-2.
  • the first output control unit 701 changes the body part of the avatar based on the body data bd2 with reference to the avatar management information 500-2.
  • Further, the first output control unit 701 refers to the avatar management information 500-2 and switches the first information processing terminal 102 to which the second information processing terminal 103 is connected to the first information processing terminal 102 used by the operator OP2.
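  • Put together, the output change for this example could run as in the following self-contained sketch; the flat (option, environment) table restates the avatar management information 500-1 and 500-2, and the print calls are stubs for the actual rendering and connection switching:

```python
# (option, external environment) -> (face data, body data, operator ID)
AVATAR_TABLE = {
    ("ch1", "E1"): ("fc1", "bd1", "OP1"),  # avatar management information 500-1
    ("ch1", "E2"): ("fc1", "bd2", "OP2"),  # avatar management information 500-2
}

def change_avatar_output(option: str, external_env: str) -> None:
    """Change the avatar's face and body and switch the connected operator
    terminal according to the selected option and detected environment."""
    entry = AVATAR_TABLE.get((option, external_env))
    if entry is None:
        return  # no matching record: leave the avatar's output unchanged
    face_data, body_data, operator_id = entry
    print(f"avatar face -> {face_data}")                         # e.g. fc1
    print(f"avatar body -> {body_data}")                         # e.g. bd2
    print(f"connect terminal 103 to terminal of {operator_id}")  # e.g. OP2

change_avatar_output("ch1", "E2")
```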
  • the first output control unit 701 may not change the output of the face portion and the body portion of the avatar.
  • the first output control unit 701 may determine whether or not there is history information corresponding to the second user. Then, when there is history information corresponding to the second user, the first output control unit 701 may control the output of the avatar based on the avatar information included in the history information.
  • the history information is information indicating that the second user has used the second information processing terminal 103.
  • the history information includes, for example, identification information that uniquely identifies the second user and information that can identify the avatar output when the second user uses the second information processing terminal 103.
  • the identification information that uniquely identifies the second user is, for example, a member ID.
  • As an example, assume that the second information processing terminal 103 is installed in a corporate facility and an employee ID is input when the second information processing terminal 103 is used.
  • the identification information that uniquely identifies the second user is, for example, an employee ID.
  • the member ID and employee ID are included in the chat start instruction, for example.
  • the user attribute may be specified from the member ID and the employee ID.
  • the gender, age, years of service, annual income, etc. of the second user can be specified as user attributes.
  • Information that can identify an avatar includes, for example, information that identifies a face part and a body part of an avatar.
  • the information for specifying the face portion and the body portion of the avatar is, for example, the face data and the body data of the avatar displayed when the conversation through the avatar in the second information processing terminal 103 is completed.
  • the information that can identify the avatar may include information that identifies the operator corresponding to the avatar.
  • the information that identifies the operator is, for example, the operator ID of the operator who is responding to the second user immediately before the end of the conversation.
  • the history information is stored in, for example, a history information DB (not shown) of the information processing apparatus 101 when the conversation through the avatar in the second information processing terminal 103 is completed.
  • the first output control unit 701 refers to the history information DB and determines whether or not there is history information including the employee ID included in the chat start instruction. Here, if there is no history information including the employee ID, the first output control unit 701 does not change the output of the avatar based on the history information.
  • On the other hand, when there is history information including the employee ID, the first output control unit 701 changes the face portion of the avatar to be output to the second information processing terminal 103 based on the face data included in the history information, and changes the body portion of the avatar based on the body data included in the history information. Further, the first output control unit 701 switches the first information processing terminal 102 to which the second information processing terminal 103 is connected to the first information processing terminal 102 corresponding to the operator ID included in the history information. As a result, the same operator as before can respond to the second user through an avatar having the same appearance as before, keeping the character consistent.
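  • A minimal sketch of this history lookup, assuming the history information DB can be modeled as a dictionary keyed by the member or employee ID (the key and field names are assumptions):

```python
# History information DB: user ID -> avatar information recorded when the
# previous conversation through the avatar ended.
HISTORY_DB = {
    "emp-0042": {"face_data": "fc1", "body_data": "bd2", "operator_id": "OP2"},
}

def restore_avatar_from_history(user_id: str) -> dict | None:
    """Return the previous avatar appearance and operator for a returning
    second user, or None when no history information exists."""
    record = HISTORY_DB.get(user_id)
    if record is None:
        return None  # no history: do not change the avatar output
    return {
        "face_data": record["face_data"],      # same face as last time
        "body_data": record["body_data"],      # same body/clothes as last time
        "operator_id": record["operator_id"],  # reconnect the same operator
    }
```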
  • when a second user existing around the second information processing terminal 103 is detected, the first output control unit 701 may output information prompting an operation on the second information processing terminal 103 to the second information processing terminal 103.
  • the information prompting an operation on the second information processing terminal 103 is, for example, a message such as "Hello, you there!" that makes the second user aware of the existence of the second information processing terminal 103.
  • the second user existing around the second information processing terminal 103 is detected, for example, from the image information taken by the camera 306 of the second information processing terminal 103, or by a temperature sensor (not shown) of the second information processing terminal 103. Specifically, for example, the first output control unit 701 may output a message such as "Hello, you there!" as the avatar's remark from the speaker 307 of the second information processing terminal 103. As a result, it is possible, for example, to actively call out to visitors at commercial facilities and stores, increasing customer service opportunities using the second information processing terminal 103.
  • each functional unit of the information processing apparatus 101 described above may be realized by a plurality of computers in the conversation control system 100.
  • the functional unit of the information processing apparatus 101 described above may be realized by the information processing apparatus 101, the first information processing terminal 102, and the second information processing terminal 103, respectively, by sharing the functions.
  • FIG. 8 is an explanatory diagram showing an embodiment of the conversation control system 100.
  • the information processing apparatus 101 outputs an avatar av that moves in response to an operation of the operator OP1 in the first information processing terminal 102 to the second information processing terminal 103.
  • when the information processing apparatus 101 detects the user U1 at the second information processing terminal 103, it outputs options #1 to #3 prompting an operation corresponding to the avatar av to the first information processing terminal 102.
  • Options # 1 to # 3 are options for designating the movement of the avatar av and the operator who responds to the user U1 through the avatar av.
  • when the information processing apparatus 101 accepts an operation of selecting one of the output options #1 to #3 at the first information processing terminal 102, it controls the output of the avatar av based on the accepted operation and the external environment information detected by the second information processing terminal 103.
  • Specifically, the information processing apparatus 101 determines the movement of the avatar av based on the accepted operation and the external environment information detected by the second information processing terminal 103, and controls the output of the avatar av based on the determined movement. For example, if the option designating the avatar movement "waving" is selected and the external environment specified from the external environment information is "temperature: below freezing point", the movement of the avatar av can be controlled so that the avatar waves its hand as if trembling in the cold.
  • the information processing device 101 changes the output of the face portion fc of the avatar av based on the received operation. For example, when the option for designating the operator OP is selected, the face portion fc of the avatar av can be changed to the face of the avatar av corresponding to the designated operator OP.
  • the information processing device 101 changes the output of the body part bd of the avatar av based on the external environment information detected by the second information processing terminal 103. For example, when the external environment specified from the external environment information is "temperature: below freezing point", the body part bd of the avatar can be changed to a body wearing winter clothes.
  • the second information processing terminal 103 is a digital board installed in a corporate facility and used by employees.
  • manned support is performed through an avatar by escalation from chatbot support.
  • the user attribute specified from the attribute information of the second user acquired by the second information processing terminal 103 is defined as "30s x male”.
  • the information processing apparatus 101 refers to the script table 140 and starts a conversation in the chatbot on the second information processing terminal 103 by using the talk script corresponding to the user attribute "30s x male".
  • the talk script corresponding to the user attribute "30s x male” is, for example, a script for having a conversation about career advancement.
  • FIG. 9 is an explanatory diagram (No. 1) showing a screen example of a talk screen displayed on the second information processing terminal 103.
  • the talk screen 900 is a talk screen displayed on the second information processing terminal 103 when the second user on the second information processing terminal 103 is detected and a conversation on the chatbot is started.
  • the avatar av represents a chatbot character or a character that moves in response to an operator operation.
  • the selection buttons 901 and 902 regarding career advancement are displayed in a conversational format.
  • when the selection button 901 is selected by an operation input of the second user using the input device 304 of the second information processing terminal 103, a conversation about courses popular among people in their thirties can be held.
  • when the selection button 902 is selected on the talk screen 900, a conversation about courses related to DX (Digital Transformation) can be held.
  • when any of the language buttons b1, b2, b3, and b4 is selected on the talk screen 900, the language used can be switched. For example, if the language button b2 is selected, the language used can be switched from Japanese to English.
  • when the selection button 901 is selected on the talk screen 900, the talk screen 900 is updated.
  • the talk screen 900 displays the message m2 of the avatar av and the selection buttons 911 to 919 for listening to the desired content of the second user in a conversational format.
  • the second user can select the desired content for the course he / she wants to take by selecting an option such as the selection buttons 911 to 919.
  • Then, the operator support screen 1000 as shown in FIG. 10 is displayed on the display 303 of the first information processing terminal 102, and the chatbot support is switched to manned support through the avatar av.
  • FIG. 10 is an explanatory diagram showing a screen example of an operator support screen displayed on the first information processing terminal 102.
  • the operator support screen 1000 is an example of an operation screen that supports the operation of the operator.
  • the operator is a first user who uses the first information processing terminal 102.
  • the operator support screen 1000 includes a talk screen 900 and a conversation history 1010.
  • the talk screen 900 is a talk screen displayed on the second information processing terminal 103, and the display content is switched in synchronization with the talk screen displayed on the second information processing terminal 103.
  • the conversation history 1010 is information indicating the contents of the conversation performed in the second information processing terminal 103.
  • the operator can confirm the screen contents displayed on the second information processing terminal 103 by the talk screen 900 in the operator support screen 1000. Further, the operator can confirm the content of the conversation held in the second information processing terminal 103 by the conversation history 1010.
  • the operator support screen 1000 includes operation buttons b11 to b13.
  • when the operation button b11 is selected by an operation input of the operator using the input device 304 of the first information processing terminal 102, the avatar av can be made to wave its hand. At this time, the waving movement of the avatar av is controlled according to the external environment of the second information processing terminal 103.
  • when the operation button b12 is selected, the avatar av can be made to nod. The nodding movement of the avatar av is controlled according to the external environment of the second information processing terminal 103.
  • when the operation button b13 is selected, the avatar av can be made to make a rejoicing movement. The rejoicing movement of the avatar av is controlled according to the external environment of the second information processing terminal 103.
  • the operator support screen 1000 includes a voice synthesis button b14 and a QR sending button b15.
  • when the voice synthesis button b14 is selected on the operator support screen 1000, text input at the first information processing terminal 102 can be converted into voice and output to the second information processing terminal 103.
• When the QR sending button b15 is selected, a QR (Quick Response) code can be output to the second information processing terminal 103.
  • the QR code is information that identifies, for example, a script that introduces a recommended course by a chatbot. QR code is a registered trademark.
• The second user can, for example, read the QR code displayed on the digital board (one second information processing terminal 103) with his or her own smartphone (another second information processing terminal 103) and start a chatbot conversation on the smartphone.
• The QR code to be output to the second information processing terminal 103 can be selected arbitrarily, for example, according to an operation of the first user on the first information processing terminal 102.
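One way to realize the voice synthesis and QR sending functions is sketched below using the third-party pyttsx3 (offline text-to-speech) and qrcode Python packages; the choice of packages, the URL format, and the script identifier are assumptions for illustration.

```python
# Sketch of the operator-side helpers: text-to-speech (button b14) and
# generating a QR code that identifies a chatbot script (button b15).
import pyttsx3
import qrcode

def speak(text: str) -> None:
    """Convert operator-entered text to voice output."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

def make_script_qr(script_id: str):
    """Encode a (hypothetical) script-selection URL as a QR code image."""
    return qrcode.make(f"https://example.com/chatbot?script={script_id}")

speak("Here is a recommended course for you.")
make_script_qr("recommended_course").save("script_qr.png")
```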
• On the operator support screen 1000, the options 1021 to 1023 are displayed together with the message 1020.
  • the message 1020 and the options 1021 to 1023 are examples of information prompting an operation corresponding to the avatar output to the second information processing terminal 103.
  • the message 1020 and the options 1021 to 1023 are information determined according to the conversation content on the talk screen 900 (for example, the desired content of the course that the second user wants to take).
• The message 1020 prompts the designation of an operator who will respond to the second user.
  • Options 1021 to 1023 are options for designating an operator who responds to the second user.
• For example, candidate operators with abundant knowledge of courses matching the second user's desired content are presented as the options 1021 to 1023.
  • the operator can determine which operator should respond by referring to, for example, the talk screen 900 or the conversation history 1010.
  • the operator support screen 1000 may display an image of the second user taken by the camera 306 of the second information processing terminal 103. In this case, the operator can confirm what kind of person the second user is and determine which operator should respond.
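The mapping from conversation content to candidate operators can be pictured as a lookup in the spirit of the option management table 120; the entries below are invented examples.

```python
# Sketch: presenting operator candidates that match the second user's
# desired content. Table entries are hypothetical.

OPTION_MANAGEMENT = {
    "foreign_language": ["Person in charge of the foreign language course",
                         "Operator B (English conversation)"],
    "dx":               ["Operator C (IT courses)"],
}

def operator_options(desired_content: str) -> list[str]:
    """Return candidate operators knowledgeable about the desired content."""
    return OPTION_MANAGEMENT.get(desired_content, [])

# Displayed on the operator support screen as options such as 1021 to 1023.
print(operator_options("foreign_language"))
```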
• Suppose the option 1022 is selected on the operator support screen 1000. In this case, the talk screen 1100 shown in FIG. 11 is displayed on the display 303 of the second information processing terminal 103.
  • FIG. 11 is an explanatory diagram (No. 2) showing a screen example of the talk screen displayed on the second information processing terminal 103.
• The talk screen 1100 is an example of the talk screen displayed on the second information processing terminal 103 when the "person in charge of the foreign language course" is designated as the operator.
• At this time, the first information processing terminal 102 to which the second information processing terminal 103 is connected is switched to the first information processing terminal 102 used by the "person in charge of the foreign language course". In addition, the talk screen 1100 displays information 1110 indicating that manned support is in progress.
• The avatar av here represents a character that moves according to the operations of the operator who is the "person in charge of the foreign language course".
• The face part of the avatar av is changed based on the face data (see, for example, FIG. 5) corresponding to the option 1022 selected on the operator support screen 1000 shown in FIG. 10.
  • the body part of the avatar av is changed based on the body data (for example, see FIG. 5) corresponding to the external environment specified from the external environment information detected by the second information processing terminal 103.
• For example, when the external environment specified from the external environment information is "season: summer", the body part is changed to one wearing summer clothes.
  • the operator who is the "person in charge of the foreign language course" can have a conversation with the second user through the avatar av.
• Since the face part and the body part are changed from those of the avatar av shown in FIG. 9, the second user can be given the impression of having a conversation with a different person.
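A minimal sketch of this composition, with face data keyed by the designated operator and body data keyed by the external environment, might look as follows; the asset file names and table keys are assumptions.

```python
# Sketch: composing the avatar from operator-linked face data and
# environment-linked body data (cf. FIG. 5). Asset names are invented.
from dataclasses import dataclass

@dataclass
class Avatar:
    face: str
    body: str

FACE_DATA = {"foreign_language_rep": "face_05.png"}
BODY_DATA = {("season", "summer"): "body_summer.png",
             ("place", "hotel"):   "body_hotel.png"}

def compose_avatar(operator_id: str, environment: dict) -> Avatar:
    face = FACE_DATA[operator_id]
    # Use the first body asset whose (key, value) pair matches the environment.
    body = next((asset for (key, value), asset in BODY_DATA.items()
                 if environment.get(key) == value), "body_default.png")
    return Avatar(face=face, body=body)

print(compose_avatar("foreign_language_rep", {"season": "summer"}))
```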
• On the talk screen 1100, the message m11 from the operator who is the "person in charge of the foreign language course" is displayed through the avatar av. The second user inputs the message m12 in response to the operator's question in the message m11. In reply to the message m12, the operator's message m13 is displayed through the avatar av.
• FIGS. 12 and 13 are flowcharts showing an example of the avatar control processing procedure of the information processing apparatus 101.
  • the information processing apparatus 101 outputs an avatar that moves in response to an operation of the first user in the first information processing terminal 102 to the second information processing terminal 103 (step S1201).
• The chat start instruction is an instruction to start a conversation with the chatbot.
• The information processing apparatus 101 waits to receive the attribute information and the external environment information of the second user together with the chat start instruction (step S1202: No).
• When the information processing apparatus 101 receives the attribute information of the second user and the external environment information together with the chat start instruction (step S1202: Yes), it specifies the user attribute based on the received attribute information (step S1203).
  • the information processing apparatus 101 refers to the script table 140 and determines the script corresponding to the specified user attribute as the talk script (step S1204).
  • the talk script is a script applied to a conversation in a chatbot at the second information processing terminal 103.
  • the information processing apparatus 101 starts a conversation with the chatbot on the second information processing terminal 103 using the determined talk script (step S1205).
  • the information processing apparatus 101 determines whether or not a specific condition is satisfied based on the information received in the conversation in the chatbot at the second information processing terminal 103 (step S1206).
• Specifically, for example, the information processing apparatus 101 determines that the specific condition is satisfied when the conversation content in the chatbot at the second information processing terminal 103 corresponds to any conversation content in the option management table 120. Otherwise, it determines that the specific condition is not satisfied.
• If the specific condition is not satisfied (step S1206: No), the information processing apparatus 101 determines whether or not to end the conversation in the chatbot at the second information processing terminal 103 (step S1207).
  • the conversation in the chatbot ends, for example, according to the talk script or according to the end operation of the second user.
• If the conversation in the chatbot is not to be ended (step S1207: No), the information processing apparatus 101 returns to step S1206. On the other hand, when ending the conversation in the chatbot (step S1207: Yes), the information processing apparatus 101 ends the series of processes according to this flowchart.
• In step S1206, when the specific condition is satisfied (step S1206: Yes), the information processing apparatus 101 switches from chatbot support to manned support (step S1208) and proceeds to step S1301 shown in FIG. 13.
• In FIG. 13, the information processing apparatus 101 refers to the option management table 120 and outputs the options corresponding to the conversation content of the chatbot at the second information processing terminal 103 to the first information processing terminal 102 (step S1301). Specifically, for example, the information processing apparatus 101 displays the options corresponding to the chatbot conversation content on the operator support screen 1000 as shown in FIG. 10.
  • the information processing apparatus 101 determines whether or not any of the output options has been selected (step S1302).
  • the information processing apparatus 101 waits for one of the options to be selected (step S1302: No).
• When one of the options is selected (step S1302: Yes), the information processing apparatus 101 executes the avatar output change process (step S1303).
  • the avatar output change process is a process of controlling the output of the avatar displayed on the second information processing terminal 103 according to the selected option.
  • the specific processing procedure of the avatar output change processing will be described later with reference to FIG.
• Next, the information processing apparatus 101 determines whether or not the specific condition is satisfied based on the information received in the conversation through the avatar at the second information processing terminal 103 (step S1304). Specifically, for example, the information processing apparatus 101 determines that the specific condition is satisfied when the content of the manned conversation through the avatar corresponds to any conversation content in the option management table 120; otherwise, it determines that the specific condition is not satisfied.
• If the specific condition is not satisfied (step S1304: No), the information processing apparatus 101 determines whether or not to end the manned conversation through the avatar at the second information processing terminal 103 (step S1305). The manned conversation through the avatar ends, for example, in response to an end operation by the second user.
• If the manned conversation through the avatar is not to be ended (step S1305: No), the information processing apparatus 101 returns to step S1304. When ending the manned conversation through the avatar (step S1305: Yes), the information processing apparatus 101 ends the series of processes according to this flowchart.
• In step S1304, when the specific condition is satisfied (step S1304: Yes), the information processing apparatus 101 refers to the option management table 120, outputs the option corresponding to the content of the manned conversation through the avatar at the second information processing terminal 103 to the first information processing terminal 102 (step S1306), and returns to step S1302.
• In this way, chatbot support can be switched to manned support through the avatar.
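The overall flow of FIGS. 12 and 13 (a chatbot conversation until the specific condition holds, followed by option output during manned support) can be condensed into the following structural sketch; message handling, terminal I/O, and the table contents are placeholders, not the actual implementation.

```python
# Structural sketch of the control flow in FIGS. 12 and 13.
# OPTION_TABLE stands in for the option management table 120.

OPTION_TABLE = {"foreign language": ["Operator A", "Operator B"]}

def meets_specific_condition(message: str) -> bool:
    # Specific condition: the conversation content matches the option table.
    return message in OPTION_TABLE

def control_conversation(messages: list[str]) -> None:
    manned = False
    for msg in messages:                # information received in the conversation
        if meets_specific_condition(msg):
            if not manned:
                manned = True           # S1208: switch to manned support
            # S1301 / S1306: output the matching options to the first terminal
            print("options ->", OPTION_TABLE[msg])
        # otherwise keep conversing (the S1206/S1207 or S1304/S1305 loop)

control_conversation(["hello", "foreign language", "thanks"])
```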
  • FIG. 14 is a flowchart showing an example of a specific processing procedure of the avatar output change processing.
  • the information processing apparatus 101 identifies the external environment of the second information processing terminal 103 based on the external environment information received in step S1202 shown in FIG. 12 (step S1401).
  • the information processing apparatus 101 determines whether or not the option selected in step S1302 shown in FIG. 13 specifies the movement of the avatar or the operator (step S1402).
• When the selected option specifies a movement of the avatar (step S1402: movement of the avatar), the information processing apparatus 101 determines the movement of the avatar based on the selected option and the specified external environment (step S1403). The information processing apparatus 101 then changes the output of the avatar displayed on the second information processing terminal 103 based on the determined movement (step S1404), and returns to the process that called the avatar output change process.
• On the other hand, when the selected option specifies an operator (step S1402: operator), the information processing apparatus 101 refers to the avatar management table 130 and specifies the avatar management information corresponding to the selected option (step S1405).
  • the information processing apparatus 101 identifies the face data of the avatar with reference to the identified avatar management information (step S1406).
  • the information processing apparatus 101 refers to the specified avatar management information and specifies the body data of the avatar corresponding to the specified external environment (step S1407).
  • the information processing apparatus 101 refers to the specified avatar management information and switches the first information processing terminal 102 to which the second information processing terminal 103 is connected (step S1408).
• Finally, the information processing apparatus 101 changes the output of the avatar displayed on the second information processing terminal 103 based on the specified face data and body data (step S1409), and returns to the process that called the avatar output change process.
• In this way, the face part and body part of the avatar can be changed according to the operator who responds to the second user.
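The avatar output change process of FIG. 14 can be sketched as a single function whose branch mirrors step S1402; the table layout, dictionary keys, and the omission of terminal switching (step S1408) are assumptions of this sketch.

```python
# Sketch of the avatar output change process (FIG. 14).
# AVATAR_TABLE stands in for the avatar management table 130.

AVATAR_TABLE = {"op_b": {"face": "face_05.png",
                         "body": {"summer": "body_summer.png"}}}

def avatar_output_change(option: dict, environment: dict) -> dict:
    if option["kind"] == "movement":           # S1402: movement of the avatar
        # S1403: determine the movement from the option and the environment.
        speed = 1.3 if environment.get("place") == "outdoors" else 1.0
        return {"movement": option["action"], "speed": speed}  # S1404
    info = AVATAR_TABLE[option["operator"]]    # S1405: avatar management info
    face = info["face"]                        # S1406: face data
    body = info["body"].get(environment.get("season"), "body_default.png")  # S1407
    # S1408 (switching the connected first information processing terminal) omitted.
    return {"face": face, "body": body}        # S1409: new avatar output

print(avatar_output_change({"kind": "operator", "operator": "op_b"},
                           {"season": "summer"}))
```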
• As described above, according to the information processing apparatus 101, the avatar that moves in response to operations of the first user at the first information processing terminal 102 is output to the second information processing terminal 103.
• In addition, information prompting an operation corresponding to the avatar can be output to the first information processing terminal 102.
• The information prompting an operation corresponding to the avatar is, for example, information prompting the designation of a movement of the avatar, or information prompting the designation of an operator who responds to the second user through the avatar.
• When the first information processing terminal 102 receives an operation corresponding to the output information, the information processing apparatus 101 can control the output of the avatar based on the received operation and the external environment information detected by the second information processing terminal 103.
• For example, the movement of the avatar can be determined based on the received operation and the external environment information detected by the second information processing terminal 103, and the output of the avatar can be changed based on the determined movement.
  • the output of the face part of the avatar can be changed based on the received operation, and the output of the body part of the avatar can be changed based on the external environment information.
  • the body part (appearance) of the avatar can be changed according to the external environment of the second information processing terminal 103.
  • the clothes of the avatar can be changed according to the place where the second information processing terminal 103 is installed and the season.
• For example, when the external environment of the second information processing terminal 103 is "place: hotel", the body part of the avatar can be changed to clothing like that of hotel staff.
• Similarly, when the external environment of the second information processing terminal 103 is "season: summer", the body part of the avatar can be changed to summer clothes.
• When the second user is detected at the second information processing terminal 103, the information processing apparatus 101 determines whether or not there is history information corresponding to the second user; if such history information exists, it can control the output of the avatar based on the avatar information included in the history information.
• This allows the same operator as before to respond to the second user through an avatar with the same appearance as when the second user last used the second information processing terminal 103, keeping the response consistent.
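Reusing the previous avatar and operator from history information can be sketched as below; the history store, its field names, and the session hooks are illustrative assumptions.

```python
# Sketch: keeping the response consistent for a returning second user by
# reusing avatar information saved in history. Field names are hypothetical.

HISTORY: dict[str, dict] = {}

def on_session_end(user_id: str, avatar_info: dict) -> None:
    HISTORY[user_id] = avatar_info   # remember avatar/operator for next time

def on_user_detected(user_id: str, default_info: dict) -> dict:
    # If history information exists, control the avatar output from it.
    return HISTORY.get(user_id, default_info)

on_session_end("user42", {"face": "face_05.png", "operator": "op_b"})
print(on_user_detected("user42", {"face": "face_default.png", "operator": None}))
```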
• Information prompting an operation corresponding to the avatar can also be output to the first information processing terminal 102 based on the attribute information of the second user.
• Likewise, information prompting an operation corresponding to the avatar can be output to the first information processing terminal 102 based on information received in the conversation through the avatar at the second information processing terminal 103.
• Further, when the conversation with the chatbot at the second information processing terminal 103 is started, information prompting an operation corresponding to a movement of the avatar can be output to the first information processing terminal 102 based on information received in the chatbot conversation.
  • the avatar control method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • This avatar control program is recorded on a computer-readable recording medium such as a hard disk, flexible disk, CD-ROM, DVD, or USB memory, and is executed by being read from the recording medium by the computer. Further, this avatar control program may be distributed via a network such as the Internet.
• The information processing apparatus 101 described in the present embodiment can also be realized by an application-specific IC such as a standard cell or a structured ASIC (Application Specific Integrated Circuit), or by a PLD (Programmable Logic Device) such as an FPGA.
Reference Signs List

• 100 Conversation control system
• 101 Information processing device
• 102 First information processing terminal
• 103 Second information processing terminal
• 110 Network
• 120 Option management table
• 130 Avatar management table
• 140 Script table
• 200, 300 Bus
• 201, 301 CPU
• 202, 302 Memory
• 203 Disk drive
• 204 Disk
• 205, 305 Communication I/F
• 206 Portable recording medium I/F
• 207 Portable recording medium
• 303 Display
• 304 Input device
• 306 Camera
• 307 Speaker
• 308 Microphone
• 702 Second output control unit
• 703 Conversation control unit
• 710 Storage unit
• 900, 1100 Talk screen
• 1000 Operator support screen
