WO2017068816A1 - Information processing system and information processing method
Information processing system and information processing method
- Publication number
- WO2017068816A1 (PCT/JP2016/070207)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- message
- information processing
- control unit
- processing system
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/02—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2457—Query processing with adaptation to user needs
- G06F16/24575—Query processing with adaptation to user needs using context
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/21—Monitoring or handling of messages
- H04L51/216—Handling conversation history, e.g. grouping of messages in sessions or threads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M11/00—Telephonic communication systems specially adapted for combination with other electrical systems
Definitions
- This disclosure relates to an information processing system and an information processing method.
- Each user can carry a communication terminal such as a smartphone, a tablet terminal, a mobile phone terminal, or a wearable device, and can easily exchange messages with friends, family, and partners via the network wherever they are.
- In Patent Document 1, the position information of a mobile phone is acquired using the GPS (Global Positioning System) function of the phone, and an alarm is issued when the phone moves out of a designated area.
- Patent Document 2 describes a system in which a position detection system installed in the environment analyzes the group behavior of people and, when a lost child is detected, an interaction-oriented robot placed in the environment talks to the child, guides the child to the parent, and notifies the parent searching for the child of the child's current position.
- In the above techniques, however, warnings and notifications to other people are performed only according to the user's position information, which is not sufficient from the viewpoint of performing an automatic response with high affinity according to the user's situation. For example, in a situation where messages can be easily exchanged anywhere, a user who feels uneasy can be reassured by receiving a message from a trusted partner.
- Therefore, the present disclosure proposes an information processing system and an information processing method capable of performing an automatic response with higher affinity according to the user's psychological state.
- According to the present disclosure, an information processing system is proposed that includes: a database that stores the relationship between a user represented by a virtual agent and a partner user who is a communication partner on a network; and a control unit that generates a message according to the psychological state of the partner user and the relationship between the user and the partner user, and transmits the message from the virtual agent to the partner user.
- According to the present disclosure, an information processing method is also proposed that includes a processor generating a message according to the psychological state of a partner user who is a communication partner on a network and the relationship, stored in a database, between the partner user and a user represented by a virtual agent, and transmitting the message from the virtual agent to the partner user.
- FIG. 1 is a diagram illustrating an overview of an information processing system according to an embodiment of the present disclosure.
- As shown in FIG. 1, when it is sensed that a user 5A (for example, a child) is in an anxious psychological state, a virtual agent 5Bagent representing a user 5B (for example, the mother) related to the user 5A is controlled so that a message corresponding to the psychological state reaches the user 5A, which can reassure the user 5A.
- In particular, when the user 5B is busy and cannot respond immediately even if notified by the user 5A or the system, having the virtual agent 5Bagent respond on the user 5B's behalf is highly convenient for the user 5B as well.
- The user 5A can also feel relieved or satisfied by seeing a message that matches his or her psychological state (for example, happily excited, lonely and wanting comfort, or nervous and in trouble).
- Furthermore, because the message from the virtual agent 5Bagent takes into account the relationship between the represented user 5B and the user 5A who receives it, the message feels more personal and has a higher affinity for the partner user 5A.
- FIG. 2 is a diagram illustrating the overall configuration of the information processing system according to the present embodiment.
- As shown in FIG. 2, the information processing system includes information processing apparatuses 2A and 2B, which the respective users carry, and a server 1.
- the information processing apparatuses 2A and 2B and the server 1 are connected via the network 4 to transmit and receive data.
- the information processing apparatuses 2A and 2B can exchange messages (for example, text-based conversation) in real time via the network 4 under the control of the server 1.
- The information processing apparatus 2A acquires information such as the user's movement, vibration, pulse, pulse wave, heart rate, sweating, breathing, blood pressure, or body temperature from the wearable terminal 3 worn by the user, and estimates the user's psychological state (for example, emotions such as anger, disgust, fear, joy, sadness, and surprise).
- The wearable terminal 3 can be realized by, for example, a smart watch, a smart band, smart eyeglasses, a neck-worn device, or an implantable terminal. Note that user sensing is not limited to the wearable terminal 3; the user may also be sensed by the information processing apparatus 2A or by environmental sensors in the surroundings.
- As a sensor provided in the information processing apparatus 2A or as an environmental sensor, a camera, a microphone, or the like is assumed, making it possible to capture the user's face image (facial expression) or collect the user's utterances.
- For example, based on user motion data detected by a motion sensor, when the information processing apparatus 2A detects that the user keeps moving around the same location, it determines a psychological state such as "lost", "deep in thought", or "worried". Further, the information processing apparatus 2A determines an emotional state such as "happy" or "anxious/worried" by combining the facial expression extracted from a captured image of the user's face with the pulse detected by a pulse sensor.
- The information processing apparatus 2A can also determine a psychological state of "tired" by combining the user's sighs, eye movements, and facial expressions.
- The determination of the psychological state by the information processing apparatus 2A can be performed using machine learning, for example.
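- As an illustration of such a machine-learning-based determination, the following minimal sketch classifies one sensor sample into an emotion label. The feature set, the toy training data, and the use of scikit-learn are assumptions made for illustration only and are not part of the disclosure.

```python
# Minimal sketch (illustrative assumptions): classify wearable-sensor features
# into a coarse emotion label with a generic off-the-shelf classifier.
from sklearn.ensemble import RandomForestClassifier

# Toy training data: [heart_rate_bpm, sweating_level, movement_variance]
X_train = [
    [110, 0.9, 0.8],   # agitated sample
    [72,  0.2, 0.1],   # calm sample
    [95,  0.7, 0.6],   # anxious sample
    [80,  0.3, 0.9],   # excited sample
]
y_train = ["anger", "joy", "fear", "joy"]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

def estimate_emotion(heart_rate: float, sweating: float, movement: float) -> str:
    """Return an estimated emotion label for one sensor sample."""
    return clf.predict([[heart_rate, sweating, movement]])[0]

if __name__ == "__main__":
    print(estimate_emotion(100, 0.8, 0.7))
```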
- The server 1 generates a message to the partner user according to the partner user's psychological state acquired from the information processing apparatus 2A and the relationship between the partner user and the represented user (the user of the information processing apparatus 2B), and controls the message to be transmitted from that user's virtual agent to the partner user. This makes it possible to automatically transmit a message with high affinity that matches the partner user's psychological state and the relationship with the user.
- Hereinafter, a virtual agent that performs an automatic response is referred to as a bot.
- FIG. 3 is a block diagram illustrating an example of the configuration of the server 1 according to the present embodiment.
- the server 1 includes a control unit 10, a communication unit 11, a relationship information DB (database) 12, and a bot response DB 13.
- The communication unit 11 transmits and receives data to and from external devices by wire or wirelessly.
- the communication unit 11 is connected to the information processing apparatuses 2A and 2B and transmits and receives data.
- the control unit 10 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the server 1 according to various programs.
- the control unit 10 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor, for example.
- the control unit 10 according to the present embodiment also functions as the message generation unit 101, the presentation control unit 102, the notification control unit 103, and the setting unit 104.
- The message generation unit 101 generates a message to the partner user according to the partner user's psychological state acquired from the information processing apparatus 2A and the relationship between the partner user and the represented user.
- the message generation unit 101 can also analyze the content of messages that the user has exchanged with the other user in the past, and generate messages using commonly used expressions and tone.
- the message generation unit 101 can also create a response with keywords used in past exchanges.
- The message generation unit 101 can also mine the context of past exchanges and generate a message with reference to the user's own response when the partner was in the same psychological state in the past, or generate a slightly altered version of that response.
- The message generation unit 101 may also analyze the contents of messages that the user has exchanged with users other than the partner user in the past, and use them to generate a message for the partner user. In particular, even when the user has never exchanged messages with the partner user, the generated message can be made to resemble the user's tone and expressions by basing it on the user's past messages with other users. Considering past messages with other users also makes it possible to vary the bot response sent to the partner user.
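- A minimal sketch of this idea is shown below: frequently used keywords are mined from past exchanges and inserted into an emotion-specific template. The templates, stopword list, and function names are illustrative assumptions, not the disclosed implementation.

```python
# Sketch (assumptions only): reuse keywords mined from past exchanges when
# generating a bot response matched to the partner's emotion.
from collections import Counter
import re

STOPWORDS = {"the", "a", "i", "you", "to", "and", "it", "too", "so", "is"}

def frequent_keywords(past_messages, top_n=3):
    """Pick the words the pair uses most often, ignoring common stopwords."""
    words = re.findall(r"[a-z']+", " ".join(past_messages).lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(top_n)]

def generate_bot_response(emotion, past_messages):
    """Fill an emotion-specific template with a commonly used keyword."""
    templates = {  # illustrative templates only
        "tired": "You sound tired. How about some {kw} tonight?",
        "sad":   "Cheer up! Remember the {kw} we talked about?",
    }
    keywords = frequent_keywords(past_messages) or ["tea"]
    template = templates.get(emotion, "Thinking of you. Hang in there!")
    return template.format(kw=keywords[0]) if "{kw}" in template else template

if __name__ == "__main__":
    history = [
        "Let's have curry again this weekend!",
        "That curry place was amazing.",
        "Curry always cheers you up, right?",
    ]
    print(generate_bot_response("tired", history))
```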
- The presentation control unit 102 controls the message generated by the message generation unit 101 to be transmitted to the partner user as the represented user's virtual agent. Specifically, the presentation control unit 102 controls the message to be posted on both users' chat screens together with the represented user's icon (or an icon obtained by processing that icon for bot use), or to be sent to the partner user's information processing apparatus as mail from the virtual agent.
- The notification control unit 103 controls the message generated by the message generation unit 101 to be notified to the represented user before or after the message is presented to the partner user.
- The user who received the notification can check what message the bot acting on his or her behalf has sent (or is about to send) to the partner, decide whether to send it, and edit it as necessary.
- The setting unit 104 has a function of setting whether a bot response is allowed. Specifically, the setting unit 104 sets whether to perform a bot response for each partner user, each psychological state, and each time period. Such settings can be arbitrarily specified by the user, for example.
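- The per-partner, per-emotion, and per-time-zone settings described here could be modeled roughly as follows; the field names and decision logic are assumptions made for illustration.

```python
# Sketch of how per-partner bot response settings (cf. Figs. 4 and 5) might be held.
from dataclasses import dataclass, field
from datetime import time

@dataclass
class EmotionSetting:
    enabled: bool = True
    prior_confirmation: bool = False
    start: time = time(0, 0)           # response time zone start
    end: time = time(23, 59)           # response time zone end
    consider_other_users: bool = False

@dataclass
class PartnerBotSetting:
    bot_enabled: bool = False
    distinguish_display: bool = False
    per_emotion: dict = field(default_factory=dict)  # emotion -> EmotionSetting

def should_respond(setting: PartnerBotSetting, emotion: str, now: time) -> bool:
    """Decide whether a bot response is allowed for this partner, emotion, and time."""
    if not setting.bot_enabled:
        return False
    es = setting.per_emotion.get(emotion) or setting.per_emotion.get("others")
    if es is None or not es.enabled:
        return False
    return es.start <= now <= es.end

if __name__ == "__main__":
    s = PartnerBotSetting(
        bot_enabled=True,
        distinguish_display=True,
        per_emotion={"tired": EmotionSetting(start=time(8), end=time(20))},
    )
    print(should_respond(s, "tired", time(16, 10)))  # True
```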
- the bot response setting will be specifically described with reference to FIGS. 4 and 5.
- FIG. 4 is a diagram showing an example of a bot response setting screen 200 that can be set for each partner user.
- On the bot response setting screen 200, the user can set, for each partner user, whether a bot acting on his or her behalf (that is, a virtual agent) responds.
- ON / OFF of the bot response can be set by the buttons 201, 202, and the like.
- In the example shown in FIG. 4, the bot response for the partner user 5A is set to OFF by the button 201, and the bot response for the partner user 5C is set to ON by the button 202.
- When the bot response is set to ON, the bot response can be configured in more detail.
- This will be described with reference to FIG. 5.
- FIG. 5 is a diagram showing an example of a detailed bot response setting screen 210 for each partner user.
- On the bot response setting screen 210, more detailed bot response settings can be made for a partner user whose bot response is set to ON.
- Specifically, a display distinction setting and emotion-specific bot response settings can be made.
- The display distinction setting specifies whether a response by the bot and a response by the user himself or herself are displayed so that they can be distinguished.
- In the example shown in FIG. 5, "Make display distinct" on the bot response setting screen 210 is set to ON.
- When display distinction is set to ON, the presentation control unit 102 controls the message generated by the message generation unit 101 (that is, the bot response) to be presented to the partner user in a display mode different from the one normally used when presenting a message from the user himself or herself.
- the display distinction setting is not limited to the example shown in FIG. 5 and may be set for each emotion.
- FIG. 6 is a diagram illustrating an example of the chat screen of the other user when the display distinction of the bot response according to the present embodiment is not performed.
- The message 221 is a message input by the user 5B himself, while the message 223 is a message produced by the user 5B's bot, that is, generated by the message generation unit 101 of the server 1 and presented under the control of the presentation control unit 102. Because display distinction is not performed here, no distinction between the two is made on the display.
- The bot response message 223 is generated by the message generation unit 101 of the server 1, triggered by the detection that the user 5C's psychological state is "tired" or "hungry" after a certain time has elapsed since the message 222 input by the user 5C was displayed at 16:10, and its presentation is controlled by the presentation control unit 102.
- FIG. 7 is a diagram showing an example of the chat screen of the other user when the display distinction of the bot response according to the present embodiment is performed.
- the message 234 input by the bot acting on behalf of the user 5B is displayed on the chat screen 230 in a display mode different from the message 231 input by the user 5B himself.
- the color of the display area of the message 234 is different.
- the display mode of the message is not limited to this, and the presentation control unit 102 may control the font and the display frame of the message to be different, for example.
- In addition, the presentation control unit 102 may display the bot response together with an icon 233 obtained by processing the icon of the user 5B so that it can be understood that the message is a statement made by the bot acting on the user 5B's behalf.
- In the example shown in FIG. 7, an icon in which the user 5B is rendered as a bot character is used, but the present embodiment is not limited to this; for example, an image obtained by changing the color or tone of the user 5B's icon may be used.
- As emotion-specific bot response settings, "prior confirmation", "response time zone", "consideration of other users' responses", and the like may be set for each emotion (that is, psychological state) of the partner user.
- "Prior confirmation" is a setting that specifies whether the user himself or herself confirms the content of the bot response before it is presented to the partner user.
- When prior confirmation is set, before the server 1 sends the message generated by the message generation unit 101 to the partner user 5C, the notification control unit 103 notifies the user 5B of the message content and the like. A case in which the user 5B confirms the message in advance will be specifically described with reference to FIG. 8.
- FIG. 8 is a diagram for explaining prior confirmation to the user of the bot response according to the present embodiment.
- the chat screen 240 shown in FIG. 8 is displayed on the information processing apparatus 2 of the user 5B himself (for example, the information processing apparatus 2B shown in FIG. 1).
- the message 241 displayed on the chat screen 240 in FIG. 8 is input by the user 5B himself, and the message 242 is input by the partner user 5C.
- When it is detected that the partner user 5C's psychological state is, for example, "tired" or "hungry", a message is generated by the message generation unit 101 of the server 1, triggered by that detection.
- In this case, the server 1 notifies the user 5B in advance, via the notification control unit 103, of the message generated by the message generation unit 101.
- a message 244 shown in FIG. 8 is a notification message for such advance confirmation.
- As the icon 243, the bot icon of the user 5B is used. The user 5B confirms the content of the bot response included in the message 244, taps the OK button to approve it, or taps the cancel button to stop the transmission.
- When the OK button is tapped, the server 1 can pop up the menu screen 251 on the screen 250 as shown in FIG. 9 and allow the user 5B to choose, for example, to edit the content of the bot response.
- FIG. 9 is a diagram for explaining an example of a menu screen displayed when the bot response is permitted.
- the menu screen 251 includes a “send as it is” button, a “reference history” button, a “display other candidates” button, and an “edit” button.
- When the "send as it is" button is tapped, the server 1 performs control so that the notified bot response is presented to the partner user as it is.
- When the "reference history" button is tapped, the server 1 performs control to display the history reference screen 252 as shown in FIG. 10.
- As shown in FIG. 10, on the history reference screen 252, "self history" and "others history" are displayed as bot response candidates.
- In the self history, a history of messages with which the user's own bot (here, the user 5B's bot) has responded in the past under the same conditions (for example, the same partner user, the partner user's psychological state, time zone, and contents) is displayed. In the others history, a history of messages with which bots of other people (users other than the user 5B) have responded in the past under the same conditions is displayed.
- The user 5B can thus edit the bot response while referring to his or her own or other people's bot response history. If there is no history, the "reference history" button is not displayed on the menu screen 251 shown in FIG. 9.
- When the "display other candidates" button is tapped on the menu screen 251 of FIG. 9, the server 1 generates other bot response candidates with the message generation unit 101 and displays them.
- When the "edit" button is tapped, the server 1 displays an edit screen 253 as shown in FIG. 11, on which the text of the bot response can be edited.
- On the edit screen 253, keywords extracted by the message generation unit 101 through parsing of past message exchanges can be presented as candidates, allowing the user to edit the content of the bot response easily.
- The "response time zone" setting specifies, for each emotion, whether to perform a bot response depending on the time zone. For example, in the example shown in FIG. 5, when the partner user 5C has a "tired" or "enjoying" emotion, the bot response is set to be performed between 8:00 and 20:00 on weekdays. For other emotions not listed as specific emotions ("Others" in FIG. 5), the bot response is set to be performed at any time on weekdays, weekends, and holidays. Besides specifying a time zone, it is also possible, for example, to set the bot response to be allowed for the next eight hours because the user is at work now.
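- The "allow bot responses for the next eight hours" style of setting could, for example, be handled as a relative deadline, as in the following sketch (the helper names are hypothetical):

```python
# Sketch (assumed helper, not from the patent): a temporary allowance window
# expressed as "N hours from now", complementing fixed response time zones.
from datetime import datetime, timedelta
from typing import Optional

def allow_until(hours_from_now: float, now: Optional[datetime] = None) -> datetime:
    """Return the moment until which bot responses are temporarily allowed."""
    now = now or datetime.now()
    return now + timedelta(hours=hours_from_now)

def bot_allowed(now: datetime, allowed_until: datetime) -> bool:
    return now <= allowed_until

if __name__ == "__main__":
    start = datetime(2016, 7, 8, 9, 0)
    deadline = allow_until(8, now=start)                       # at work for 8 hours
    print(bot_allowed(datetime(2016, 7, 8, 16, 0), deadline))  # True
    print(bot_allowed(datetime(2016, 7, 8, 18, 0), deadline))  # False
```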
- Basically, the message generation unit 101 parses past messages exchanged between the partner user and the user, picks up keywords, and generates messages using these keywords and the user's usual expressions and tone. If "consideration of other users' responses" is set to ON, a message is also generated using other users' bot response information. In the example shown in FIG. 5, "consideration of other users' responses" is set to OFF, but when it is set to ON, the message generation unit 101 also generates the message using, for example, the result of parsing other users' past messages sent to partner users for whom the same emotion was detected.
- The detailed bot response settings have been described above, but the setting items according to the present embodiment are not limited to the example illustrated in FIG. 5. For example, it is also possible to set whether the tone of the message generated as the bot response should resemble the user's own tone, or be made completely different from it.
- The "Others" emotion shown in FIG. 5 covers emotions not listed as specific examples. Furthermore, for any emotion, the emotion may be treated as observed only when the partner user's emotion exceeds a threshold value. As a result, "normal times", in which no specific emotion stands out, fall under none of the emotions shown in FIG. 5, including "Others", and can be excluded from bot responses.
- The information regarding the bot response settings described above is stored in the relationship information DB 12 in association with the user ID.
- The relationship information DB 12 of the server 1 is a database that accumulates the relationship between the user represented by the bot (virtual agent) and the partner user who communicates with that user on the network.
- the relationship information DB 12 stores, for example, a past history of communication between the users (for example, a bot response history, a transmitted / received message, a “commonly used keyword” based on message analysis, etc.).
- the relationship between the users may be registered in advance by the user, or may be automatically registered by the control unit 10 of the server 1 by analyzing the past message exchange between the two.
- As the relationship between users, for example, spouses, lovers, a parent-child relationship (parent/child), siblings, a workplace relationship (superior/subordinate), friends (close friends/casual acquaintances), seniors/juniors, or the like is assumed.
- In the example described above, whether to allow the bot response is set for each specific partner, but the present embodiment is not limited to this; it is also possible to set bot response availability for each "relationship" registered in the relationship information DB 12.
- the “relationship” between the two is also used when a message is generated by the message generation unit 101, and the tone and expression of the message can be changed according to the relationship with the other party.
- FIG. 12 is a flowchart showing the first bot response generation process according to the present embodiment. As shown in FIG. 12, first, the other user's emotion and psychological state are detected by the other user's information processing apparatus 2A (step S103).
- the information processing apparatus 2A transmits the detection result to the server 1 (step S106).
- Next, the control unit 10 of the server 1 determines whether a bot response should be made to the partner user (step S109). Specifically, for example, the control unit 10 refers to the relationship information DB 12, looks up the bot response setting information registered in association with the ID of a user whose relationship with the partner user is registered, and determines whether to make a bot response based on the detected psychological state and the time zone.
- When it is determined that a bot response should be made, the message generation unit 101 refers to the relationship information DB 12 and generates a bot response based on the relationship between the partner user and the user and on the partner user's emotion and psychological state (step S112).
- the message generation unit 101 generates a message using, for example, a standard bot response table stored in the bot response DB 13.
- FIG. 13 shows an example of the standard bot response table.
- the standard bot response table 130 stores standard bot responses corresponding to the emotions of the other user and the relationship with the other user.
- the message generation unit 101 can generate a bot response according to the emotion of the other user and the relationship between the other user and the user according to the standard bot response table 130.
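- A minimal sketch of such a table-driven lookup is shown below; the entries are illustrative and are not taken from FIG. 13.

```python
# Sketch of a standard bot response table keyed by (partner's emotion, relationship).
STANDARD_BOT_RESPONSES = {
    ("tired", "parent-child"):   "Good work today. Don't push yourself too hard.",
    ("tired", "lover"):          "You must be exhausted. Let's relax together later.",
    ("anxious", "parent-child"): "It's okay, I'm always on your side.",
    ("sad", "friend"):           "I'm here if you want to talk.",
}

def standard_response(emotion: str, relationship: str) -> str:
    """Look up a standard bot response; fall back to a neutral message."""
    return STANDARD_BOT_RESPONSES.get(
        (emotion, relationship),
        "Thinking of you. Let me know if you need anything.",
    )

if __name__ == "__main__":
    print(standard_response("tired", "parent-child"))
```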
- FIG. 14 is a flowchart showing a second bot response generation process according to this embodiment.
- In the second bot response generation process, a bot response is generated by additionally using various information such as past history.
- In steps S203 to S209 shown in FIG. 14, processing similar to that in steps S103 to S109 shown in FIG. 12 is performed.
- When it is determined that a bot response should be made, the message generation unit 101 of the server 1 determines whether there has been a message exchange (question and answer) between the partner user and the user within the most recent predetermined time (step S212).
- When there has been such an exchange within the most recent predetermined time (step S212/Yes), the message generation unit 101 sets a bot response generation flag for taking the contents of that exchange into consideration (step S215). On the other hand, when there has been none (step S212/No), the flag is not set.
- Next, the message generation unit 101 determines whether the past history between the partner user and the user contains a communication history (specifically, message exchange) from a time when the partner user was in the same emotion and psychological state (step S218).
- When there is a past history for the same emotion and psychological state (step S218/Yes), the message generation unit 101 sets a bot response generation flag for also considering the responses made in the same emotion and psychological state in the past (step S221). On the other hand, when there is no such past history (step S218/No), the flag is not set.
- Next, the message generation unit 101 determines whether to take other users' response contents into account (step S224). Specifically, the message generation unit 101 makes this determination based on the ON/OFF state of "consideration of other users' responses" in the setting information stored in association with the user ID.
- When other users' response contents are to be taken into account (step S224/Yes), the message generation unit 101 sets a bot response generation flag for considering the responses of other users who have the same relationship and the same emotion and psychological state (step S227). On the other hand, when they are not to be taken into account (step S224/No), the flag is not set.
- Then, the message generation unit 101 generates a bot response according to the set response generation flags (step S230). That is, a bot response is generated using at least one of the exchange within the most recent predetermined time, the past history for the same emotion and psychological state, and the past history of other users who have the same relationship and the same emotion and psychological state.
- the message generation unit 101 can automatically generate a bot response by performing syntax analysis based on various types of information stored in the relationship information DB 12.
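- The flag-driven generation of the second process could be sketched as follows, with "generation" reduced to choosing a reference message for simplicity; all names are assumptions made for illustration.

```python
# Sketch of the second generation process (cf. Fig. 14): flags decide which
# histories are fed into generation of the bot response.
from dataclasses import dataclass

@dataclass
class GenerationFlags:
    use_recent_exchange: bool = False       # set in S215
    use_same_emotion_history: bool = False  # set in S221
    use_other_users_history: bool = False   # set in S227

def generate_with_flags(flags, recent, same_emotion_history, other_users_history):
    """Collect the allowed reference messages and base the bot response on them."""
    references = []
    if flags.use_recent_exchange:
        references += recent
    if flags.use_same_emotion_history:
        references += same_emotion_history
    if flags.use_other_users_history:
        references += other_users_history
    if not references:
        return "Hang in there!"              # standard-table fallback
    # Naive choice: reuse the most recent allowed reference as the basis.
    return references[-1]

if __name__ == "__main__":
    flags = GenerationFlags(use_recent_exchange=True, use_same_emotion_history=True)
    print(generate_with_flags(
        flags,
        recent=["Did you eat lunch?"],
        same_emotion_history=["Last time you were tired, a warm bath helped, right?"],
        other_users_history=[],
    ))
```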
- Here, the various information stored in the relationship information DB 12 will be described with reference to FIGS. 15 to 19.
- FIG. 15 is a diagram illustrating an example of the relationship information stored in the relationship information DB 12 according to the present embodiment.
- As shown in FIG. 15, the relationship information 120 includes a user ID, a partner user ID, the relationship between the two, their bot response history, a transmission message history from the user to the partner user, a received message history from the partner user, and "frequently used keywords" information extracted by parsing based on these histories.
- The data structure shown in FIG. 15 is an example, and the present embodiment is not limited to this. Examples of the tables holding the bot response history, the transmission message history, and the reception message history shown in FIG. 15 are described below.
- FIG. 16 is a diagram illustrating an example of a bot response table.
- As shown in FIG. 16, the bot response table 121 stores the contents and the date and time of past bot responses sent to the partner user, and the partner user's emotion (or psychological state) at that time.
- FIG. 17 is a diagram illustrating an example of a transmission message table. As shown in FIG. 17, the transmission message table 122 stores the contents and the date and time of past messages that the user himself or herself sent to the partner user, and the partner user's emotion (or psychological state) at that time.
- FIG. 18 is a diagram illustrating an example of a received message table.
- As shown in FIG. 18, the received message table 123 stores the contents and the date and time of past messages that the user himself or herself received from the partner user, and the partner user's emotion (or psychological state) at that time.
- the message generation unit 101 can parse the contents with reference to the transmission message table 122 and the reception message table 123, and generate a new automatic response message in consideration of the relationship between the two persons.
- FIG. 19 is a diagram showing an example of a frequently used keyword table.
- As shown in FIG. 19, keywords extracted by parsing the transmitted and received messages are stored in the frequently used keyword table 124.
- The example shown in FIG. 19 is in units of words, but frequently used messages exchanged between the user and the partner user may instead be extracted and stored for each scene. Note that the parsing information, natural language processing information, and the like used when extracting words may be held in a storage unit (not shown) of the server 1.
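- One possible way to lay out the tables of FIGS. 15 to 19 is sketched below using SQLite; the column names are inferred assumptions, not the actual schema of the disclosure.

```python
# Sketch: relationship info, bot response, sent/received message, and keyword tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE relationship_info (
    user_id TEXT, partner_id TEXT, relationship TEXT
);
CREATE TABLE bot_responses (
    user_id TEXT, partner_id TEXT, content TEXT, sent_at TEXT, partner_emotion TEXT
);
CREATE TABLE sent_messages (
    user_id TEXT, partner_id TEXT, content TEXT, sent_at TEXT, partner_emotion TEXT
);
CREATE TABLE received_messages (
    user_id TEXT, partner_id TEXT, content TEXT, received_at TEXT, partner_emotion TEXT
);
CREATE TABLE frequent_keywords (
    user_id TEXT, partner_id TEXT, keyword TEXT, count INTEGER
);
""")

conn.execute("INSERT INTO relationship_info VALUES ('5B', '5C', 'parent-child')")
conn.execute(
    "INSERT INTO sent_messages VALUES "
    "('5B', '5C', 'Curry for dinner?', '2016-07-08 16:10', 'tired')"
)

# Example query: past messages the user sent while the partner felt the same emotion.
rows = conn.execute(
    "SELECT content FROM sent_messages WHERE partner_id = ? AND partner_emotion = ?",
    ("5C", "tired"),
).fetchall()
print(rows)
```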
- FIG. 20 is a flowchart showing a bot response presentation process according to this embodiment.
- As shown in FIG. 20, the presentation control unit 102 of the server 1 first checks whether prior confirmation of the bot response is set to be performed (see FIG. 5) (step S303).
- When prior confirmation is set, the presentation control unit 102 asks the user whether the bot response may be sent (step S306).
- The inquiry to the user is as described above with reference to FIG. 8.
- When the bot response is permitted (step S309/Yes), the presentation control unit 102 checks whether display distinction of the bot response (see FIG. 5) is on (step S312).
- When display distinction of the bot response is not on (step S312/No), the presentation control unit 102 transmits the generated message to the partner user as a normal message (step S315).
- On the other hand, when display distinction of the bot response is on (step S312/Yes), the presentation control unit 102 transmits the generated message to the partner user as a bot response (step S318).
- the information processing apparatus 2A that has received the message from the server 1 displays a bot response (step S321).
- When the message has been transmitted as a normal message, the information processing apparatus 2A displays the bot response without distinguishing it from normal messages (messages input by the user), as described with reference to FIG. 6. When the message has been transmitted as a bot response, the bot response is displayed distinctly from normal messages (in a different display mode), as described with reference to FIG. 7.
- Then, the notification control unit 103 of the server 1 checks whether notification of the bot response is on (step S324), and if it is on (step S324/Yes), performs control to notify the user himself or herself of the bot response that has been made (step S327).
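- The presentation flow of FIG. 20 can be summarized as plain control flow, as in the following sketch; the delivery and notification callbacks are stand-ins for illustration, not real APIs from the disclosure.

```python
# Sketch of steps S303-S327: prior confirmation, display distinction, notification.
def present_bot_response(message, settings, ask_user, send, notify_user):
    """Return True if the bot response was delivered to the partner user."""
    if settings.get("prior_confirmation"):          # S303
        if not ask_user(message):                   # S306 / S309
            return False                            # user declined, nothing sent
    if settings.get("distinguish_display"):         # S312
        send(message, as_bot=True)                  # S318: shown distinctly
    else:
        send(message, as_bot=False)                 # S315: shown as a normal message
    if settings.get("notify_user"):                 # S324
        notify_user(message)                        # S327
    return True

if __name__ == "__main__":
    sent = present_bot_response(
        "You must be tired. Curry tonight?",
        {"prior_confirmation": True, "distinguish_display": True, "notify_user": True},
        ask_user=lambda m: True,                    # pretend the user taps OK
        send=lambda m, as_bot: print(("BOT: " if as_bot else "") + m),
        notify_user=lambda m: print("notified user 5B:", m),
    )
    print("delivered:", sent)
```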
- A computer-readable storage medium storing the computer program is also provided.
- The information processing system according to the present embodiment is not limited to text chat and can also be applied to voice chat.
- For example, a voice chat system that converts text into speech and plays it back is assumed.
- When applied to a voice chat system, the server 1 can generate synthesized speech using the phoneme data of the person represented by the bot when the message generation unit 101 generates a bot response message, and can thereby control the bot message to be output in a voice similar to that person's.
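- A sketch of this voice-chat variant follows; `synthesize_with_voice` is a hypothetical placeholder for whatever speech synthesis engine would actually be used, not a real library call.

```python
# Sketch: a bot response text synthesized with voice data of the represented user.
def synthesize_with_voice(text: str, phoneme_data: dict) -> bytes:
    """Placeholder: a real system would call a TTS engine conditioned on the user's voice."""
    return f"[audio:{phoneme_data.get('speaker', 'default')}] {text}".encode("utf-8")

def send_voice_bot_response(text: str, user_voice: dict, play):
    audio = synthesize_with_voice(text, user_voice)
    play(audio)

if __name__ == "__main__":
    send_voice_bot_response(
        "You must be tired. Curry tonight?",
        {"speaker": "user_5B"},
        play=lambda audio: print("playing", audio),
    )
```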
- Note that the present technology can also take the following configurations.
- (1) An information processing system including: a database that accumulates the relationship between a user represented by a virtual agent and a partner user who is a communication partner on a network; and a control unit that generates a message according to the psychological state of the partner user and the relationship between the user and the partner user, and transmits the message from the virtual agent to the partner user.
- (2) The information processing system according to (1), wherein the control unit transmits the message to the partner user and also transmits it to the user himself or herself.
- (3) The information processing system according to (1) or (2), wherein the control unit automatically generates a message from the tendency of messages transmitted and received in the past between the user and the partner user when the partner user was in the same psychological state.
- (4) The information processing system according to (1) or (2), wherein the control unit automatically generates a message from the tendency of messages transmitted and received between other users who have the same relationship as that between the user and the partner user.
- (5) The information processing system according to any one of (1) to (4), wherein the control unit sets, according to a setting input via a specific user interface, whether the user permits messages to be automatically sent from the virtual agent to the partner user.
- (6) The information processing system according to any one of (1) to (5), wherein the control unit determines, according to a setting input via a specific user interface, whether to obtain permission from the user before sending a message to the partner user.
- (7) The information processing system according to any one of (1) to (6), wherein the control unit sets, according to a setting input via a specific user interface, whether a message from the user and a message from the virtual agent are displayed in an identifiable manner.
- (8) The information processing system according to any one of (1) to (7), wherein the control unit sets, according to a setting input via a specific user interface and for each psychological state of the partner user, whether the user permits messages to be automatically sent from the virtual agent to the partner user.
- (9) The information processing system according to any one of (1) to (8), wherein the control unit sets, according to a setting input via a specific user interface and depending on the time zone, whether the user permits messages to be automatically sent from the virtual agent to the partner user.
- (10) The information processing system according to any one of (1) to (9), wherein the control unit edits the generated message in response to a request from the user himself or herself.
- (11) An information processing method including: a processor generating a message according to the psychological state of a partner user who is a communication partner on a network and the relationship, stored in a database, between the partner user and a user represented by a virtual agent; and transmitting the message from the virtual agent to the partner user.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- Information Transfer Between Computers (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
1. Overview of an information processing system according to an embodiment of the present disclosure
2. Configuration of the server
3. Operation processing
3-1. First bot response generation process
3-2. Second bot response generation process
3-3. Bot response presentation process
4. Summary
FIG. 1 is a diagram illustrating an overview of an information processing system according to an embodiment of the present disclosure. As shown in FIG. 1, when it is sensed that a user 5A (for example, a child) is in an anxious psychological state, a virtual agent 5Bagent representing a user 5B (for example, the mother) related to the user 5A is controlled so that a message corresponding to the psychological state reaches the user 5A, which can reassure the user 5A. In particular, when the user 5B is busy and cannot respond immediately even if notified by the user 5A or the system, having the virtual agent 5Bagent respond on the user 5B's behalf is highly convenient for the user 5B as well. The user 5A can also feel relieved or satisfied by seeing a message that matches his or her own psychological state (for example, happily excited, lonely and wanting comfort, or nervous and in trouble). Furthermore, because the message from the virtual agent 5Bagent takes into account the relationship between the represented user 5B and the user 5A who receives it, the message feels more personal and has a higher affinity for the partner user 5A.
FIG. 3 is a block diagram illustrating an example of the configuration of the server 1 according to the present embodiment. As shown in FIG. 3, the server 1 includes a control unit 10, a communication unit 11, a relationship information DB (database) 12, and a bot response DB 13.
In the example shown in FIG. 5, "consideration of other users' responses" is set to OFF in every case, but when it is set to ON, the message generation unit 101 also generates the message using, for example, the result of parsing other users' past messages sent to partner users for whom the same emotion was detected.
Next, the operation processing of the information processing system according to the present embodiment will be described.
FIG. 12 is a flowchart showing the first bot response generation process according to the present embodiment. As shown in FIG. 12, first, the partner user's emotion and psychological state are detected by the partner user's information processing apparatus 2A (step S103).
FIG. 14 is a flowchart showing the second bot response generation process according to the present embodiment. In the second bot response generation process, a bot response is generated by additionally using various information such as past history.
Next, the processing for presenting the bot response message generated by the first or second bot response generation process to the partner user will be described with reference to FIG. 20. FIG. 20 is a flowchart showing the bot response presentation process according to the present embodiment.
As described above, the information processing system according to the embodiment of the present disclosure can perform an automatic response with higher affinity according to the user's psychological state.
(1)
仮想エージェントによって代理されるユーザ本人と、ネットワーク上でのコミュニケーションの相手である相手ユーザとの関係を蓄積するデータベースと、
前記相手ユーザの心理状態と、前記ユーザ本人と前記相手ユーザとの関係に応じて、メッセージを生成し;
前記仮想エージェントから前記相手ユーザに送信する制御部と、
を備える、
情報処理システム。
(2)
前記制御部は、前記メッセージを前記相手ユーザに送信すると共に、前記ユーザ本人にも送信する、前記(1)に記載の情報処理システム。
(3)
前記制御部は、前記相手ユーザが前記心理状態であって、前記ユーザ本人と前記相手ユーザとの間で過去に送受信されたメッセージの傾向からメッセージを自動的に生成する、前記(1)または(2)に記載の情報処理システム。
(4)
前記制御部は、前記ユーザ本人と前記相手ユーザとの間の関係と同じ関係である、他のユーザ間において送受信されたメッセージの傾向からメッセージを自動的に生成する、前記(1)または(2)に記載の情報処理システム。
(5)
前記制御部は、特定のユーザーインタフェースを介して入力された設定に応じて、前記ユーザ本人が、相手ユーザに対して前記仮想エージェントより自動的にメッセージを送信することを許可するか否かを設定する、前記(1)~(4)のいずれか1項に記載の情報処理システム。
(6)
前記制御部は、特定のユーザーインタフェースを介して入力された設定に応じて、前記相手ユーザに対してメッセージを送信する前に前記ユーザ本人に許可を得るか否かを判断する、前記(1)~(5)のいずれか1項に記載の情報処理システム。
(7)
前記制御部は、特定のユーザーインタフェースを介して入力された設定に応じて、前記ユーザ本人からのメッセージか、前記仮想エージェントからのメッセージかを識別可能に表示するか否かを設定する、前記(1)~(6)のいずれか1項に記載の情報処理システム。
(8)
前記制御部は、特定のユーザーインタフェースを介して入力された設定に応じて、前記相手ユーザの心理状態毎に、前記ユーザ本人が相手ユーザに対して前記仮想エージェントより自動的にメッセージを送信することを許可するか否かを設定する、前記(1)~(7)のいずれか1項に記載の情報処理システム。
(9)
前記制御部は、特定のユーザーインタフェースを介して入力された設定に応じて、時間帯に応じて、前記ユーザ本人が、相手ユーザに対して前記仮想エージェントより自動的にメッセージを送信することを許可するか否かを設定する、前記(1)~(8)のいずれか1項に記載の情報処理システム。
(10)
前記制御部は、前記ユーザ本人による要求に応じて、前記生成したメッセージを編集する、前記(1)~(9)のいずれか1項に記載の情報処理システム。
(11)
プロセッサが、
ネットワーク上でのコミュニケーションの相手である相手ユーザの心理状態と、データベースに蓄積されている、仮想エージェントによって代理されるユーザ本人と前記相手ユーザとの関係に応じて、メッセージを生成することと、
前記仮想エージェントから前記相手ユーザに送信することと、
を含む、情報処理方法。
10 Control unit
101 Message generation unit
102 Presentation control unit
103 Notification control unit
104 Setting unit
11 Communication unit
12 Relationship information DB
13 Bot response DB
2 Information processing apparatus
3 Wearable terminal
4 Network
Claims (11)
- An information processing system comprising: a database that accumulates the relationship between a user represented by a virtual agent and a partner user who is a communication partner on a network; and a control unit that generates a message according to the psychological state of the partner user and the relationship between the user and the partner user, and transmits the message from the virtual agent to the partner user.
- The information processing system according to claim 1, wherein the control unit transmits the message to the partner user and also transmits it to the user himself or herself.
- The information processing system according to claim 1, wherein the control unit automatically generates a message from the tendency of messages transmitted and received in the past between the user and the partner user when the partner user was in the same psychological state.
- The information processing system according to claim 1, wherein the control unit automatically generates a message from the tendency of messages transmitted and received between other users who have the same relationship as that between the user and the partner user.
- The information processing system according to claim 1, wherein the control unit sets, according to a setting input via a specific user interface, whether the user permits messages to be automatically sent from the virtual agent to the partner user.
- The information processing system according to claim 1, wherein the control unit determines, according to a setting input via a specific user interface, whether to obtain permission from the user before sending a message to the partner user.
- The information processing system according to claim 1, wherein the control unit sets, according to a setting input via a specific user interface, whether a message from the user and a message from the virtual agent are displayed in an identifiable manner.
- The information processing system according to claim 1, wherein the control unit sets, according to a setting input via a specific user interface and for each psychological state of the partner user, whether the user permits messages to be automatically sent from the virtual agent to the partner user.
- The information processing system according to claim 1, wherein the control unit sets, according to a setting input via a specific user interface and depending on the time zone, whether the user permits messages to be automatically sent from the virtual agent to the partner user.
- The information processing system according to claim 1, wherein the control unit edits the generated message in response to a request from the user himself or herself.
- An information processing method including: a processor generating a message according to the psychological state of a partner user who is a communication partner on a network and the relationship, stored in a database, between the partner user and a user represented by a virtual agent; and transmitting the message from the virtual agent to the partner user.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020187010075A KR20180068975A (ko) | 2015-10-20 | 2016-07-08 | 정보 처리 시스템, 및 정보 처리 방법 |
CN201680060292.5A CN108139988B (zh) | 2015-10-20 | 2016-07-08 | 信息处理系统和信息处理方法 |
EP16857134.7A EP3367249A4 (en) | 2015-10-20 | 2016-07-08 | Information processing system and information processing method |
US15/755,361 US10673788B2 (en) | 2015-10-20 | 2016-07-08 | Information processing system and information processing method |
US16/839,060 US20200236070A1 (en) | 2015-10-20 | 2020-04-02 | Information processing system and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-206443 | 2015-10-20 | ||
JP2015206443 | 2015-10-20 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/755,361 A-371-Of-International US10673788B2 (en) | 2015-10-20 | 2016-07-08 | Information processing system and information processing method |
US16/839,060 Continuation US20200236070A1 (en) | 2015-10-20 | 2020-04-02 | Information processing system and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017068816A1 true WO2017068816A1 (ja) | 2017-04-27 |
Family
ID=58556912
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/070207 WO2017068816A1 (ja) | 2015-10-20 | 2016-07-08 | 情報処理システム、および情報処理方法 |
Country Status (5)
Country | Link |
---|---|
US (2) | US10673788B2 (ja) |
EP (1) | EP3367249A4 (ja) |
KR (1) | KR20180068975A (ja) |
CN (2) | CN113612677A (ja) |
WO (1) | WO2017068816A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019057168A (ja) * | 2017-09-21 | 2019-04-11 | 大日本印刷株式会社 | コンピュータプログラム、サーバ装置、指導システム、指導方法、指導者端末装置及び被指導者端末装置 |
WO2019116489A1 (ja) * | 2017-12-14 | 2019-06-20 | Line株式会社 | プログラム、情報処理方法、及び情報処理装置 |
WO2019172087A1 (ja) * | 2018-03-08 | 2019-09-12 | ソニー株式会社 | 情報処理装置、端末機器、情報処理方法、およびプログラム |
WO2019188523A1 (ja) * | 2018-03-27 | 2019-10-03 | 株式会社Nttドコモ | 応答システム |
US10748644B2 (en) | 2018-06-19 | 2020-08-18 | Ellipsis Health, Inc. | Systems and methods for mental health assessment |
US11120895B2 (en) | 2018-06-19 | 2021-09-14 | Ellipsis Health, Inc. | Systems and methods for mental health assessment |
CN114037467A (zh) * | 2015-10-20 | 2022-02-11 | 索尼公司 | 信息处理系统、信息处理方法和计算机可读存储介质 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102701868B1 (ko) | 2018-12-03 | 2024-09-03 | 삼성전자주식회사 | 전자 장치 및 전자 장치의 제어 방법 |
WO2022000256A1 (en) * | 2020-06-30 | 2022-01-06 | Ringcentral, Inc. | Methods and systems for directing communications |
CN113676394B (zh) * | 2021-08-19 | 2023-04-07 | 维沃移动通信(杭州)有限公司 | 信息处理方法和信息处理装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10222521A (ja) * | 1997-01-31 | 1998-08-21 | Toshiba Corp | 情報共有支援システム |
US5918222A (en) * | 1995-03-17 | 1999-06-29 | Kabushiki Kaisha Toshiba | Information disclosing apparatus and multi-modal information input/output system |
JP2005348167A (ja) * | 2004-06-03 | 2005-12-15 | Vodafone Kk | 移動体通信端末 |
US20080262982A1 (en) * | 2007-04-23 | 2008-10-23 | Sudhir Rajkhowa | System for therapy |
JP2013061889A (ja) * | 2011-09-14 | 2013-04-04 | Namco Bandai Games Inc | プログラム、情報記憶媒体、端末装置及びサーバ |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5727950A (en) * | 1996-05-22 | 1998-03-17 | Netsage Corporation | Agent based instruction system and method |
US6427063B1 (en) * | 1997-05-22 | 2002-07-30 | Finali Corporation | Agent based instruction system and method |
US6339774B1 (en) * | 1997-01-29 | 2002-01-15 | Kabushiki Kaisha Toshiba | Information sharing system and computer program product for causing computer to support the information sharing system |
US6615091B1 (en) * | 1998-06-26 | 2003-09-02 | Eveready Battery Company, Inc. | Control system and method therefor |
US7353295B1 (en) * | 2000-04-04 | 2008-04-01 | Motive, Inc. | Distributed services architecture through use of a dynamic service point map |
US20030158960A1 (en) * | 2000-05-22 | 2003-08-21 | Engberg Stephan J. | System and method for establishing a privacy communication path |
WO2003001413A1 (en) * | 2001-06-22 | 2003-01-03 | Nosa Omoigui | System and method for knowledge retrieval, management, delivery and presentation |
US7590589B2 (en) * | 2004-09-10 | 2009-09-15 | Hoffberg Steven M | Game theoretic prioritization scheme for mobile ad hoc networks permitting hierarchal deference |
US20070044539A1 (en) * | 2005-03-01 | 2007-03-01 | Bryan Sabol | System and method for visual representation of a catastrophic event and coordination of response |
WO2007052285A2 (en) * | 2005-07-22 | 2007-05-10 | Yogesh Chunilal Rathod | Universal knowledge management and desktop search system |
US8874477B2 (en) * | 2005-10-04 | 2014-10-28 | Steven Mark Hoffberg | Multifactorial optimization system and method |
US8151323B2 (en) * | 2006-04-12 | 2012-04-03 | Citrix Systems, Inc. | Systems and methods for providing levels of access and action control via an SSL VPN appliance |
JP4710705B2 (ja) | 2006-04-24 | 2011-06-29 | 日本電気株式会社 | 携帯無線端末システム及び携帯無線端末 |
CN101075435B (zh) * | 2007-04-19 | 2011-05-18 | 深圳先进技术研究院 | 一种智能聊天系统及其实现方法 |
JP5418938B2 (ja) | 2009-03-04 | 2014-02-19 | 株式会社国際電気通信基礎技術研究所 | グループ行動推定装置およびサービス提供システム |
US20180143989A1 (en) * | 2016-11-18 | 2018-05-24 | Jagadeshwar Nomula | System to assist users of a software application |
US8762468B2 (en) * | 2011-11-30 | 2014-06-24 | At&T Mobility Ii, Llc | Method and apparatus for managing communication exchanges |
EP2624180A1 (en) * | 2012-02-06 | 2013-08-07 | Xabier Uribe-Etxebarria Jimenez | System of integrating remote third party services |
US20140310379A1 (en) * | 2013-04-15 | 2014-10-16 | Flextronics Ap, Llc | Vehicle initiated communications with third parties via virtual personality |
US9425974B2 (en) * | 2012-08-15 | 2016-08-23 | Imvu, Inc. | System and method for increasing clarity and expressiveness in network communications |
US9679300B2 (en) * | 2012-12-11 | 2017-06-13 | Nuance Communications, Inc. | Systems and methods for virtual agent recommendation for multiple persons |
WO2014093339A1 (en) * | 2012-12-11 | 2014-06-19 | Nuance Communications, Inc. | System and methods for virtual agent recommendation for multiple persons |
US9721086B2 (en) * | 2013-03-15 | 2017-08-01 | Advanced Elemental Technologies, Inc. | Methods and systems for secure and reliable identity-based computing |
US9288274B2 (en) * | 2013-08-26 | 2016-03-15 | Cellco Partnership | Determining a community emotional response |
CN103400054A (zh) * | 2013-08-27 | 2013-11-20 | 哈尔滨工业大学 | 计算机辅助心理咨询自动问答机器人系统 |
JP2015069455A (ja) * | 2013-09-30 | 2015-04-13 | Necソリューションイノベータ株式会社 | 会話文生成装置、会話文生成方法、及びプログラム |
CN104639420B (zh) * | 2013-11-15 | 2019-06-07 | 腾讯科技(深圳)有限公司 | 即时通讯的信息处理方法和系统 |
US10050926B2 (en) * | 2014-02-05 | 2018-08-14 | Facebook, Inc. | Ideograms based on sentiment analysis |
CN104951428B (zh) * | 2014-03-26 | 2019-04-16 | 阿里巴巴集团控股有限公司 | 用户意图识别方法及装置 |
WO2015176287A1 (zh) * | 2014-05-22 | 2015-11-26 | 华为技术有限公司 | 应用文本信息进行通信的方法及装置 |
CN104022942B (zh) * | 2014-06-26 | 2018-09-11 | 北京奇虎科技有限公司 | 处理交互式消息的方法、客户端、电子设备及系统 |
US20170046496A1 (en) * | 2015-08-10 | 2017-02-16 | Social Health Innovations, Inc. | Methods for tracking and responding to mental health changes in a user |
US10532268B2 (en) * | 2016-05-02 | 2020-01-14 | Bao Tran | Smart device |
WO2019161207A1 (en) * | 2018-02-15 | 2019-08-22 | DMAI, Inc. | System and method for conversational agent via adaptive caching of dialogue tree |
-
2016
- 2016-07-08 CN CN202110785593.7A patent/CN113612677A/zh not_active Withdrawn
- 2016-07-08 EP EP16857134.7A patent/EP3367249A4/en not_active Ceased
- 2016-07-08 KR KR1020187010075A patent/KR20180068975A/ko active IP Right Grant
- 2016-07-08 WO PCT/JP2016/070207 patent/WO2017068816A1/ja active Application Filing
- 2016-07-08 CN CN201680060292.5A patent/CN108139988B/zh active Active
- 2016-07-08 US US15/755,361 patent/US10673788B2/en active Active
-
2020
- 2020-04-02 US US16/839,060 patent/US20200236070A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5918222A (en) * | 1995-03-17 | 1999-06-29 | Kabushiki Kaisha Toshiba | Information disclosing apparatus and multi-modal information input/output system |
JPH10222521A (ja) * | 1997-01-31 | 1998-08-21 | Toshiba Corp | 情報共有支援システム |
JP2005348167A (ja) * | 2004-06-03 | 2005-12-15 | Vodafone Kk | 移動体通信端末 |
US20080262982A1 (en) * | 2007-04-23 | 2008-10-23 | Sudhir Rajkhowa | System for therapy |
JP2013061889A (ja) * | 2011-09-14 | 2013-04-04 | Namco Bandai Games Inc | プログラム、情報記憶媒体、端末装置及びサーバ |
Non-Patent Citations (1)
Title |
---|
See also references of EP3367249A4 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114037467B (zh) * | 2015-10-20 | 2024-08-13 | 索尼公司 | 信息处理系统、信息处理方法和计算机可读存储介质 |
CN114037467A (zh) * | 2015-10-20 | 2022-02-11 | 索尼公司 | 信息处理系统、信息处理方法和计算机可读存储介质 |
JP2019057168A (ja) * | 2017-09-21 | 2019-04-11 | 大日本印刷株式会社 | コンピュータプログラム、サーバ装置、指導システム、指導方法、指導者端末装置及び被指導者端末装置 |
JPWO2019116489A1 (ja) * | 2017-12-14 | 2020-12-17 | Line株式会社 | プログラム、情報処理方法、及び情報処理装置 |
JP7072584B2 (ja) | 2017-12-14 | 2022-05-20 | Line株式会社 | プログラム、情報処理方法、及び情報処理装置 |
WO2019116489A1 (ja) * | 2017-12-14 | 2019-06-20 | Line株式会社 | プログラム、情報処理方法、及び情報処理装置 |
CN111788566A (zh) * | 2018-03-08 | 2020-10-16 | 索尼公司 | 信息处理设备、终端装置、信息处理方法和程序 |
WO2019172087A1 (ja) * | 2018-03-08 | 2019-09-12 | ソニー株式会社 | 情報処理装置、端末機器、情報処理方法、およびプログラム |
US11330408B2 (en) | 2018-03-08 | 2022-05-10 | Sony Corporation | Information processing apparatus, terminal device, and information processing method |
WO2019188523A1 (ja) * | 2018-03-27 | 2019-10-03 | 株式会社Nttドコモ | 応答システム |
JPWO2019188523A1 (ja) * | 2018-03-27 | 2021-04-01 | 株式会社Nttドコモ | 応答システム |
JP7258013B2 (ja) | 2018-03-27 | 2023-04-14 | 株式会社Nttドコモ | 応答システム |
US10748644B2 (en) | 2018-06-19 | 2020-08-18 | Ellipsis Health, Inc. | Systems and methods for mental health assessment |
US11120895B2 (en) | 2018-06-19 | 2021-09-14 | Ellipsis Health, Inc. | Systems and methods for mental health assessment |
US11942194B2 (en) | 2018-06-19 | 2024-03-26 | Ellipsis Health, Inc. | Systems and methods for mental health assessment |
Also Published As
Publication number | Publication date |
---|---|
EP3367249A1 (en) | 2018-08-29 |
CN108139988A (zh) | 2018-06-08 |
CN113612677A (zh) | 2021-11-05 |
CN108139988B (zh) | 2021-07-30 |
KR20180068975A (ko) | 2018-06-22 |
US10673788B2 (en) | 2020-06-02 |
US20180248819A1 (en) | 2018-08-30 |
EP3367249A4 (en) | 2018-12-05 |
US20200236070A1 (en) | 2020-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017068816A1 (ja) | 情報処理システム、および情報処理方法 | |
US8903176B2 (en) | Systems and methods using observed emotional data | |
CN105320726B (zh) | 降低对手动开始/结束点和触发短语的需求 | |
US20200293526A1 (en) | Information processing system, information processing method, and storage medium | |
WO2016178329A1 (ja) | 情報処理システム、制御方法、および記憶媒体 | |
CN108139880B (zh) | 用于语音生成设备的定向个人通信 | |
KR20180073566A (ko) | 정보 처리 시스템 및 정보 처리 방법 | |
US20160128617A1 (en) | Social cuing based on in-context observation | |
EP3693847B1 (en) | Facilitating awareness and conversation throughput in an augmentative and alternative communication system | |
WO2017062163A1 (en) | Proxies for speech generating devices | |
JP2016103081A (ja) | 会話分析装置、会話分析システム、会話分析方法及び会話分析プログラム | |
CN109804407B (zh) | 关心维持系统以及服务器 | |
JP6291303B2 (ja) | コミュニケーション支援ロボットシステム | |
WO2018173383A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP2018024058A (ja) | 接客装置、接客方法及び接客システム | |
JP2007334732A (ja) | ネットワークシステム及びネットワーク情報送受信方法 | |
Chen et al. | From Gap to Synergy: Enhancing Contextual Understanding through Human-Machine Collaboration in Personalized Systems | |
CN108885594B (zh) | 信息处理装置、信息处理方法和程序 | |
CN108351846B (zh) | 通信系统和通信控制方法 | |
KR20210088824A (ko) | 인공지능 대화 서비스를 이용한 채팅 시스템 및 그 동작방법 | |
JP7550335B1 (ja) | システム | |
JP5388923B2 (ja) | 教育サポート装置およびコンピュータプログラム | |
Yonezawa et al. | Anthropomorphic awareness of partner robot to user’s situation based on gaze and speech detection | |
CN114911346A (zh) | 一种终端设备的交互方法和装置 | |
JP2021086354A (ja) | 情報処理システム、情報処理方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16857134 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15755361 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20187010075 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016857134 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: JP |