WO2016199464A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program (情報処理装置、情報処理方法及びプログラム)
- Publication number
- WO2016199464A1 (PCT/JP2016/056481)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- topic, user, behavior, information, information processing
Classifications
- G06F16/00 — Information retrieval; database structures therefor; file system structures therefor
- G06F16/335 — Querying; filtering based on additional data, e.g. user or group profiles
- G06F16/3329 — Natural language query formulation or dialogue systems
- G06F16/9035 — Querying; filtering based on additional data, e.g. user or group profiles
- G06Q50/01 — Social networking
- G06Q50/10 — Services
- G10L15/1815 — Semantic context, e.g. disambiguation of the recognition hypotheses based on word meaning
- G10L15/26 — Speech to text systems
- G10L2015/228 — Procedures used during a speech recognition process, e.g. man-machine dialogue, using non-speech characteristics of application context
- H04L51/52 — User-to-user messaging in packet-switching networks for supporting social networking services
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- as a technique for providing a user with a fun and natural conversation, for example, Patent Document 1 discloses a conversation processing device in which topic information provided from a database is filtered by data related to the user's tastes, and the remaining information is used in the conversation with the user.
- the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of appropriately presenting a topic in consideration of the behavior of a user presenting the topic.
- according to the present disclosure, there is provided an information processing apparatus including a topic generation unit that generates topics with a conversation partner based on a comparison result between a user's actions and comparison information related to those actions, and a topic proposal unit that determines, from the generated topics, topic candidates to be proposed to the user.
- according to the present disclosure, there is also provided an information processing method in which a processor generates topics with a conversation partner based on a comparison result between a user's actions and comparison information related to those actions, and determines, from the generated topics, topic candidates to be proposed to the user.
- further, according to the present disclosure, there is provided a program that causes a computer to function as an information processing apparatus including a topic generation unit that generates topics with a conversation partner based on a comparison result between a user's actions and comparison information related to those actions, and a topic proposal unit that determines, from the generated topics, topic candidates to be proposed to the user.
- FIG. 1 is an explanatory diagram showing an overview of a topic providing system 1 according to the present embodiment.
- the topic providing system 1 is a system that provides a topic to a user who has a conversation via the server 10, as shown in FIG.
- the server 10 can acquire information regarding each user's behavior, and generates and provides an appropriate topic based on that information. For example, when user A and user B shown in FIG. 1 meet for the first time, or know each other only slightly, such as by having participated in the same event that day, the server 10 can provide an appropriate topic to each user based on the actions of users A and B.
- the server 10 can acquire information related to user behavior from other servers, various terminals owned by each user, and the like.
- examples of information related to user behavior include sensor information, such as time information and position information, acquired using various sensing techniques, and user action information recognized by analyzing such sensor information.
- the server 10, functioning as an information processing apparatus for providing topics in the topic providing system 1 according to the present embodiment, generates topics with a conversation partner based on a comparison result between a user's actions and comparison information related to those actions. The server 10 then extracts an appropriate topic from the generated topics and presents it to the user.
- the configuration and function of the topic providing system 1 according to the present embodiment will be described in detail.
- FIG. 2 is a functional block diagram showing a functional configuration of the information processing apparatus 100 according to the present embodiment.
- the information processing apparatus 100 is provided in the server 10. As illustrated in FIG. 2, the information processing apparatus 100 includes a topic generation unit 110, a topic proposal unit 120, and a topic presentation processing unit 130.
- the topic generation unit 110 generates a topic with the conversation partner based on information regarding the behavior of the user having a conversation.
- the topic generation unit 110 acquires information about the behavior of the users having a conversation, extracts information likely to become a topic, such as changes in behavior that the user has not noticed or deviations from others, and sets it as a topic candidate.
- the topic generation unit 110 includes, for example, a self-behavior analysis unit 112, a common behavior analysis unit 114, and a profile analysis unit 116, as shown in FIG.
- the self-behavior analysis unit 112 analyzes the behavior of the user who intends to present a topic, extracts the user's characteristic behavior tendencies, changes in behavior, and the like, and turns them into topics. For example, the self-behavior analysis unit 112 extracts a profile representing hobbies, tendencies, preferences, and the like from the user's own past behavior logs, or extracts a difference between the user's past behavior log and the latest behavior log.
- the common behavior analysis unit 114 analyzes the common behavior that each user having a conversation has performed together, extracts common points and deviations of the behavior of each user, and makes them a topic.
- the common behavior analysis unit 114 compares, for example, the overlap of profiles of each user, or extracts the behavior of a user who tries to present a topic with respect to the behavior frequently performed by the conversation partner.
- the profile analysis unit 116 generates a profile for each user, and generates a topic by matching the generated profiles. For example, the profile analysis unit 116 analyzes the stay position of the user and the transition between the stay positions from the time-series position information acquired as sensor information, and generates a profile that represents the user's hobbies, trends, preferences, and the like. At this time, the user's action recognition result recognized separately by the action recognition technique may be considered. The profile analysis unit 116 uses the generated profile to match profiles of each user who has a conversation and generates a common topic.
- the topic generation unit 110 may generate a topic by causing at least one of the self-behavior analysis unit 112, the common behavior analysis unit 114, and the profile analysis unit 116 to function.
- the topic generation unit 110 outputs the generated topic to the topic proposal unit 120.
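The flow described above can be sketched in code. The following is a hypothetical illustration of the topic generation unit 110 combining its three analyzers; all function names, log formats, and extraction rules are invented for illustration and are not taken from the patent.

```python
# Hypothetical sketch of topic generation unit 110: three analyzers
# (self-behavior, common-behavior, profile) each propose candidate
# topics, and the generator merges their outputs.

def self_behavior_topics(user_log):
    """Topics from changes in the user's own behavior log."""
    past, latest = set(user_log["past"]), set(user_log["latest"])
    # actions appearing only in the latest log are behavior changes
    return [f"recently started: {a}" for a in sorted(latest - past)]

def common_behavior_topics(user_log, partner_log):
    """Topics from actions both users have performed."""
    common = set(user_log["latest"]) & set(partner_log["latest"])
    return [f"you both: {a}" for a in sorted(common)]

def profile_topics(user_profile, partner_profile):
    """Topics from matching the two users' profile categories."""
    shared = set(user_profile) & set(partner_profile)
    return [f"shared interest: {c}" for c in sorted(shared)]

def generate_topics(user_log, partner_log, user_profile, partner_profile):
    # at least one of the three analyzers contributes candidates
    return (self_behavior_topics(user_log)
            + common_behavior_topics(user_log, partner_log)
            + profile_topics(user_profile, partner_profile))

user = {"past": ["walk"], "latest": ["walk", "run"]}
partner = {"past": ["run"], "latest": ["run"]}
topics = generate_topics(user, partner, ["active sports"], ["active sports"])
```

In line with the description, any subset of the three analyzers could be run; merging all three simply yields the largest candidate pool.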
- the topic suggestion unit 120 extracts a topic suitable for a place where a conversation is performed from the generated topic, and performs a process for proposing the topic candidate to the user.
- the topic proposal unit 120 includes, for example, an excitement degree calculation unit 122, a topic extraction unit 124, and an expectation degree calculation unit 126.
- the excitement degree calculation unit 122 calculates the excitement degree of the communication place formed by the user and the conversation partner.
- the degree of excitement calculation unit 122 may calculate the degree of excitement based on, for example, an evaluation function related to the reaction of the speaking partner with respect to the user's utterance.
- the degree of excitement calculation unit 122 outputs the calculated degree of excitement to the topic extraction unit 124.
- the topic extraction unit 124 extracts topics according to the degree of excitement from the topics generated by the topic generation unit 110. For example, when the place of communication is livened up by the introduced topic, the topic extraction unit 124 adds related information on the current topic to the topic candidates. On the other hand, when the place of communication is not livened up by the introduced topic, new topics different from it are added to the topic candidates. Further, the topic extraction unit 124 may weight each topic generated by the topic generation unit 110 based on state information determined according to the situation in which the user and the conversation partner are conversing, so that more appropriate topics are extracted as topic candidates.
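The branching behavior of the topic extraction unit 124 can be sketched as follows. This is an illustrative assumption, not the patent's actual algorithm; the threshold value and data shapes are invented.

```python
# Illustrative sketch: widen the current topic when the conversation is
# lively, switch to fresh topics when it is not.

EXCITEMENT_THRESHOLD = 0.5  # assumed cut-off between "lively" and "flat"

def extract_topics(generated, current_topic, related, excitement):
    """Return topic candidates given the excitement degree (0.0-1.0)."""
    if excitement >= EXCITEMENT_THRESHOLD:
        # the place is lively: add information related to the current topic
        return [current_topic] + related.get(current_topic, [])
    # the place is flat: propose new topics different from the current one
    return [t for t in generated if t != current_topic]

generated = ["running", "travel", "food"]
related = {"running": ["running shoes", "marathon events"]}

lively = extract_topics(generated, "running", related, excitement=0.8)
flat = extract_topics(generated, "running", related, excitement=0.2)
```

A per-topic weight based on conversation-state information, as the description mentions, could be folded in by sorting the returned candidates by such a weight.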
- the topic extracted by the topic extraction unit 124 is output to the topic presentation processing unit 130 as a topic candidate proposed to the user.
- the extracted topic is output to the expectation degree calculation unit 126 as necessary.
- the expectation degree calculation unit 126 calculates an expectation degree that a place for communication is excited by the introduction of a topic for each topic extracted as a topic candidate.
- the degree of expectation is a predicted effect value that expresses, as a random variable, how much the place is expected to liven up when the topic is introduced.
- the expectation degree calculation unit 126 calculates an expectation degree for each topic and outputs it to the topic presentation processing unit 130.
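Treating the excitement produced by a topic as a random variable, the expectation degree can be computed as an ordinary expected value. The per-topic outcome probabilities below are invented purely for illustration.

```python
# Minimal sketch of expectation degree calculation unit 126: score each
# candidate topic by the expected value of the excitement it produces.

def expectation(outcomes):
    """Expected excitement: sum of excitement * probability."""
    return sum(excitement * p for excitement, p in outcomes)

# per-topic predicted (excitement, probability) pairs -- assumed values
candidates = {
    "running": [(1.0, 0.6), (0.2, 0.4)],   # likely to liven things up
    "weather": [(1.0, 0.1), (0.2, 0.9)],   # probably falls flat
}

scores = {topic: expectation(o) for topic, o in candidates.items()}
best = max(scores, key=scores.get)
```

The topic presentation processing unit could then prefer candidates with the highest expectation degree.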
- the topic presentation processing unit 130 performs processing for presenting the topic candidates extracted by the topic suggestion unit 120 to the user.
- the topic presentation processing unit 130 may change how the topic is presented depending on the situation in which the user and the conversation partner are talking. For example, when a topic should be provided without being noticed by the surroundings, it may be casually presented only to the user who intends to provide it. Alternatively, when presenting a topic also serves as an entertainment element that excites the venue, the topic may be presented to every user participating in the conversation.
- the topic presentation processing unit 130 performs predetermined processing so that the topic candidate can be presented on the device that presents the topic candidate to the user, and outputs it to each device.
- FIG. 3 is a flowchart showing a topic providing method using the topic providing system 1 according to the present embodiment.
- the topic providing system 1 first performs processing for generating a topic to be presented to the user by the topic generating unit 110 (S110).
- a topic with a conversation partner is generated based on information on the behavior of a user having a conversation.
- the topic generation unit 110 acquires information about the behavior of the user who is having a conversation, extracts information that becomes a topic from a change in behavior that is not noticed by itself, a deviation from others, and the like, and sets it as a topic candidate.
- examples of methods for generating a topic using information related to user behavior include the following: (1) generating topics from information related to the user's own behavior; (2) generating topics from behavior common among users; (3) generating topics by matching each user's profile. Each method will be explained below.
- FIG. 4 is an explanatory diagram illustrating comparison of user behavior logs.
- FIG. 5 is an explanatory diagram illustrating an example of extracting topical behavior from the difference between the past behavior tendency of the user and the latest behavior.
- FIG. 6 is an explanatory diagram illustrating an example in which topical behavior is extracted from a change in user behavior tendency.
- FIG. 7 is an explanatory diagram illustrating an example of extracting a behavior tendency different from the past and the latest behavior tendency in a certain section in the past behavior tendency of the user.
- the topic generation by the self-behavior analysis is performed by the self-behavior analysis unit 112 using only information on the behavior of the user who wants to provide the topic. For example, as the information related to the user's action, an action log representing the user's operation action recognized by analyzing the sensor information in time series is used. For example, consider an action log indicating the user's action content on a certain day as shown in the upper side of FIG.
- the analysis of the user's self-behavior is performed by comparing data in some sections and data in other sections in such an action log.
- for example, the user's past behavior may be compared with the latest behavior to determine trends or changes in the user's behavior (Comparison 1).
- the behavior in the same section on the previous day may be compared with the latest behavior to determine the tendency or change in the user's behavior (Comparison 2).
- the self-behavior analysis unit 112 abstracts the result of comparing the behaviors in different sections of the behavior log in order to extract the trend or change of the behavior that is likely to become a topic.
- examples of behavior features obtained by abstracting behavior comparison results include the following. Frequency: e.g., walking more often than before. Duration: e.g., walking for a longer time than before. Interval: e.g., walking again after having stopped for a while. Difference: e.g., slower this time than the previous time. Total: e.g., walking more in the current time zone than in a certain other time zone. Protrusion: e.g., doing far more in the current time zone than in recent days. Similarity: e.g., the pattern in the current time zone is the same as last time. Change: e.g., recently started running often. Period: e.g., running every spring.
- for example, regarding the behavior before the user arrives at a certain stay position, it may usually be detected that the user arrives at the stay position by train and walking, whereas in the most recent actions the user is detected as having arrived by train and running.
- when a probability distribution is created for the frequency or number of times of the user's usual actions in a predetermined time zone and at a predetermined position, a normal distribution such as that shown in the lower side of FIG. 5 is obtained, for example.
- the behavior in a predetermined range in the center can be said to be a daily behavior that does not become a topic because the behavior is frequently performed.
- the behavior outside the daily range can be said to be an unusual range of behavior different from usual.
- in this case, even if the average value of each distribution falls within the standard range of the lifetime distribution, that is, even if the value itself does not deviate from the comparison information, a change in the behavior tendency can still be detected.
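The distribution-based comparison described here can be sketched with a simple standard-deviation test. The 2-sigma cut-off separating "daily" from "unusual" behavior is an assumption for illustration, not a value from the patent.

```python
# Sketch: model the usual count of an action in a given time zone as a
# normal distribution; observations outside the "daily range" (here,
# more than two standard deviations from the mean) become topic
# candidates because they differ from usual behavior.

import statistics

def is_unusual(history, latest, n_sigma=2.0):
    """True if the latest count falls outside the daily range."""
    mean = statistics.mean(history)
    sd = statistics.pstdev(history)
    if sd == 0:
        # no variation at all: anything different from the mean is unusual
        return latest != mean
    return abs(latest - mean) > n_sigma * sd

# daily run counts over past weeks; today the user ran 9 times
history = [2, 3, 2, 4, 3, 2, 3]
```

For example, `is_unusual(history, 9)` flags the unusual day, while `is_unusual(history, 3)` does not, since 3 runs is within the daily range.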
- such a tendency of behavior change is likely to become a topic. For example, when a topic related to running comes up and there is a change tendency such as jogging time getting longer, this may be used as a topic.
- the comparison information is not limited to behavior tendencies over the entire lifetime; behavior tendencies recognized from the user's behavior over a period sufficiently longer than the behavior tendency section being compared may also be used as the comparison information.
- the past sections to be compared with the comparison information do not necessarily have the same length of time. For example, if the sections are classified according to life stages, they are easily provided as topics. For example, a section such as a period of childhood, elementary school, junior high school, high school, university, a period after becoming a member of society, or an event such as transfer or marriage may be set as a section.
- in this way, the self-behavior analysis unit 112 extracts the user's behavior tendencies and changes in behavior as content that is likely to become a topic.
- FIG. 8 is an explanatory diagram illustrating an example in which topical behavior is extracted from a difference in behavior frequency common to users.
- FIG. 9 is an explanatory diagram illustrating an example in which topical behavior is extracted from a shift in information regarding behavior common to users.
- FIG. 10 is an explanatory diagram illustrating another example in which topical behavior is extracted from a shift in information regarding behavior common to users.
- FIG. 11 is an explanatory diagram illustrating an example in which a topic is extracted from a deviation of the behavior tendency of behavior common to users from the public.
- topic generation by self-behavior analysis does not take the conversation partner's behavior into account, so the topic develops only if the partner has also performed, or is interested in, the behavior being talked about; if not, the topic fizzles out.
- by searching for common actions that the user and the conversation partner have both performed, it is possible to generate topics that the partner is more interested in.
- the topic generation based on such common behavior analysis is performed by the common behavior analysis unit 114.
- the frequency of a user's action is considered to be related to the user's proficiency in, and strength of interest in, that action. Accordingly, actions performed more frequently by the conversation partner than by the user who provides the topic make better generated topics. That is, the common behavior analysis unit 114 generates topics so as to raise subjects in the conversation partner's field of expertise. Conversely, if the user raises as a topic a behavior that the user performs more frequently than the partner, it may come across as showing off knowledge, so it is better to refrain from using such behavior as a topic for livening up the place.
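This frequency heuristic can be sketched directly. The action names and counts below are invented for illustration.

```python
# Sketch of common behavior analysis unit 114's frequency heuristic:
# among common actions, prefer as topics those the conversation partner
# performs more often than the user (the partner's field of expertise).

def pick_partner_topics(user_freq, partner_freq):
    """Common actions the partner does more often than the user."""
    common = set(user_freq) & set(partner_freq)
    return sorted(a for a in common if partner_freq[a] > user_freq[a])

# hypothetical per-action frequency counts
user_freq = {"run": 10, "climb": 1, "swim": 3}
partner_freq = {"run": 2, "climb": 8, "swim": 3}

topics = pick_partner_topics(user_freq, partner_freq)
```

Here "climb" is selected because the partner climbs far more often than the user, while "run" is excluded for the opposite reason.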
- a difference in moving means up to a certain stay position can be used for topic generation.
- for example, when the means of transportation to Hokkaido differ between the users, the unexpectedness of the chosen means may become a topic.
- suppose that, in the distribution of means of transportation to Hokkaido within a certain group, more than half of the people travel to Hokkaido by airplane, and that the conversation partner also travels to Hokkaido by airplane.
- if the user who provides the topic travels by a different means, the difference in the means of transportation may be used as the topic.
- behavior common to the user and the conversation partner may indicate a field of particular interest, or content in which both have strong interest. Therefore, the topic presented to the user who provides the topic may be a deep topic that goes straight to the core, rather than an exploratory topic that probes the partner's reaction. In this way, when a behavior common among the users deviates from the general public, presenting the user with a core topic related to that behavior can provide a certain sense of solidarity between the user and the partner, generate conversation, and liven up the conversation.
- FIG. 12 is an explanatory diagram for explaining the "stay position" and the "transition between stay positions" obtained as a result of the user's action recognition. From time-series position information, "stay positions" such as "stay 1" and "stay 2" shown in FIG. 12, and "transitions between stay positions" such as "transition from stay 1 to stay 2", can be distinguished. A profile representing the user's hobbies, tendencies, preferences, and the like can be generated using this position information. For example, the purpose for which the person came to the position and their hobby preferences can be known in detail from the "stay position" and the action recognition result. Also, for example, the person's behavior tendency can be known from the history of the moving means in the "transition between stay positions".
- the profile analysis unit 116 generates a profile representing such a user's hobbies, trends, preferences, and the like, matches the profile of each user who has a conversation using the generated profile, and generates a common topic.
- FIG. 13 is an explanatory diagram for explaining processing for generating a profile representing the purpose of the user at the stay position from the stay position and the action recognition result.
- FIG. 14 is an explanatory diagram showing an example of the user behavior information table 142 acquired by the profile generation process of FIG. 13. FIGS. 15 to 17 are explanatory diagrams showing examples of user profiles generated from the stay position and the action recognition result.
- FIG. 18 is an explanatory diagram illustrating an example of a profile generated from the application activation history and the action recognition result.
- the profile can be generated based on the stay position and action recognition result, for example.
- first, the latitude and longitude, which are the user's position information, are converted into place name information and a place category.
- the place name is a name such as a place name or a facility name
- the place category is information representing the genre of the place.
- the profile analysis unit 116 performs a place name conversion process and a place category conversion process based on the latitude and longitude with reference to a place DB (not shown) that holds the correspondence between the latitude and longitude, the place name, and the place category.
- the location DB may be provided in the server 10 provided with the information processing apparatus 100, or may be provided in an external server connected to the information processing apparatus 100 so as to be communicable.
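The latitude/longitude conversion can be sketched as a lookup against a toy place DB. The DB contents and the rounding-based key scheme are assumptions for illustration; a real implementation would use spatial indexing.

```python
# Sketch of the place name / place category conversion: a toy place DB
# maps a (rounded) latitude-longitude pair to (place name, category).

PLACE_DB = {
    (43.06, 141.35): ("Sapporo Station", "station"),
    (42.86, 140.69): ("Niseko Resort", "ski resort"),
}

def to_place(lat, lon):
    """Convert latitude/longitude into (place name, place category)."""
    key = (round(lat, 2), round(lon, 2))  # crude spatial bucketing
    return PLACE_DB.get(key, ("unknown", "unknown"))

name, category = to_place(42.8601, 140.6874)
```

Both results would then be recorded in the profile DB 140 as user behavior information, as the description states.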
- the profile analysis unit 116 records the acquired place name and place category in the profile DB 140 as user behavior information.
- the profile analysis unit 116 acquires the action recognition result obtained by performing the action recognition process using the sensor information as the user action.
- action recognition processing, which identifies user actions such as walking and running, is performed using technology for recognizing user behavior from sensor information acquired with various sensing technologies.
- the action recognition process may be performed in the server 10 including the information processing apparatus 100 or may be performed in an external server connected to the information processing apparatus 100 so as to be communicable.
- the profile analysis unit 116 records the acquired action recognition result in the profile DB 140 as the user action.
- when the place name, place category, and user behavior information are acquired, the profile analysis unit 116 generates a profile based on these pieces of information.
- the profile analysis unit 116 acquires the user's purpose at the location as a profile depending on how the user is moving at the location. Further, the profile category to which the profile belongs is specified based on the relationship between the preset profile and the profile category.
- FIG. 14 shows an example of the user behavior information table 142 recorded in the profile DB 140.
- the user behavior information table 142 includes, for example, a latitude and longitude 142a, a place name 142b, a place category 142c, action information 142d, a profile 142e, and a profile category 142f.
- Results analyzed by the profile analysis unit 116 based on the place name 142b, the place category 142c, and the behavior information 142d are recorded as a profile 142e and a profile category 142f.
- the profile analysis unit 116 analyzes that “snow sports” were performed. “Snow Sports” belongs to the category of “Active Sports”.
- the profile analysis unit 116 acquires the user's purpose at the position based on the stay position and the action recognition result.
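The purpose inference from place category plus recognized action can be sketched as a rule table. The rules and category mapping below are invented examples consistent with the "snow sports" / "active sports" case in the description.

```python
# Sketch of how profile analysis unit 116 might infer the user's
# purpose (profile) at a stay position from the place category and the
# action recognition result, then map it to a profile category.

PURPOSE_RULES = {
    ("ski resort", "gliding"): "snow sports",
    ("park", "running"): "jogging",
}
PROFILE_CATEGORY = {
    "snow sports": "active sports",
    "jogging": "active sports",
}

def infer_profile(place_category, action):
    """Return (profile, profile category) for a stay observation."""
    profile = PURPOSE_RULES.get((place_category, action), "unknown")
    return profile, PROFILE_CATEGORY.get(profile, "unknown")

profile, category = infer_profile("ski resort", "gliding")
```

Each inferred pair would be recorded alongside the place name and action in the user behavior information table 142.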
- (Profile generation based on transition of stay positions) A profile can also be generated based on, for example, the history of the moving means in transitions between stay positions.
- the profile obtained by the profile generation process based on the transition of the staying position represents the behavior tendency of the person.
- From the latitude and longitude, which are the position information, points where the user stayed at the same location for a while are gathered and identified as staying positions. It is then possible to determine, from the history of the moving means at each transition, how the user moved from one staying position to the next. At this time, a characteristic movement seen when the transitions between staying positions are viewed in time series, such as reciprocating between two staying positions or circulating among several staying positions, may be recognized as a characteristic action.
- The processing that obtains the transitions between staying positions, the names of the staying positions, and the moving means may be performed in the server 10 including the information processing apparatus 100, or in an external server communicably connected to the information processing apparatus 100.
- the profile analysis unit 116 acquires these pieces of information as processing results. Further, the profile analysis unit 116 may acquire time information such as season and time.
- the profile analysis unit 116 analyzes the stay position transition and moving means, identifies the behavior tendency, and sets it as the user profile. Further, the profile category to which the profile belongs is specified based on the relationship between the preset profile and the profile category. The profile analysis unit 116 records these pieces of information in the profile DB 140.
- The behavior tendency tables 144A to 144C include, for example, stay position transition (latitude and longitude) 144a, stay place transition (place name) 144b, moving means 144c, characteristic behavior 144d, time information 144e, profile 144f, and profile category 144g.
- The results analyzed based on the stay position transition 144a and the moving means 144c are recorded as a profile 144f and a profile category 144g.
- The profile analysis unit 116 derives the profile "a person who uses a train for commuting", which belongs to the category "train commuter group".
- the user's behavior tendency such as lifestyle can be acquired.
- It can be seen that user A has a behavior tendency of commuting by train on weekdays and driving a car on holidays.
- User B has a behavior tendency of commuting by train on weekdays and taking a detour on the way home from work.
- User C has a behavior tendency of using a car for commuting and work on weekdays and using a train when going out on holidays.
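The derivation of such a tendency from the stay-position transitions and moving means could be sketched as below. This is a minimal illustrative reduction, assuming a simplified transition record with day type, origin, destination, and transport; the field names and the rule itself are hypothetical, not from the source.

```python
from collections import Counter

def commute_tendency(transitions):
    """transitions: list of dicts with 'day', 'from', 'to', 'transport' keys.
    Returns a tendency string based on the dominant transport used for
    weekday home->work transitions, or None if no such transitions exist."""
    commute = [t["transport"] for t in transitions
               if t["day"] == "weekday" and t["from"] == "home" and t["to"] == "work"]
    if not commute:
        return None
    # Pick the most frequent moving means across the recorded commutes.
    transport, _ = Counter(commute).most_common(1)[0]
    return f"a person who uses a {transport} for commuting"
```

Feeding such a function with user B's records would reproduce a profile like "a person who uses a train for commuting", which the profile analysis unit 116 would then map to its preset profile category.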
- FIG. 18 shows an example of a user behavior information table 146 that records a profile obtained based on an application activation history and a behavior recognition result.
- the user behavior information table 146 includes, for example, an application activation history 146a, behavior information 146b, characteristic behavior 146c, time information 146d, profile 146e, and profile category 146f.
- the results analyzed by the application activation history 146a and the behavior information 146b are recorded as a profile 146e and a profile category 146f.
- the profile analysis unit 116 analyzes that “jogging” is being performed. “Jogging” belongs to the category of “active sports”.
- the profile analysis unit 116 can specify in detail the action being performed by the user based on the application activation history and the action recognition result.
- FIG. 19 is an explanatory diagram for explaining processing for determining a topic to be presented by profile matching.
- FIG. 20 is an explanatory diagram illustrating a relationship between closeness between users and disclosure information.
- the profile analysis unit 116 uses the profile acquired as described above to match the profile of the user who is having a conversation and generates an appropriate topic.
- a topic is provided using latitude / longitude information that is the user's position information, it is common to process each user so as to provide a topic related to the same position information.
- the position information is the same, for example, when the user's purpose at the position is different, the topic regarding the position information is unlikely to be a topic common to the user, and may not be appropriate as the content to be provided. Therefore, in the present embodiment, it is possible to provide a more appropriate topic by using a profile of a user having a conversation and specifying a common profile.
- the profile analysis unit 116 matches the profiles of the user A and the user B and generates a topic.
- the profiles of user A and user B have a common content “surfing”. Although the places where the users A and B surfed are different, a common hobby “surfing” can be found by matching the profiles. Special fields such as hobbies and content with strong interests and interests create a kind of solidarity between the user and the other party, making it easier to excite conversation. In this way, by matching the profiles, it is possible to provide an appropriate topic to the user.
- The profile analysis unit 116 matches the profile of user A against those of user B and user C, and recommends a user whose profile or profile category matches as a conversation partner.
- User B, who shares the contents "surfing" and "active sports" with user A, is recommended as the conversation partner. Thereby, a person considered to have many common topics can be recommended to user A.
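The matching and recommendation step can be sketched as follows. This is a simplified reading of the described behavior: profiles are treated as plain sets of strings, and the candidate sharing the most profiles is recommended. The concrete profile values are illustrative.

```python
def match_profiles(profiles_a, profiles_b):
    """Return the set of profiles (or profile categories) two users share."""
    return set(profiles_a) & set(profiles_b)

def recommend_partner(user_profiles, candidates):
    """candidates: {name: profiles}. Recommend the candidate who shares the
    most profiles or profile categories with the user, cf. profile analysis
    unit 116 recommending a conversation partner."""
    best, best_common = None, set()
    for name, profiles in candidates.items():
        common = match_profiles(user_profiles, profiles)
        if len(common) > len(best_common):
            best, best_common = name, common
    return best, best_common
```

With user A holding {"surfing", "active sports"}, a candidate B who also has both would be recommended over a candidate C with no overlap, mirroring the example in the text.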
- Place names, place categories, and transitions between staying positions are information that reveals personal characteristics, and a user may not want them to be disclosed. For example, a user may not want to reveal where he or she often goes. Even in such a case, since the granularity of profiles and profile categories is coarse and it is difficult to identify an individual from this information alone, they are appropriate as information to use when generating a topic. Therefore, in a situation where the user does not want to disclose much information to the other party, the profile category should be used as the matching information.
- the profile analysis unit 116 may perform control so that the information is disclosed step by step. For example, specific information may be disclosed as the familiarity with the speaking partner increases. As shown in FIG. 20, while the intimacy of the talking user is low, profile categories and profiles are disclosed, and a topic is generated based on these information. When the intimacy becomes high, location categories, location transitions, and location names are disclosed in stages, so that topics that are more personal can be generated. Such a stepwise disclosure of information makes it possible to present a natural topic that matches the intimacy between users, taking into account the privacy of each user.
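The stepwise disclosure in FIG. 20 can be sketched as a mapping from intimacy to disclosed fields. The granularity ordering (profile category and profile first, then place category, place transition, and place name) follows the text; the numeric intimacy thresholds are assumptions for illustration only.

```python
# Assumed intimacy thresholds; the ordering of granularity follows FIG. 20.
DISCLOSURE_LEVELS = [
    (0, ["profile category", "profile"]),   # disclosed even at low intimacy
    (2, ["place category"]),
    (3, ["place transition"]),
    (4, ["place name"]),                    # most personal, disclosed last
]

def disclosed_information(intimacy):
    """Return the list of information fields disclosed at a given intimacy level."""
    items = []
    for threshold, fields in DISCLOSURE_LEVELS:
        if intimacy >= threshold:
            items.extend(fields)
    return items
```

At low intimacy only the coarse profile information is used for topic generation; as intimacy grows, progressively more personal fields become available, enabling more personal topics.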
- the topic generation processing in step S110 has been described above. Note that the topic generation processing executed in the topic providing system 1 according to the present embodiment is not limited to the above example, and the topic may be generated by another method.
- Changes in the conversation partner, detected using sensor information, may also be provided as a topic.
- For example, a recognizer may compare an image of the partner from the previous meeting with a current image of the partner and highlight the difference so that the user can notice the change in the partner.
- If the partner is the type of person who has difficulty noticing changes, the user's own action log may be given to the partner as a hint so that the partner notices the change.
- For example, a hint may be given to the partner about something the user wants noticed, such as a change of hairstyle.
- FIG. 21 is a conceptual diagram of a process of conversation communication between users.
- Conversational communication unfolds over time; when a user becomes interested in the other person, the user starts to listen to the other person's story and to ask questions. Furthermore, when the user wants the other party to be interested in him or her, the user wants the other party to hear his or her story and explains it. That is, as shown in FIG. 21, conversational communication flows easily in the states of "being interested in the other party" and "wanting the other party to be interested", and as interest in the other party increases, the conversation carries on even without receiving a topic from the topic providing system 1. This state can be said to be a stage in which both parties are interested in each other.
- Therefore, the topic providing system 1 provides appropriate topics and supports keeping the strength of the participants' interest in each other in a virtuous cycle of conversational communication.
- “the degree of excitement of the place” is introduced as an evaluation index for favorably continuing communication.
- “Venue” represents a communication event that is temporarily formed.
- FIG. 22 shows a concept of a communication place and participants in the communication.
- a temporary communication place is formed by three participants A, B, and C.
- the degree of excitement varies depending on the degree of excitement of each participant.
- The degree of excitement H_P of the field is expressed by the following equation (1).
- env(t) is an environment term that represents the cooling of the field over time.
- The degree of excitement H_P of the field varies depending on the excitement levels H_A, H_B, and H_C of the individual participants.
- Suppose that participant A is one of the parties concerned, and participant C is a facilitator acting as the organizer. Further, assume that participant A has a liking for participant B.
- Participant A's excitement level H_A can be expressed by the following equation (2), using the interest level I_AB from participant A toward participant B (indicating that A favors B), the interest level I_BA from participant B toward participant A, and the excitement level H_P(t−1) of the field at time t−1.
- The hat symbol (^) denotes an estimated value including an error.
- W_AP is a weighting coefficient.
- The excitement level H_B of participant B is similarly expressed by the following equation (3).
- The excitement level H_C of participant C, the facilitator, is expressed by the following equation (4).
- The coefficients a, b, and W_CP change depending on the motivation of participant C. For example, when participant C contributes only to the excitement of the place, the interests of participants A and B are irrelevant, so the coefficients a and b are zero.
- Predetermined values are set for the coefficients a, b, and W_CP.
- The interest level I_BA of participant B toward participant A in equations (2) to (4) is expressed by the following equation (5).
- The terms related to conversational communication are the topic Topic_AP that participant A puts into the field and its entry timing TopicTiming_AP.
- The remaining term represents elements other than conversational communication, such as appearance.
- The interest level of participant A toward participant B as estimated by participant B (denoted by a hat over I_AB in equation (6)) is expressed by the following equation (6).
- Reaction_AB is the reaction of participant A to the topic that participant B put into the field.
- the topic participation rate can be expressed based on, for example, the number and frequency of remarks and discussions.
- Equation (1) can be rewritten as the following equation (7).
- Topic_XP represents the topic that each participant X puts into the field.
- TopicTiming_XP represents the timing of the entered topic.
- Functions including Reaction are collected as the function Topic_P.
- W is a weighting coefficient.
- the topic providing system 1 uses the above values as evaluation values for topic selection. These values are calculated by the excitement degree calculation unit 122 of the topic proposal unit 120 of the information processing apparatus 100. The degree of excitement calculation unit 122 outputs the calculated value to the topic extraction unit 124.
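A toy, simplified reading of equations (1) through (7) is sketched below: the field excitement H_P decays through the environment term env(t) and rises with weighted topic contributions, while each participant's excitement combines mutual interest and the previous field state. The coefficient values, decay rate, and exact functional forms are illustrative assumptions, not the patent's formulas.

```python
def env(h_prev, decay=0.9):
    """Environment term env(t): the field cools over time (assumed linear decay)."""
    return decay * h_prev

def participant_excitement(interest_out, interest_in_est, h_prev, w=0.5):
    """Cf. equation (2): a participant's excitement combines the interest he or
    she holds toward the partner, the (estimated) interest received from the
    partner, and the field excitement at the previous time step, weighted by w."""
    return interest_out + interest_in_est + w * h_prev

def field_excitement(h_prev, contributions, weights=None):
    """Cf. equations (1)/(7): env(t) plus weighted topic contributions from
    each participant (each contribution standing in for a Topic_P term)."""
    if weights is None:
        weights = [1.0] * len(contributions)
    return env(h_prev) + sum(w * c for w, c in zip(weights, contributions))
```

In this sketch, the excitement degree calculation unit 122 would call something like `field_excitement` each time a topic is put into the place and pass the result to the topic extraction unit 124.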
- FIG. 23 is a flowchart showing the topic selection method according to this embodiment.
- The topic extraction unit 124 extracts the topic to be presented to the user from the topics generated by the topic generation unit 110, using the evaluation value calculated by the excitement degree calculation unit 122.
- the processing shown in FIG. 23 is executed, for example, at a timing when a topic is put into the place.
- the topic extraction unit 124 first acquires the reaction information of the talking partner (S121).
- The reaction information is information indicating the reaction of the conversation partner to the input topic, and includes, for example, smiles, laughter, nods, and back-channel responses. These pieces of reaction information are assumed to be obtainable at any time as various kinds of sensor information. Specifically, they include the frequency of the partner's back-channel responses such as "Yeah", "Really?", "Interesting", "Huh", and "Oh".
- The topic extraction unit 124 acquires the reaction information of the conversation partner for the input topic and calculates the degree of excitement H_P of the field at this time (S123). The degree of excitement H_P of the field can be calculated by equation (7).
- After calculating the degree of excitement H_P of the field, the topic extraction unit 124 determines whether the field has been livened up by the introduction of this topic (S125).
- The determination in step S125 is made based on whether the degree of excitement H_P of the field exceeds a predetermined threshold (hereinafter also referred to as the "excitement threshold").
- The excitement threshold is set appropriately according to the situation.
- If the degree of excitement H_P of the field is greater than or equal to the excitement threshold in step S125, the topic extraction unit 124 determines that the field has been livened up by the entered topic and presents related information on the current topic to the participants (S127). On the other hand, if the degree of excitement H_P of the field is smaller than the excitement threshold, the topic extraction unit 124 determines that the field was not livened up by the entered topic, and adds new topics to the topic options presented to the participants so that the topic can be switched from the current one (S129).
- topics corresponding to the excitement of the place are extracted from various topics generated by the topic generation unit 110 and presented to the participants.
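The branch in steps S121 through S129 can be sketched as below. The threshold value of 40 is borrowed from the example given later for FIG. 25 and is an assumption here; the function and field names are hypothetical.

```python
EXCITEMENT_THRESHOLD = 40  # assumed value of the "excitement threshold"

def select_next_topics(h_field, related, new_topics):
    """Mirror of steps S125-S129: if the field is lively, keep presenting
    related information on the current topic (S127); otherwise add new
    topics to the options so that the topic can be switched (S129)."""
    if h_field >= EXCITEMENT_THRESHOLD:
        return {"mode": "continue", "topics": related}
    return {"mode": "switch", "topics": related + new_topics}
```

The topic extraction unit 124 would invoke such a decision each time a topic is put into the place, after the excitement degree has been computed from the partner's reaction information (S121 and S123).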
- the topic duration may be further taken into account in the evaluation in step S125.
- the expectation of the excitement of the place due to the topic input may be presented to the participant as an expectation degree. Thereby, the participant can select the topic to be introduced while assuming the size of the excitement of the place at the time of introduction for each topic.
- FIG. 24 shows an example of the distribution of expectations regarding the topic.
- A topic with the tendency of distribution 2 on the lower side of FIG. 24 is, specifically, a general topic such as the weather: one that does not have a very strong enlivening effect but has a low risk of falling flat.
- A topic with the tendency of distribution 1 on the lower side of FIG. 24, in which the spread of the distribution is large and either a positive or a negative effect can be expected, is, for example, a specialized topic.
- Such a topic can greatly liven up the place, but it can also fall badly flat.
- The expectation for related information on topics in which the conversation partner has shown strong interest is also generally shifted to the right. That is, while distribution 2, the distribution of a general topic, shows a tendency to reliably liven up the field even if not by much, related information on a topic in which the partner has shown high interest livens up the field even more reliably, so its distribution 4 can be considered shifted to the right as a whole.
- The degree of expectation of related information shifts to the right as its correlation with the presented topic increases. Note that if the same topic is repeated many times, the partner's interest may decrease, in which case the distribution may shift to the left as a whole.
- the calculation of the degree of expectation that expresses the expectation of the excitement of the place is performed by the expectation degree calculation unit 126 of the topic proposal unit 120.
- the expectation degree calculation unit 126 outputs the calculated result to the topic presentation processing unit 130.
- FIG. 25 shows a relationship between the introduction of a topic and the degree of excitement of the place.
- As described above, the degree of excitement H_P of the field is calculated based on equation (1).
- In equation (1), the environment term env(t) decreases the degree of excitement in proportion to time, while the discretely input topics raise it. Therefore, as shown in FIG. 25, the time course of the degree of excitement is such that it rises temporarily when a topic is introduced and then declines as time passes.
- In the example shown in FIG. 25, the degree of excitement exceeds 40 at the timing when the third topic is introduced.
- When the degree of excitement enters such a virtuous cycle, it is maintained or increased despite the decline caused by the environment term env(t), and a good communication place can be maintained without support by the present system.
- That is, even if a further topic is not introduced, the place of communication will naturally remain lively.
- the topic providing system 1 aims to create a situation in which good communication can be achieved even if a topic is not input by appropriately inputting the topic.
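The time course in FIG. 25 can be simulated with a minimal sketch: excitement decays between topics through the environment term and jumps discretely when a topic is introduced. The decay rate, step counts, and boost values are illustrative assumptions, not numbers from the source.

```python
def simulate(h0, topic_boosts, decay=0.85, steps_between=3):
    """Simulate the time course of field excitement: env(t) cools the field
    between topic inputs, and each discrete topic input raises it.
    Returns the excitement history sampled at each topic introduction."""
    h, history = h0, [h0]
    for boost in topic_boosts:
        for _ in range(steps_between):
            h *= decay          # environment term: the field cools between topics
        h += boost              # a discrete topic input raises the excitement
        history.append(h)
    return history
```

With well-timed inputs, the sampled excitement climbs from topic to topic, illustrating how appropriately introduced topics can carry the place past the threshold at which it sustains itself.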
- The support that the topic providing system 1 gives to participants in the communication place may be changed according to the situation. For example, when the participant merely wants to take an interest in the other party, support is given only to the extent that this interest can arise. When the participant wants the other party to take an interest in him or her, support is given until the place is livened up to that point.
- When a participant wants to take an interest in the other party, the topic providing system 1 supports the participants until the degree of excitement of the place reaches a target level Pa (for example, an excitement of 40 to 100, the stage at which the participants are interested). When a participant wants the other party to take an interest in him or her, the topic providing system 1 supports the participants until the degree of excitement reaches a target level Pb (for example, an excitement of 100), the stage at which the participants want to interest each other.
- the topic proposal unit 120 may analyze the situation and weight the topics to be proposed. Thereby, it becomes possible to propose a topic more suitable for the situation.
- With respect to a common event shared by the participants, the situation of the communication place can be roughly divided into the following three: before the common event (state A), during the common event (state B), and after the common event (state C).
- FIG. 28 shows a specific situation example.
- This situation is one in which a conversation is held in order to avoid silence; since it is after the end of the common event, the welcome party, it corresponds to state C described above.
- A matchmaking party is a party in which people seeking encounters participate, and there is usually a facilitator acting as organizer who manages the place. Although some people meet for the first time at such a party, the motivation to become close is considered high.
- This situation corresponds to state B described above.
- By applying weighting to the direction of the topics to be presented, whether a topic should be continued, added, or changed can be adjusted.
- FIG. 29 shows the directionality of the topic for each of the states A to C described above.
- State A, before the common event, is a phase of searching for topics that attract each other's interest. Therefore, rather than continuing a topic, it is better to add new topics to the proposed options in order to search for a topic the other party is interested in, and it is also effective to encourage a change of topic. If an appropriate topic is found, the system then tries to continue that topic.
- In such a case, the topic providing system 1 presents related information so as to continue the topic.
- Here too, the topic providing system 1 preferably functions to present related information and continue the topic.
- the topic providing system 1 can provide a more appropriate topic by knowing the state of the communication place.
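The state-dependent weighting of topic direction can be sketched as below. The numeric weights are illustrative assumptions; the text only specifies the qualitative tendency (exploration before the event, continuation during and after it).

```python
# Assumed weights for the topic direction in states A-C:
# before the common event (A), exploration via adding or changing topics is
# favoured; during (B) and after (C) the event, continuation is favoured.
STATE_WEIGHTS = {
    "A": {"continue": 0.2, "add": 0.5, "change": 0.3},
    "B": {"continue": 0.6, "add": 0.3, "change": 0.1},
    "C": {"continue": 0.7, "add": 0.2, "change": 0.1},
}

def preferred_direction(state):
    """Return the most heavily weighted topic direction for a given state."""
    weights = STATE_WEIGHTS[state]
    return max(weights, key=weights.get)
```

The topic proposal unit 120 could use such weights to rank continuation candidates, related information, and fresh topics differently depending on where the place stands relative to the common event.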
- the topic presentation processing unit 130 performs processing for providing the user with the topic (S130).
- the topic presentation processing unit 130 presents a topic to the person or a plurality of people sharing the topic, depending on the situation at the spot.
- the topic providing system 1 presents a topic to an individual.
- For example, as a visual presentation method, a topic may be presented on the display panel of a glass-type wearable terminal, or a device held by an individual or an LED in the space may be made to emit light.
- As an auditory presentation method, for example, an earphone or an open speaker can be used, and as a tactile presentation method, for example, vibration or temperature can be used.
- FIG. 30 shows an example in which a topic is presented on the display panel of a glass-type wearable terminal.
- FIG. 30 is an example in which a topic is displayed in a card format on a display panel of a glass-type wearable terminal worn by a user.
- On the display panel, a card Tnow of the current topic, a card Tr of related information on the current topic, and new topic cards Tnew-A, Tnew-B, and Tnew-C are displayed.
- the depths of the related information card Tr, the new topic cards Tnew-A, Tnew-B, and Tnew-C indicate the spread of the related information related to the topic.
- the degree of excitement of the place, the degree of expectation at the time of introducing the presented topic, and the like may be presented to the user by a display panel or other presentation methods.
- the user can provide a topic with reference to the presented topic card.
- the topic selected by the user becomes a new topic that is currently in progress, and after the degree of excitement of the place due to the topic input is calculated, new related information and new topic are displayed on the display panel.
- Topic presentation to a plurality of people Next, a case where the topic providing system 1 presents a topic to a plurality of people participating in a communication place will be described.
- a method for presenting a topic to a plurality of people for example, it is conceivable that a topic is presented on a projection surface such as a table top by a projector as a visual presentation method.
- an auditory presentation method for example, a speaker or the like can be used, and as a tactile presentation method, for example, temperature adjustment of an air conditioner or a table can be performed.
- an entertainment element that excites the place including the presentation of the topic.
- FIG. 31 shows an example in which a topic is presented as a topic card on the table top by a projector.
- A card of the topic currently in progress, related-information cards for that topic, and cards of new topics are displayed on the table top.
- Users A and B, who are participating in the communication, can put a topic into play by swiping a related-information card or a new-topic card displayed at hand toward the center of the place.
- The topics on the cards held by users A and B are mutually visible. Making the candidate topics to be put into the place visible to each other in this way adds an element of entertainment.
- information such as the degree of excitement of this place, the name of the user who controls the topic, and the degree of contribution to the excitement of the topic may be presented to the participants.
- The transition of the degree of excitement calculated by the excitement degree calculation unit 122 may be presented.
- the transition of the degree of excitement of the field may be displayed in a graph.
- the topic proposer may be displayed when the speaker who has input the topic can be recognized.
- the topic may be displayed as a tag.
- The displayed topic may be recognized automatically based on sensor information, or may be input manually by a participant or the like.
- The transition of the degree of excitement may be a shared display that all participants can see, or, if a facilitator is present at the place, it may be displayed privately only to the facilitator. The facilitator can take action to further liven up the place with reference to this information.
- the degree of contribution to the topic may be presented for each participant.
- Such information may also be a shared display that all participants can view, or, if a facilitator is present at the place, it may be displayed privately only to the facilitator. While looking at the degrees of contribution to the topic, the facilitator can take action to casually direct the conversation to a person who is not speaking.
- The topic presentation processing unit 130 of the topic providing system 1 may be provided with a function of praising a person when a topic that livens up the place is presented. For example, "Good!" may be displayed on the display panel of a glass-type wearable terminal, or a vibration like being patted may be reproduced. The user can also be praised by displaying a smile on an avatar shown on the display panel or by lighting an LED in green.
- the above-mentioned graph display and contribution display itself may become a topic of conversation.
- An effect specialized for enlivening the place may also be produced here.
- For example, to emphasize that a topic has fallen flat (gone "cold"), the sound of the air conditioning may be raised or the air conditioning may be turned on.
- The atmosphere of the place may also be staged with a spotlight using a light source.
- When the place has a warm atmosphere, warm-colored lighting may be used, and the lighting may be darkened when a topic has fallen flat.
- As an effect using sound, for example, during a serious conversation, the participants in the place may be made to hear their own heartbeat or a slightly faster beat.
- Note that the topic presentation method of the topic providing system 1 is not limited to the examples shown in FIG. 30 and FIG. 31.
- For example, a topic may be presented to the user by an icon display. Specifically, to indicate a topic related to bicycles, an icon with a picture of a bicycle may be displayed, and to indicate that a past episode is the topic, a photograph taken at the event may be displayed.
- topics may be presented with arrows.
- For example, a topic may be presented by displaying an arrow 22 pointing at the partner's hairstyle. In this way, if there is information that can serve as a topic in what is shown on the display unit, the topic can be presented by pointing at it with an arrow.
- The topics to be presented need not be displayed in a uniform row as shown in FIG. 30.
- For example, the topics suitable for each conversation partner may be displayed in association with that person. This makes it possible to visually recognize which topic suits which partner. A topic common to all the conversation partners may be displayed, for example, in the center of the screen.
- the topic does not have to be presented at all times, and the card may be automatically presented on the glass-type wearable terminal or table top only when the topic should be presented. Alternatively, the topic may be presented only when a user instruction is received. For example, the topic may be presented on the glass-type wearable terminal or the table top only when the user focuses on a specific icon for displaying the topic or moves the cursor with the controller.
- the topic presentation destination is not limited to a glass-type wearable terminal or a table top.
- a list of extracted topic candidates may be displayed on a personally-owned smartphone, tablet terminal, band, or other device.
- the user may not be aware of the topic presentation. Therefore, for example, the user may be notified when the topic preparation is completed, and the user may be guided to view the presented topic.
- Voice notification and display notification may also be combined, for example by conveying only an index by voice and acquiring the list as an image.
- the card group may be expanded when the user stares at his / her own card, and the card group may be aggregated when the user removes his / her eyes from the card group.
- Cards 31, 33, 35, and 37 are displayed on a panel 30.
- Each card may be turned over behind the panel 30 and hidden.
- the expansion, aggregation, hidden display, etc. of the card group may be performed based on the user's line of sight or may be operated using a controller or the like.
- a virtual moderator may be set to emphasize game characteristics. For example, a system may be used in which each player (participant) submits a topic to a virtual moderator, and the moderator selects a topic submitted from each. The moderator may choose topics by majority or random. Of course, this moderator may be replaced with a facilitator and selected manually.
- The topic providing system 1 needs to know which topic has been selected. This recognition may be performed automatically, based on words and the like, or manually: the selected topic may be recognized through various operations such as a touch operation on the table top, a slide operation on a bracelet worn by the user, a tap operation on a device, a line-of-sight operation, or selecting a topic with a nod.
- By recognizing topic selections, it is possible to produce effects based on topic selection, such as an emphasis expression when a topic has been posted but not selected.
- For example, a topic that has been posted but not selected can be emphasized by shaking or blinking the displayed card.
- a vibration can be given to the user to prompt the user to select a topic.
- Such an effect when a topic is not selected may be executed, for example, when a certain period of time has elapsed since the topic was presented, or when there is no conversation.
- As described above, the information processing apparatus 100 of the topic providing system 1 generates topics for a conversation with a partner based on the result of comparing the user's behavior with comparison information regarding behavior, and determines candidate topics to propose to the user from the generated topics. The information processing apparatus 100 then proposes an appropriate topic according to the communication place to which the topic is provided. Thereby, even in communication with a person the user barely knows, the user can obtain a topic for breaking the ice and can communicate well.
- FIG. 36 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
- the illustrated information processing apparatus 900 can realize the server 10 in the above-described embodiment, for example.
- the information processing apparatus 900 includes a CPU (Central Processing unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
- the information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of or in addition to the CPU 901.
- DSP Digital Signal Processor
- ASIC Application Specific Integrated Circuit
- FPGA Field-Programmable Gate Array
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- PCI Peripheral Component Interconnect / Interface
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is configured by a device capable of notifying the acquired information to the user using a sense such as vision, hearing, or touch.
- the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator.
- the output device 917 outputs the results obtained by the processing of the information processing apparatus 900 as video such as text or images, as audio such as voice or sound, or as vibration.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores, for example, programs executed by the CPU 901 and various data, and various data acquired from the outside.
- the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
- the drive 921 also writes records to the attached removable recording medium 927.
- the connection port 923 is a port for connecting a device to the information processing apparatus 900.
- the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
- the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
- the communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB).
- the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network 931 connected to the communication device 925 is a network connected by wire or wireless, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
- the imaging device 933 is a device that images real space and generates a captured image, using various members such as an imaging element, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 933 may capture a still image or may capture a moving image.
- the sensor 935 is various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, or a sound sensor (microphone).
- the sensor 935 acquires information about the state of the information processing apparatus 900 itself, such as its posture, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around it.
- Sensor 935 may also include a GPS receiver that receives GNSS (Global Navigation Satellite System (s)) signals and measures the latitude, longitude, and altitude of the device.
- GNSS Global Navigation Satellite System
- Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
- Embodiments of the present disclosure may include, for example, an information processing apparatus (for example, a server) as described above, a system, an information processing method executed by the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
- (1) An information processing apparatus including: a topic generation unit that generates a topic for conversation with a speaking partner based on a result of comparing a certain behavior of a user with comparison information about the behavior; and a topic suggestion unit that determines topic candidates to propose to the user from the generated topics.
- (2) The information processing apparatus according to (1), wherein, with the user's everyday behavior regarding a certain behavior as the comparison information, the topic generation unit generates the content of the user's latest behavior as a topic when the user's latest behavior differs from the user's everyday behavior.
- (3) The information processing apparatus according to (1) or (2), wherein, with the user's past behavior tendency regarding a certain behavior as the comparison information, the topic generation unit generates the change in the user's behavior tendency as a topic when the user's latest behavior tendency has changed from the past behavior tendency.
- (4) The information processing apparatus according to any one of (1) to (3), wherein, with the user's past behavior tendency regarding a certain behavior as the comparison information, the topic generation unit generates the change in the user's behavior tendency as a topic when the user's behavior tendency in a certain past period differs from the behavior tendency in other past periods and from the latest behavior tendency.
- (5) The information processing apparatus according to any one of (1) to (4), wherein, with the speaking partner's behavior tendency regarding a common behavior performed by both the user and the speaking partner as the comparison information, the topic generation unit generates information about the common behavior as a topic when the user's behavior regarding a certain common behavior differs from the speaking partner's behavior tendency.
- (6) The information processing apparatus according to (5), wherein the topic generation unit determines the difference in behavior between the user and the speaking partner based on the occurrence frequency of the common behavior.
- (7) The information processing apparatus according to (5), wherein the topic generation unit determines the difference in behavior between the user and the speaking partner based on lower-level information about the common behavior.
- (8) The information processing apparatus according to any one of (1) to (7), wherein the topic generation unit generates information about the common behavior as a topic when the user's behavior tendency and the speaking partner's behavior tendency regarding the common behavior differ from the behavior tendency of the general public.
- (9) The information processing apparatus according to any one of (1) to (8), wherein the topic suggestion unit includes: an excitement degree calculation unit that calculates the excitement degree of a communication place formed by the user and the speaking partner; and a topic extraction unit that extracts, from the topics generated by the topic generation unit, topics according to the excitement degree.
- (10) The information processing apparatus according to (9), wherein the excitement degree calculation unit calculates the excitement degree based on an evaluation function concerning the speaking partner's response to the user's utterance.
- (11) The information processing apparatus according to (9) or (10), wherein the topic extraction unit changes, according to the excitement degree of the communication place with respect to the user's utterance, the proportion of related information connected to the topic just spoken among the topics extracted as the topic candidates.
- (12) The information processing apparatus according to any one of (9) to (11), wherein the topic extraction unit weights each topic generated by the topic generation unit based on state information determined according to the situation in which the user and the speaking partner converse.
- (13) The information processing apparatus according to any one of (9) to (12), wherein the topic suggestion unit includes an expectation degree calculation unit that calculates, for each topic extracted as a topic candidate, an expectation degree that introducing the topic will excite the communication place.
- (14) The information processing apparatus according to any one of (1) to (13), further including a topic presentation processing unit that performs processing for presenting the topic candidates to the user.
- (15) The information processing apparatus according to (14), wherein the topic presentation processing unit performs processing for presenting the topic candidates to at least one person participating in the communication place formed by the user and the speaking partner.
- (16) The information processing apparatus according to (14) or (15), wherein the topic presentation processing unit performs processing for presenting the transition of the excitement degree of the communication place to at least one person participating in that place.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computational Linguistics (AREA)
- Business, Economics & Management (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- Tourism & Hospitality (AREA)
- Acoustics & Sound (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Mathematical Physics (AREA)
- Human Resources & Organizations (AREA)
- Economics (AREA)
- General Health & Medical Sciences (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Computing Systems (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Information Transfer Between Computers (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
1. Overview
2. Functional configuration
3. Topic providing method
3.1. Topic generation processing
3.2. Topic suggestion processing
3.3. Topic presentation processing
4. Summary
5. Hardware configuration
6. Supplement
First, an overview of the topic providing system 1 according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram showing an overview of the topic providing system 1 according to the present embodiment.
First, based on FIG. 2, the functional configuration of the information processing apparatus 100, which performs processing for providing topics in the topic providing system 1 according to the present embodiment, will be described. FIG. 2 is a functional block diagram showing the functional configuration of the information processing apparatus 100 according to the present embodiment.
The topic generation unit 110 generates topics for conversation with a speaking partner based on information about the behavior of the user who is conversing. The topic generation unit 110 acquires information about the conversing user's behavior and extracts topical information, such as behavior changes the user has not noticed or deviations from other people, as topic candidates. As shown in FIG. 2, the topic generation unit 110 includes, for example, a self-behavior analysis unit 112, a common behavior analysis unit 114, and a profile analysis unit 116.
The topic suggestion unit 120 extracts, from the generated topics, topics suited to the place where the conversation is taking place, and performs processing for proposing them to the user as topic candidates. As shown in FIG. 2, the topic suggestion unit 120 includes, for example, an excitement degree calculation unit 122, a topic extraction unit 124, and an expectation degree calculation unit 126.
The topic presentation processing unit 130 performs processing for presenting the topic candidates extracted by the topic suggestion unit 120 to the user. The topic presentation processing unit 130 may change how topics are presented depending on the situation in which the user and the speaking partner are conversing. For example, when a topic is to be provided without others noticing, it may be presented discreetly only to the user who is to introduce it. Alternatively, when presenting topics is itself treated as an entertainment element that livens up the place, topics may be presented to every user participating in the conversation. The topic presentation processing unit 130 performs predetermined processing so that topic candidates can be presented on the users' devices, and outputs them to each device.
Hereinafter, a topic providing method using the topic providing system 1 according to the present embodiment will be described with reference to FIG. 3. FIG. 3 is a flowchart showing the topic providing method using the topic providing system 1 according to the present embodiment.
As shown in FIG. 3, in the topic providing system 1 according to the present embodiment, the topic generation unit 110 first performs processing for generating topics to present to the user (S110). The topic providing system 1 according to the present embodiment generates topics for conversation with a speaking partner based on information about the conversing user's behavior. The topic generation unit 110 acquires information about the conversing user's behavior and extracts topical information, such as behavior changes the user has not noticed or deviations from other people, as topic candidates.
(1) Generating topics from information about the user's own behavior
(2) Generating topics from behavior related to common behavior shared between users
(3) Generating topics by matching the users' profiles
Each of these methods is described in detail below.
First, based on FIGS. 4 to 7, a method of generating topics using only information about the behavior of the user who is to provide the topic will be described. FIG. 4 is an explanatory diagram illustrating a comparison of user action logs. FIG. 5 is an explanatory diagram showing an example of extracting a topical action from the deviation between the user's past behavior tendency and the user's latest action. FIG. 6 is an explanatory diagram showing an example of extracting a topical action from a change in the user's behavior tendency. FIG. 7 is an explanatory diagram showing an example of extracting, from the user's past behavior tendency, a behavior tendency in a certain period that differs from both past and recent tendencies.
Frequency: e.g., "You have been walking frequently for a while now."
Duration: e.g., "You have kept walking for a while now."
Interval: e.g., "Some time after your last walk, you walked again."
Difference: e.g., "This time you took the train later than before."
Total: e.g., "You walk more in this time slot than in a certain other time slot."
Prominence: e.g., "You are doing more varied things in this time slot than you have recently."
Similarity: e.g., "Your pattern in this time slot is the same as last time."
Change: e.g., "You have started running more often lately."
Periodicity: e.g., "You run every year when spring comes."
To explain concretely, the "prominence" behavior feature makes it possible to extract the deviation of the latest action from the user's own past behavior tendency. For example, as shown in the upper part of FIG. 5, suppose it is detected that, regarding the actions the user takes before arriving at a certain stay location, the user usually arrives by train and walking, but in the latest action arrived by train and running. If a probability distribution is created for the frequency or number of the user's usual actions in a predetermined time slot and at a predetermined position, it becomes, for example, a normal distribution like the one in the lower part of FIG. 5. In this distribution, actions within a predetermined central range occur frequently and can be regarded as everyday actions that do not make a topic. In contrast, actions outside the everyday range can be regarded as non-routine actions that differ from the usual.
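The routine/non-routine split described above can be sketched as a simple z-score test against the user's historical frequency distribution. This is only an illustrative sketch: the Gaussian assumption and the 2σ threshold are our own choices, not values specified by the embodiment.

```python
import statistics

def is_non_routine(history, latest, z_threshold=2.0):
    """Return True if the latest behavior count falls outside the
    everyday range of the user's historical distribution."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    z = abs(latest - mean) / stdev
    return z > z_threshold

# Hypothetical daily counts of "running to the stay location" over past weeks.
history = [0, 0, 1, 0, 0, 0, 1, 0]
print(is_non_routine(history, 4))   # far outside the usual range -> topical
print(is_non_routine(history, 0))   # within the everyday range -> not topical
```

A value that lands in the central, high-probability region of the distribution is treated as everyday behavior; only outliers become topic candidates.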
Next, the "change" behavior feature makes it possible to extract a trend of change in the latest actions relative to the user's own past behavior tendency. For example, when the user is gradually jogging for longer, looking at the distribution of the user's daily running time for each predetermined period (for example, in three-month units) shows, as in the lower part of FIG. 6, that the distribution gradually shifts to the right.
Also, the "difference" behavior feature makes it possible to extract that the user's behavior tendency in a certain past period differs from other past periods and from the latest tendency. This can also be regarded as "prominence" located in the past. For example, as shown in FIG. 7, this feature applies to a case where the user's jogging tendency over their lifetime is almost the same as the latest tendency, but in a certain past period (for example, high-school days) the tendency exceeded the standard range of the lifetime distribution. In this example, the lifetime behavior tendency serves as the comparison information. In this way, the fact that only one part of the whole stood out and differed from the rest may be extracted and made a topic.
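The "change" feature just described, a distribution drifting across successive periods, could be detected by comparing the mean of the most recent period against the mean of all earlier periods. The 1.5× ratio threshold below is a hypothetical choice for illustration only.

```python
def tendency_shift(periods, min_ratio=1.5):
    """Detect a gradual change in behavior tendency: compare the mean of
    the most recent period against the mean of all earlier periods.
    `periods` is a list of lists of per-day measurements."""
    earlier = [x for p in periods[:-1] for x in p]
    base = sum(earlier) / len(earlier)
    recent = sum(periods[-1]) / len(periods[-1])
    return recent / base >= min_ratio, recent, base

# Hypothetical daily jogging minutes, grouped into three-month periods.
periods = [[10, 12, 11], [15, 18, 16], [25, 30, 28]]
shifted, recent, base = tendency_shift(periods)
# `shifted` is True: the latest period's mean has drifted well above the base.
```

A production system would compare whole distributions (as FIG. 6 suggests) rather than means, but the means capture the rightward shift in miniature.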
Next, based on FIGS. 8 to 11, a method of generating topics using information about behavior common to the users will be described. FIG. 8 is an explanatory diagram showing an example of extracting a topical action from the difference in frequency of an action common to users. FIG. 9 is an explanatory diagram showing an example of extracting a topical action from a deviation in information about an action common to users. FIG. 10 is an explanatory diagram showing another such example. FIG. 11 is an explanatory diagram showing an example of extracting a topic from a deviation between the users' common behavior tendency and that of the general public.
A concrete way to generate topics through common behavior analysis is, for example, to extract a topical action from the difference in frequency of an action common to the users. For example, when both conversing users have been to complex M, the difference in how often they go to complex M can be made a topic. As one example, as shown in the lower part of FIG. 8, suppose that in a certain group organized by generation, affiliation, or the like, the majority went to complex M five times or fewer in the past year. Here, if the user who provides the topic went to complex M for the first time the other day, this user belongs to the majority, while if the speaking partner goes to complex M two or three times a month, the partner belongs to the minority. When there is such a difference in visit frequency, that difference may be made a topic.
Another way to generate topics through common behavior analysis uses the difference in lower-level information about the common behavior between the conversing users. For example, as shown in the lower part of FIG. 9, suppose that, looking at what members of a certain group did at complex M in a certain time slot, most people turn out to have been shopping. The speaking partner, like the majority, also shops at complex M. Here, if the user who provides the topic also usually goes to complex M for shopping but this time went there to attend a training session, the fact that the purpose of the visit differed from usual may be made a topic.
Yet another way to generate topics through common behavior analysis is, for example, to extract a deviation of the users' common behavior from the general public, that is, a case where both conversing users share a behavior tendency that departs from the norm. For example, when both conversing users do competitive cycling, their cycling time differs greatly from that of ordinary people who use bicycles for daily transportation. Suppose that, as in the lower part of FIG. 11, in the distribution of daily cycling time for a certain group, the majority ride for at most a certain amount of time. When both conversing users ride far longer than the majority and thus belong to the minority, this difference from the general public may be made a topic.
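The common-behavior checks above, a frequency gap between the two speakers, or both speakers sitting in the same minority relative to the public, could be sketched with percentile ranks over a population distribution. The thresholds and labels below are assumptions for illustration, not values from the embodiment.

```python
from bisect import bisect_left

def percentile_rank(population, value):
    """Fraction of the population with a value strictly below `value`."""
    s = sorted(population)
    return bisect_left(s, value) / len(s)

def common_behavior_topic(population, user, partner, gap=0.5):
    """Yield a topic hint when the two speakers sit far apart in the
    population distribution, or when both sit in the same minority."""
    pu = percentile_rank(population, user)
    pp = percentile_rank(population, partner)
    if abs(pu - pp) >= gap:
        return "frequency-gap"      # e.g. first visit vs. monthly regular
    if pu >= 0.9 and pp >= 0.9:
        return "shared-minority"    # e.g. both are competitive cyclists
    return None

# Hypothetical yearly visits to complex M across a peer group (most go 0-5 times).
visits = [0, 1, 1, 2, 2, 3, 3, 4, 5, 30]
print(common_behavior_topic(visits, user=1, partner=30))
```

The first branch corresponds to the FIG. 8 case (one speaker in the majority, one in the minority), the second to the FIG. 11 case (both in a minority far from the public norm).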
Next, a method of generating topics based on the profiles of the conversing users will be described. Profile generation according to the present embodiment uses time-series position information and action recognition results.
First, profile generation processing will be described in detail based on FIGS. 13 to 18. FIG. 13 is an explanatory diagram illustrating processing for generating, from a stay location and action recognition results, a profile representing the user's purpose at that stay location. FIG. 14 is an explanatory diagram showing an example of the user behavior information table 142 acquired by the profile generation processing of FIG. 13. FIGS. 15 to 17 are explanatory diagrams showing examples of user profiles generated from stay locations and action recognition results. FIG. 18 is an explanatory diagram showing an example of a profile generated from an application activation history and action recognition results.
A profile can be generated based on, for example, a stay location and action recognition results. In profile generation processing based on a stay location and action recognition results, as shown in FIG. 13, the latitude and longitude constituting the user's position information are first converted into place name information and a place category. The place name is a name such as a geographical name or facility name, and the place category is information representing the genre of the place.
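The first conversion step here, from latitude/longitude to a place name and place category, might be sketched as a lookup against a reverse-geocoding table. The table contents and the rounding-based matching below are hypothetical simplifications; a real implementation would query a map service.

```python
# Hypothetical reverse-geocoding table: rounded (lat, lon) -> (name, category).
PLACES = {
    (35.66, 139.70): ("Complex M", "shopping mall"),
    (35.68, 139.76): ("Tokyo Station", "station"),
}

def profile_entry(lat, lon, actions):
    """Build one user-behavior record: resolve the stay point to a
    place name and category, and attach the actions recognized there."""
    key = (round(lat, 2), round(lon, 2))
    name, category = PLACES.get(key, ("unknown", "unknown"))
    return {"place": name, "category": category, "actions": actions}

entry = profile_entry(35.6612, 139.7043, ["walking", "shopping"])
# -> a record pairing "Complex M" / "shopping mall" with the recognized actions
```

Accumulating such records over time yields tables like the user behavior information table 142, from which the purpose of each stay location can be inferred.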
A profile can also be generated based on, for example, the history of means of transportation in transitions between stay locations. A profile obtained by generation processing based on stay-location transitions represents the person's behavior tendency.
Besides the above, a profile can also be generated using, for example, the activation history of applications installed on the user's smartphone, wearable terminal, or the like, together with action recognition results. As an example, FIG. 18 shows the user behavior information table 146, which records a profile obtained from an application activation history and action recognition results.
Next, profile matching processing according to the present embodiment will be described in detail based on FIGS. 19 and 20. FIG. 19 is an explanatory diagram illustrating processing for determining a topic to present by matching profiles. FIG. 20 is an explanatory diagram showing the relationship between the intimacy between users and disclosed information.
When topics are generated in step S110, as shown in FIG. 3, the topic suggestion unit 120 extracts topics to provide to the user from the generated topics (S120).
As the idea behind the processing in step S120, the process of conversational communication between users will first be described based on FIG. 21. FIG. 21 is a conceptual diagram of the process of conversational communication between users.
- "Indifference": a state in which communication does not take place because there is no interest in the other party
- "Taking interest in the other party": a state of being interested in the other party and trying to understand them
- "Wanting the other party's interest": a state of also trying to get the other party to understand oneself
In the present embodiment, the "excitement degree of the place" is introduced as an evaluation index for keeping communication going well. A "place" represents a temporarily formed communication event. FIG. 22 shows the concept of a communication place and the participants in that communication. In FIG. 22, a temporary communication place is formed by three participants A, B, and C. The excitement degree of the place changes with the excitement degree of each participant. Here, the excitement degree of the place HP is expressed by the following formula (1). env(t) is an environmental change representing the cooling of the place as time passes. The excitement degree of the place HP changes with the excitement degrees HA, HB, and HC of the participants.
Next, a method by which the topic extraction unit 124 selects topics to present to the user will be described based on FIG. 23. FIG. 23 is a flowchart showing the topic selection method according to the present embodiment. The topic extraction unit 124 uses the evaluation value calculated by the excitement degree calculation unit 122 to extract topics to present to the user from the topics generated by the topic generation unit 110. The processing shown in FIG. 23 is executed, for example, at the timing when a topic is introduced into the place.
Here, the expectation that introducing a topic will excite the place may be presented to the participants as an expectation degree. This allows participants to select a topic to introduce while anticipating, for each topic, how much the place will liven up when it is introduced.
As described above along the processing of FIG. 23, the topic providing system 1 repeatedly introduces topics into the communication place and evaluates them, supporting a virtuous cycle of conversation. FIG. 25 shows one relationship between topic introduction and the excitement degree of the place. As described above, the excitement degree of the place HP is calculated based on formula (1). In formula (1), the environment term env(t) lowers the excitement degree in proportion to time, while discretely introduced topics push it up. Therefore, looking at the time transition of the excitement degree, as shown in FIG. 25, the excitement degree rises temporarily when a topic is introduced and then declines as time passes.
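The dynamics just described, a place that cools down through env(t) and is pushed up by discretely introduced topics, can be sketched as follows. The linear decay and the boost constant are illustrative assumptions; the embodiment does not specify the concrete form of env(t) or of the boost.

```python
def simulate_excitement(duration, injections, boost=3.0, decay=0.1):
    """Excitement degree HP over time: a linear env(t) decay plus a
    discrete boost whenever a topic is injected (formula shape only;
    the constants are illustrative, not from the embodiment)."""
    hp, trace = 0.0, []
    for t in range(duration):
        hp = max(0.0, hp - decay)   # env(t): the place cools down over time
        if t in injections:
            hp += boost             # a topic is introduced into the place
        trace.append(hp)
    return trace

trace = simulate_excitement(duration=30, injections={2, 15})
# The trace rises at t=2 and t=15 and slopes downward in between,
# matching the saw-tooth shape of FIG. 25.
```

Injecting the next topic before the previous peak has fully decayed keeps HP high, which is exactly the virtuous cycle the system aims to support.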
When extracting topics with the topic extraction unit 124, the topic suggestion unit 120 may analyze the situation and weight the proposed topics. This makes it possible to propose topics better suited to the situation.
[3.3. Topic presentation processing]
When the topics to present have been extracted in step S120, as shown in FIG. 3, the topic presentation processing unit 130 performs processing for providing the topics to the user (S130). Depending on the situation of the place, the topic presentation processing unit 130 presents topics to the person themselves or to the multiple people sharing the topic.
First, a case where the topic providing system 1 presents a topic to an individual will be described. As visual presentation methods for an individual, for example, a topic may be presented on the display panel of a glasses-type wearable terminal, or presented by lighting LEDs on a device the individual carries or in the surrounding space. As auditory presentation methods, earphones, open speakers, and the like can be used; as auditory and haptic presentation methods, vibration, temperature, and the like can also be used. When presenting a topic to an individual, it may be presented discreetly so that the people around do not notice.
Next, a case where the topic providing system 1 presents a topic to multiple people participating in a communication place will be described. As a visual presentation method for multiple people, for example, a topic can be presented by projecting it onto a projection surface such as a tabletop with a projector. As an auditory presentation method, for example, speakers can be used; as a haptic presentation method, for example, the temperature of the air conditioning, the table, or the like can be adjusted. When presenting topics to multiple people, entertainment elements that liven up the place, including the presentation of the topic itself, can also be added.
- Topic presentation methods
The topic presentation methods of the topic providing system 1 according to the present embodiment are not limited to the examples shown in FIGS. 30 and 31; for example, the presentation of information is not limited to text. For example, topics may be presented to the user with icons. Specifically, an icon of a bicycle may be displayed to indicate a cycling-related topic, and a photograph from the event may be displayed when proposing a past episode as a topic.
Topics also need not be presented at all times; cards may automatically appear on the glasses-type wearable terminal or the tabletop only when a topic should be presented. Alternatively, topics may be presented only upon the user's instruction. For example, a topic may be presented on the glasses-type wearable terminal or the tabletop only when the user directs their gaze at a specific icon for displaying topics or moves a cursor over it with a controller.
Furthermore, topic presentation destinations are not limited to glasses-type wearable terminals and tabletops. For example, a list of extracted topic candidates may be displayed on a personally owned device such as a smartphone, tablet terminal, or band. When topics are presented on these devices, however, the user may not notice the presentation. Therefore, for example, the user may be notified when a topic is ready and guided to look at the presented topic. Concrete notification methods include sounding only the person's own device, or staging effects such as a bell sound that tells everyone the timing of a topic switch. Alternatively, a band worn by the user may be vibrated for notification. Voice and display notifications may also be combined, conveying only an index by voice and letting the user obtain the full list as an image.
Also, when multiple conversation participants can each see the cards, as when topics are displayed on the tabletop shown in FIG. 31, effects may be added to the card display based on each participant's gaze and movements. For example, when a user stares at their own cards, the card group may fan out, and when the user looks away from the cards, the card group may be gathered up. Alternatively, as shown in FIG. 35, when another user is detected peeking at one's cards, panels 30 (31, 33, 35, 37) that hide the cards may be displayed and each card moved behind a panel. The fanning out, gathering, and hiding of cards may be performed based on the user's gaze, or operated with a controller or the like.
Furthermore, a virtual moderator may be set up to emphasize the game-like quality. For example, the system may be one in which each player (participant) submits topics to a virtual moderator and the moderator selects from the submitted topics. The moderator may choose topics by majority vote or at random. Of course, this moderator may be replaced by a human facilitator who selects topics manually.
For the presentation of related topics and statistical displays described above, the topic providing system 1 needs to know which topics have been selected so far. This recognition may be performed automatically based on words and the like. Alternatively, the selected topic may be recognized through manual operations of various kinds, such as touch operations on the tabletop, slide operations on a bracelet worn by the user, tap operations on a device, gaze operations, or selecting a topic with the eyes and confirming it with a nod.
The configuration of the topic providing system 1 according to the present embodiment and the topic providing method using it have been described above. According to the present disclosure, the information processing apparatus 100 of the topic providing system 1 generates topics for conversation with a speaking partner based on the result of comparing a certain behavior of the user with comparison information about that behavior, and determines topic candidates to propose to the user from the generated topics. The information processing apparatus 100 then proposes appropriate topics according to the communication place where topics are provided. This allows the user to obtain ice-breaking topics even in communication with someone they barely know, enabling good communication.
Next, the hardware configuration of the information processing apparatus according to the embodiment of the present disclosure will be described with reference to FIG. 36. FIG. 36 is a block diagram showing a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure. The illustrated information processing apparatus 900 can realize, for example, the server 10 in the above embodiment.
Embodiments of the present disclosure may include, for example, an information processing apparatus (for example, a server) as described above, a system, an information processing method executed by the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
(1)
A topic generation unit that generates a topic for conversation with a speaking partner based on a result of comparing a certain behavior of a user with comparison information about the behavior; and
a topic suggestion unit that determines topic candidates to propose to the user from the generated topics.
An information processing apparatus comprising the above.
(2)
With the user's everyday behavior regarding a certain behavior as the comparison information,
the information processing apparatus according to (1), wherein the topic generation unit generates the content of the user's latest behavior as a topic when the user's latest behavior differs from the user's everyday behavior.
(3)
With the user's past behavior tendency regarding a certain behavior as the comparison information,
the information processing apparatus according to (1) or (2), wherein the topic generation unit generates the change in the user's behavior tendency as a topic when the user's latest behavior tendency has changed from the user's past behavior tendency.
(4)
With the user's past behavior tendency regarding a certain behavior as the comparison information,
the information processing apparatus according to any one of (1) to (3), wherein the topic generation unit generates the change in the user's behavior tendency as a topic when the user's behavior tendency in a certain past period differs from the user's behavior tendency in other past periods and from the user's latest behavior tendency.
(5)
With the speaking partner's behavior tendency regarding a common behavior performed by both the user and the speaking partner as the comparison information,
the information processing apparatus according to any one of (1) to (4), wherein the topic generation unit generates information about the common behavior as a topic when the user's behavior regarding a certain common behavior differs from the speaking partner's behavior tendency.
(6)
The information processing apparatus according to (5), wherein the topic generation unit determines the difference in behavior between the user and the speaking partner based on the occurrence frequency of the common behavior.
(7)
The information processing apparatus according to (5), wherein the topic generation unit determines the difference in behavior between the user and the speaking partner based on lower-level information about the common behavior.
(8)
With the speaking partner's behavior tendency regarding a common behavior performed by both the user and the speaking partner as the comparison information,
the information processing apparatus according to any one of (1) to (7), wherein the topic generation unit generates information about the common behavior as a topic when the user's behavior tendency and the speaking partner's behavior tendency regarding a certain common behavior differ from the behavior tendency of the general public.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the topic suggestion unit includes:
an excitement degree calculation unit that calculates the excitement degree of a communication place formed by the user and the speaking partner; and
a topic extraction unit that extracts, from the topics generated by the topic generation unit, topics according to the excitement degree.
(10)
The information processing apparatus according to (9), wherein the excitement degree calculation unit calculates the excitement degree based on an evaluation function concerning the speaking partner's response to the user's utterance.
(11)
The information processing apparatus according to (9) or (10), wherein the topic extraction unit changes, according to the excitement degree of the communication place with respect to the user's utterance, the proportion of related information connected to the topic just spoken among the topics extracted as the topic candidates.
(12)
The information processing apparatus according to any one of (9) to (11), wherein the topic extraction unit weights each topic generated by the topic generation unit based on state information determined according to the situation in which the user and the speaking partner converse.
(13)
The information processing apparatus according to any one of (9) to (12), wherein the topic suggestion unit includes an expectation degree calculation unit that calculates, for each topic extracted as a topic candidate, an expectation degree that introducing the topic will excite the communication place.
(14)
The information processing apparatus according to any one of (1) to (13), further including a topic presentation processing unit that performs processing for presenting the topic candidates to the user.
(15)
The information processing apparatus according to (14), wherein the topic presentation processing unit performs processing for presenting the topic candidates to at least one person participating in the communication place formed by the user and the speaking partner.
(16)
The information processing apparatus according to (14) or (15), wherein the topic presentation processing unit performs processing for presenting the transition of the excitement degree of the communication place to at least one person participating in the communication place formed by the user and the speaking partner.
(17)
The information processing apparatus according to any one of (14) to (16), wherein the topic presentation processing unit changes the environment settings of the communication place according to changes in the excitement degree.
(18)
Generating, by a processor, a topic for conversation with a speaking partner based on a result of comparing a certain behavior of a user with comparison information about the behavior; and
determining topic candidates to propose to the user from the generated topics.
An information processing method including the above.
(19)
A program causing a computer to function as an information processing apparatus including:
a topic generation unit that generates a topic for conversation with a speaking partner based on a result of comparing a certain behavior of a user with comparison information about the behavior; and
a topic suggestion unit that determines topic candidates to propose to the user from the generated topics.
110 Topic generation unit
112 Self-behavior analysis unit
114 Common behavior analysis unit
116 Profile analysis unit
120 Topic suggestion unit
122 Excitement degree calculation unit
124 Topic extraction unit
126 Expectation degree calculation unit
130 Topic presentation processing unit
140 Profile DB
142, 146 User behavior information table
144 Behavior tendency table
Claims (19)
- A topic generation unit that generates a topic for conversation with a speaking partner based on a result of comparing a certain behavior of a user with comparison information about the behavior; and
a topic suggestion unit that determines topic candidates to propose to the user from the generated topics,
an information processing apparatus comprising the above. - With the user's everyday behavior regarding a certain behavior as the comparison information,
the information processing apparatus according to claim 1, wherein the topic generation unit generates the content of the user's latest behavior as a topic when the user's latest behavior differs from the user's everyday behavior. - With the user's past behavior tendency regarding a certain behavior as the comparison information,
the information processing apparatus according to claim 1, wherein the topic generation unit generates the change in the user's behavior tendency as a topic when the user's latest behavior tendency has changed from the user's past behavior tendency. - With the user's past behavior tendency regarding a certain behavior as the comparison information,
the information processing apparatus according to claim 1, wherein the topic generation unit generates the change in the user's behavior tendency as a topic when the user's behavior tendency in a certain past period differs from the user's behavior tendency in other past periods and from the user's latest behavior tendency. - With the speaking partner's behavior tendency regarding a common behavior performed by both the user and the speaking partner as the comparison information,
the information processing apparatus according to claim 1, wherein the topic generation unit generates information about the common behavior as a topic when the user's behavior regarding a certain common behavior differs from the speaking partner's behavior tendency. - The information processing apparatus according to claim 5, wherein the topic generation unit determines the difference in behavior between the user and the speaking partner based on the occurrence frequency of the common behavior.
- The information processing apparatus according to claim 5, wherein the topic generation unit determines the difference in behavior between the user and the speaking partner based on lower-level information about the common behavior.
- With the speaking partner's behavior tendency regarding a common behavior performed by both the user and the speaking partner as the comparison information,
the information processing apparatus according to claim 1, wherein the topic generation unit generates information about the common behavior as a topic when the user's behavior tendency and the speaking partner's behavior tendency regarding a certain common behavior differ from the behavior tendency of the general public. - The information processing apparatus according to claim 1, wherein the topic suggestion unit includes:
an excitement degree calculation unit that calculates the excitement degree of a communication place formed by the user and the speaking partner; and
a topic extraction unit that extracts, from the topics generated by the topic generation unit, topics according to the excitement degree.
- The information processing apparatus according to claim 9, wherein the excitement degree calculation unit calculates the excitement degree based on an evaluation function concerning the speaking partner's response to the user's utterance.
- The information processing apparatus according to claim 9, wherein the topic extraction unit changes, according to the excitement degree of the communication place with respect to the user's utterance, the proportion of related information connected to the topic just spoken among the topics extracted as the topic candidates.
- The information processing apparatus according to claim 9, wherein the topic extraction unit weights each topic generated by the topic generation unit based on state information determined according to the situation in which the user and the speaking partner converse.
- The information processing apparatus according to claim 9, wherein the topic suggestion unit includes an expectation degree calculation unit that calculates, for each topic extracted as a topic candidate, an expectation degree that introducing the topic will excite the communication place.
- The information processing apparatus according to claim 1, comprising a topic presentation processing unit that performs processing for presenting the topic candidates to the user.
- The information processing apparatus according to claim 14, wherein the topic presentation processing unit performs processing for presenting the topic candidates to at least one person participating in the communication place formed by the user and the speaking partner.
- The information processing apparatus according to claim 14, wherein the topic presentation processing unit performs processing for presenting the transition of the excitement degree of the communication place to at least one person participating in the communication place formed by the user and the speaking partner.
- The information processing apparatus according to claim 14, wherein the topic presentation processing unit changes the environment settings of the communication place according to changes in the excitement degree.
- Generating, by a processor, a topic for conversation with a speaking partner based on a result of comparing a certain behavior of a user with comparison information about the behavior; and
determining topic candidates to propose to the user from the generated topics,
an information processing method including the above. - A program causing a computer to function as
an information processing apparatus including: a topic generation unit that generates a topic for conversation with a speaking partner based on a result of comparing a certain behavior of a user with comparison information about the behavior; and
a topic suggestion unit that determines topic candidates to propose to the user from the generated topics.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16807171.0A EP3309738A4 (en) | 2015-06-12 | 2016-03-02 | Information processing device, information processing method, and program |
JP2017523130A JP6729571B2 (ja) | 2015-06-12 | 2016-03-02 | 情報処理装置、情報処理方法及びプログラム |
US15/566,876 US10665229B2 (en) | 2015-06-12 | 2016-03-02 | Information processing device, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-118930 | 2015-06-12 | ||
JP2015118930 | 2015-06-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016199464A1 true WO2016199464A1 (ja) | 2016-12-15 |
Family
ID=57503154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/056481 WO2016199464A1 (ja) | 2015-06-12 | 2016-03-02 | 情報処理装置、情報処理方法及びプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US10665229B2 (ja) |
EP (1) | EP3309738A4 (ja) |
JP (1) | JP6729571B2 (ja) |
WO (1) | WO2016199464A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112243197A (zh) * | 2019-07-18 | 2021-01-19 | 丰田自动车株式会社 | 车辆用沟通装置及车辆用沟通系统 |
JP2021086234A (ja) * | 2019-11-25 | 2021-06-03 | 株式会社Aill | コミュニケーション支援サーバ、コミュニケーション支援システム、コミュニケーション支援方法、及びコミュニケーション支援プログラム |
US11240191B2 (en) | 2019-03-22 | 2022-02-01 | Fujifilm Business Innovation Corp. | Socializing support apparatus and non-transitory computer readable medium |
WO2022030322A1 (ja) * | 2020-08-07 | 2022-02-10 | ソニーグループ株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
JP2022528021A (ja) * | 2018-09-14 | 2022-06-08 | ライク,フィリップ | 交流推薦システム |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9500865B2 (en) * | 2013-03-04 | 2016-11-22 | Alex C. Chen | Method and apparatus for recognizing behavior and providing information |
US10206572B1 (en) | 2017-10-10 | 2019-02-19 | Carrot, Inc. | Systems and methods for quantification of, and prediction of smoking behavior |
US11699039B2 (en) * | 2017-06-28 | 2023-07-11 | Microsoft Technology Licensing, Llc | Virtual assistant providing enhanced communication session services |
US10585991B2 (en) | 2017-06-29 | 2020-03-10 | Microsoft Technology Licensing, Llc | Virtual assistant for generating personalized responses within a communication session |
JP6743108B2 (ja) * | 2018-10-31 | 2020-08-19 | 西日本電信電話株式会社 | パターン認識モデル及びパターン学習装置、その生成方法、それを用いたfaqの抽出方法及びパターン認識装置、並びにプログラム |
CN110176055B (zh) * | 2019-05-28 | 2023-04-18 | 重庆大学 | 一种用于在3d虚拟引擎中模拟实时全局光照的自适应方法 |
CA3166398A1 (en) * | 2019-12-30 | 2021-07-08 | Cilag Gmbh International | Systems and methods for assisting individuals in a behavioral-change program |
US11461405B2 (en) * | 2020-01-13 | 2022-10-04 | International Business Machines Corporation | Technology based commonality detection system |
CN117493511B (zh) * | 2023-11-02 | 2024-06-07 | 国农(重庆)生猪大数据产业发展有限公司 | 基于大语言模型的生猪领域自动问答系统及方法 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001195430A (ja) * | 1999-11-02 | 2001-07-19 | Atr Media Integration & Communications Res Lab | 知識共有促進システム |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3790602B2 (ja) * | 1997-04-25 | 2006-06-28 | 富士ゼロックス株式会社 | 情報共有装置 |
JP2001188787A (ja) | 1999-12-28 | 2001-07-10 | Sony Corp | 会話処理装置および方法、並びに記録媒体 |
US6721704B1 (en) * | 2001-08-28 | 2004-04-13 | Koninklijke Philips Electronics N.V. | Telephone conversation quality enhancer using emotional conversational analysis |
US8255421B2 (en) * | 2008-04-02 | 2012-08-28 | Panasonic Corporation | Communication assistance device, communication assistance method, and communication assistance program |
US8473512B2 (en) * | 2009-11-06 | 2013-06-25 | Waldeck Technology, Llc | Dynamic profile slice |
US20120296991A1 (en) * | 2011-02-23 | 2012-11-22 | Nova Spivack | Adaptive system architecture for identifying popular topics from messages |
WO2013059199A1 (en) * | 2011-10-17 | 2013-04-25 | Disintermediation Services, Inc. | Two-way real time communication allowing asymmetric participation across multiple electronic platforms |
US10204026B2 (en) * | 2013-03-15 | 2019-02-12 | Uda, Llc | Realtime data stream cluster summarization and labeling system |
FR3011375B1 (fr) * | 2013-10-01 | 2017-01-27 | Aldebaran Robotics | Procede de dialogue entre une machine, telle qu'un robot humanoide, et un interlocuteur humain, produit programme d'ordinateur et robot humanoide pour la mise en œuvre d'un tel procede |
US20150156268A1 (en) * | 2013-12-04 | 2015-06-04 | Conduit Ltd | Suggesting Topics For Social Conversation |
US20150261867A1 (en) * | 2014-03-13 | 2015-09-17 | Rohit Singal | Method and system of managing cues for conversation engagement |
US20150281371A1 (en) * | 2014-03-26 | 2015-10-01 | Conversant Intellectual Property Management Incorporated | Apparatus, system, and method for connecting devices |
US9645703B2 (en) * | 2014-05-14 | 2017-05-09 | International Business Machines Corporation | Detection of communication topic change |
US9296396B2 (en) * | 2014-06-13 | 2016-03-29 | International Business Machines Corporation | Mitigating driver fatigue |
US10375129B2 (en) * | 2014-06-17 | 2019-08-06 | Microsoft Technology Licensing, Llc | Facilitating conversations with automated location mapping |
US20160164813A1 (en) * | 2014-12-04 | 2016-06-09 | Intel Corporation | Conversation agent |
US10395552B2 (en) * | 2014-12-19 | 2019-08-27 | International Business Machines Corporation | Coaching a participant in a conversation |
US9628415B2 (en) * | 2015-01-07 | 2017-04-18 | International Business Machines Corporation | Destination-configured topic information updates |
US9639770B2 (en) * | 2015-03-26 | 2017-05-02 | Konica Minolta Laboratory U.S.A., Inc. | System and method for improving communication productivity |
-
2016
- 2016-03-02 EP EP16807171.0A patent/EP3309738A4/en not_active Ceased
- 2016-03-02 US US15/566,876 patent/US10665229B2/en active Active
- 2016-03-02 JP JP2017523130A patent/JP6729571B2/ja active Active
- 2016-03-02 WO PCT/JP2016/056481 patent/WO2016199464A1/ja active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001195430A (ja) * | 1999-11-02 | 2001-07-19 | Atr Media Integration & Communications Res Lab | Knowledge sharing promotion system |
Non-Patent Citations (3)
Title |
---|
AKIRA IMAYOSHI: "Detection of Active Parts of Conversation in Real Time Using Laughter", Proceedings of FIT2011, the 10th Forum on Information Technology (refereed and general papers: image recognition and media understanding; graphics and images; human communication and interaction; educational engineering, welfare engineering and multimedia applications), vol. 3, 22 August 2011 (2011-08-22), pages 551 - 554, XP009507769 *
See also references of EP3309738A4 * |
TSUYOSHI TANAKA: "Proposal of a Face-to-Face Communication Support System Using Behavior Estimation" [Kodo Suitei o Mochiita Taimen Communication Sien System no Teian], Proceedings of the Symposium on Multimedia, Distributed, Cooperative and Mobile Systems (DICOMO2012), IPSJ Symposium Series, vol. 2012, no. 1, 4 July 2012 (2012-07-04), pages 2359 - 2366, XP009507731, ISSN: 1882-0840 *
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022528021A (ja) * | 2018-09-14 | 2022-06-08 | Like, Philip | Social interaction recommendation system |
JP7278213B2 (ja) | 2018-09-14 | 2023-05-19 | Like, Philip | Social interaction recommendation system |
US11240191B2 (en) | 2019-03-22 | 2022-02-01 | Fujifilm Business Innovation Corp. | Socializing support apparatus and non-transitory computer readable medium |
CN112243197A (zh) * | 2019-07-18 | 2021-01-19 | Toyota Motor Corporation | Vehicle communication device and vehicle communication system |
JP2021018546A (ja) * | 2019-07-18 | 2021-02-15 | Toyota Motor Corporation | Vehicle communication device and vehicle communication system |
JP2021086234A (ja) * | 2019-11-25 | 2021-06-03 | Aill Inc. | Communication support server, communication support system, communication support method, and communication support program |
JP7444430B2 (ja) | 2019-11-25 | 2024-03-06 | Aill Inc. | Communication support server, communication support system, communication support method, and communication support program |
WO2022030322A1 (ja) * | 2020-08-07 | 2022-02-10 | Sony Group Corporation | Information processing device, information processing method, and information processing program |
Also Published As
Publication number | Publication date |
---|---|
JP6729571B2 (ja) | 2020-07-22 |
EP3309738A1 (en) | 2018-04-18 |
US20180108347A1 (en) | 2018-04-19 |
US10665229B2 (en) | 2020-05-26 |
JPWO2016199464A1 (ja) | 2018-04-05 |
EP3309738A4 (en) | 2018-11-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016199464A1 (ja) | Information processing device, information processing method, and program | |
JP6777201B2 (ja) | Information processing device, information processing method, and program | |
US11640589B2 (en) | Information processing apparatus, control method, and storage medium | |
US10731992B2 (en) | Information processing device, information processing method, and program | |
JP6756328B2 (ja) | Information processing device, information processing method, and program | |
US20160275175A1 (en) | Information processing apparatus, information processing method, and program | |
CN108293073B (zh) | Immersive telepresence | |
WO2016136104A1 (ja) | Information processing device, information processing method, and program | |
US20200005784A1 (en) | Electronic device and operating method thereof for outputting response to user input, by using application | |
JPWO2019116658A1 (ja) | Information processing device, information processing method, and program | |
WO2015190141A1 (ja) | Information processing device, information processing method, and program | |
JP2016189121A (ja) | Information processing device, information processing method, and program | |
JP2018186326A (ja) | Robot device and program | |
WO2018155052A1 (ja) | Information processing device, information processing method, and information processing system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16807171 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017523130 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15566876 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016807171 Country of ref document: EP |