WO2017029848A1 - Information processing system and information processing method
Information processing system and information processing method
- Publication number
- WO2017029848A1 (PCT/JP2016/064837)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- happiness
- specific
- emotion
- user
- history
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
- G06Q30/0202—Market predictions or forecasting for commercial activities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0631—Item recommendations
Definitions
- This disclosure relates to an information processing system and an information processing method.
- Patent Document 1 describes a system in which impressions of a television program are collected through emotion choices such as anger, crying, and laughing, and are aggregated and shared with friends on a social service.
- Patent Document 2 describes a system that evaluates a restaurant serving a dish based on a score derived from the feature amount of the dish image and the smile level of the person eating the dish.
- Patent Document 3 describes a system that estimates an emotional state of a user for online content or an event and generates an average emotional score for each user.
- The present disclosure therefore proposes an information processing system and an information processing method capable of predicting the level of a specific emotion at a specific time and place.
- According to the present disclosure, an information processing system is proposed that includes: a storage unit that accumulates an emotion history in which the level of a specific emotion calculated based on a user's behavior history is associated with the date and time and the position at which the specific emotion occurred and with related keywords; and a control unit that predicts the level of the specific emotion at a specific date and time and place based on the relationship between the emotion history stored in the storage unit and at least one of the specific date and time, the place, and a keyword associated with the date and time or the place.
- According to the present disclosure, an information processing method is also proposed that includes: storing, by a processor, in a storage unit an emotion history in which a specific emotion level calculated based on a user's behavior history, the date and time and position at which the specific emotion occurred, and related keywords are associated with each other; and predicting the level of the specific emotion at the specific date and time and place based on the relationship between the emotion history stored in the storage unit and at least one of the specific date and time, the place, and a keyword associated with the date and time or the place.
- As described above, the level of the specific emotion calculated based on the user's behavior history is accumulated together with information such as the time, place, and related keywords of the emotion's occurrence, which makes it possible to predict the level of the specific emotion at a specific time and place.
- As a result, much like a weather forecast, the user can know in advance what kind of emotion he or she is likely to experience today, and can grasp in advance the emotional state of each place as it changes over time.
- The "specific emotion" predicted by the information processing system according to the present embodiment is at least one of the six basic emotions (specifically, anger, disgust, fear, joy, sadness, and surprise), or a combination thereof.
- a specific emotion may occur when a person is confronting a person or thing in a one-to-one, one-to-many, or many-to-many manner.
- The confrontation does not have to be face-to-face; it may take place by telephone, telegram, mail, or IP (Internet Protocol) messaging, for example.
- This “place” may be a large place such as a meeting or a concert, or a small place such as a meal or a telephone.
- For example, a face-to-face conversation is assumed as one-on-one person-to-person communication, a person viewing a painting alone as one-to-one person-to-thing communication, fans watching a solo singer's concert as one-to-many communication, and soccer team members and supporters sharing joy as many-to-many communication.
- The "place" of person-to-person communication may also be a place created through things, such as watching entertainers on television.
- In the emotion prediction, it is possible to predict not only whether a specific emotion will occur, but also at what level (meaning degree or strength) it will occur.
- The intensity may be expressed by normalizing the specific emotion to a value per unit time from 0 (minimum) to 1 (maximum), or from -1 (minimum) to 1 (maximum).
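As a minimal sketch of the normalization just described (assuming a raw detector score with a known range; the function and parameter names are illustrative, not part of the disclosure):

```python
def normalize_intensity(raw, lo, hi, signed=False):
    """Clamp and rescale a raw emotion score into [0, 1], or [-1, 1] if signed.

    `raw`, `lo`, and `hi` are assumed to come from whatever detector
    produced the score (e.g. a smile recognizer reporting 0-100).
    """
    clamped = min(max(raw, lo), hi)
    t = (clamped - lo) / (hi - lo)          # now in [0, 1]
    return 2.0 * t - 1.0 if signed else t

# e.g. a smile score of 75 on a 0-100 scale:
unit = normalize_intensity(75, 0, 100)             # 0.75 on the 0..1 scale
bipolar = normalize_intensity(75, 0, 100, True)    # 0.5 on the -1..1 scale
```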
- In the following, as an example of a specific emotion, a case is described in which the emotion of "joy" (specifically, a state in which a smile or a feeling of delight occurs; hereinafter referred to as "happiness") is predicted. In the present embodiment, not only the occurrence of happiness but also its strength (also referred to as happiness intensity) is predicted.
- FIG. 1 is a diagram illustrating an overview of an information processing system according to an embodiment of the present disclosure.
- the happiness intensity at a specific time and place is predicted based on the accumulated happiness history data, and is presented to the user as a happiness forecast.
- the user can check the national happiness forecast by the user terminal 1 such as a smartphone.
- the happiness forecast screen as shown in FIG. 1, an icon corresponding to the happiness intensity is superimposed and displayed on the map.
- The icons corresponding to the happiness intensity may be adaptations of weather forecast marks familiar to the user, such as sunny, cloudy, and rain marks, or the degree of happiness may be expressed by the facial expression included in the icon.
- the happiness forecast screen also displays a date and time display and a time slider for designating a specific time.
- the user can grasp the nationwide happiness level at a desired time by operating the time slider.
- In the example shown in FIG. 1, the happiness level of the whole country is shown on the map, but this embodiment is not limited to this; the happiness level of a region or town may be displayed on a more detailed map, or the happiness level of the whole world may be displayed.
- For example, the user can check the nationwide happiness forecast on the morning of a business trip within Japan, grasp the happiness level at a given time at the destination or at waypoints, and be mentally prepared in advance.
- FIG. 2 is a diagram illustrating the overall configuration of the information processing system according to the present embodiment.
- the information processing system according to the present embodiment includes a user terminal 1 and a happiness management server 2. Both are connected via the network 3.
- The happiness management server 2 acquires the information necessary for happiness analysis or prediction from external servers 4, such as a schedule management server 4a, a life log management server 4b, a communication server 4c, a program information providing server 4d, an audio management server 4e, an image management server 4f, a fan club management server 4g, a performance information management server 4h, or a rainbow information management server 4i.
- the external server 4 (4a to 4i) specifically illustrated in FIG. 2 is an example, and the present embodiment is not limited to this.
- the user terminal 1 is an information processing terminal that presents a happiness forecast to the user using information for displaying the happiness forecast transmitted from the happiness management server 2.
- The user terminal 1 is not limited to a smartphone as shown in FIGS. 1 and 2; it may be, for example, a mobile terminal such as a mobile phone or a tablet, or a wearable terminal such as a smartwatch, smart band, smart eyeglasses, or smart neckband.
- the schedule management server 4a has a function of registering schedule information (schedule content, time, location, etc.) transmitted from the user terminal 1 together with user information. Further, the schedule management server 4a has a function of presenting and editing the registered user's schedule information in response to a user operation. The schedule information can be input by the user using the user terminal 1.
- the life log management server 4b has a function of accumulating position information transmitted from the user terminal 1 on a daily basis together with user information and time information.
- the position information is acquired by the position information acquisition unit 120 (see FIG. 3) of the user terminal 1.
- the communication server 4c has a function of sharing text or images with friends and acquaintances.
- the user routinely transmits text or an image from the user terminal 1 to the communication server 4c, and the transmitted information is stored together with the user information and time information in the communication server 4c and can be viewed by other users.
- the program information providing server 4d has a function of providing information on TV programs or Internet programs.
- the program information providing server 4d accumulates information such as broadcast / distribution time, title, performer, and program content as program information.
- the voice management server 4e stores the voice information transmitted from the user terminal 1 in association with the user information and time information.
- the sound information is collected by the microphone 140 (see FIG. 3) of the user terminal 1.
- the voice management server 4e has a function of storing voice information transmitted by the user for personal use and a function of sharing with other users.
- the image management server 4f accumulates image information (including moving images) transmitted from the user terminal 1 in association with user information and time information.
- the image information is captured by the camera 130 (see FIG. 3) of the user terminal 1.
- The user may tag the faces and objects recognized in the image information by face recognition and object recognition with related names.
- the image management server 4f associates and accumulates image information, tag information, user information, and time information.
- the image management server 4f has a function of storing image information transmitted by the user for personal use and a function of sharing with other users.
- The fan club management server 4g accumulates fan club member information (name, address, identification number, telephone number, etc.) for a given artist or talent, and has a function of distributing information regularly or irregularly by e-mail, SNS (social media), or the like.
- the performance information management server 4h accumulates and distributes performance information (date, place, performance title, performers, performance details, etc.) of a predetermined artist or talent.
- the rainbow information management server 4i accumulates information on the location and time at which a rainbow is expected to occur.
- FIG. 3 is a block diagram illustrating a configuration example of the user terminal 1 according to the present embodiment.
- The user terminal 1 includes a control unit 100, a communication unit 110, a position information acquisition unit 120, a camera 130, a microphone 140, an operation input unit 150, a storage unit 160, a display unit 170, and a speaker 180.
- the control unit 100 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the user terminal 1 according to various programs.
- the control unit 100 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor, for example.
- the communication unit 110 transmits / receives data to / from an external device via wired / wireless.
- the location information acquisition unit 120 has a function of acquiring location information of the user terminal 1.
- the position information acquisition unit 120 may be a GPS processing unit that processes a GPS (Global Positioning System) antenna and a GPS signal received by the GPS antenna.
- Alternatively, the position information acquisition unit 120 may include a Wi-Fi antenna that receives Wi-Fi (registered trademark) radio waves from a plurality of base stations, and a position calculation unit that estimates the distance to each base station from the received signal strength and calculates the current position by the principle of triangulation using those distances and the positions of the base stations.
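The triangulation step just described can be sketched as follows, under the simplifying assumptions of exact 2D distances and non-collinear base stations (the function name and signature are illustrative):

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Estimate a 2D position from three base stations and their distances.

    Subtracting pairs of circle equations (x - xi)^2 + (y - yi)^2 = di^2
    yields two linear equations in (x, y), solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # zero if the stations are collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Stations at (0,0), (4,0), (0,4); true position (1,1):
x, y = trilaterate((0, 0), math.sqrt(2), (4, 0), math.sqrt(10), (0, 4), math.sqrt(10))
```

In practice the distance estimates from signal strength are noisy, so a real implementation would use more than three stations and a least-squares fit.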
- the camera 130 images the surroundings according to a user operation, and outputs the captured image information to the control unit 100.
- the microphone 140 collects surrounding sounds according to user operations and outputs the collected sound information to the control unit 100.
- the operation input unit 150 is realized by a touch panel, a switch, a button, or the like, detects an operation input by the user, and outputs the detected input signal to the control unit 100.
- the storage unit 160 is realized by a ROM (Read Only Memory) that stores programs and calculation parameters used for the processing of the control unit 100, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
- the display unit 170 is an example of an output unit, and is realized by a display device such as a liquid crystal display (LCD: Liquid Crystal Display) device or an organic EL (OLED: Organic Light Emitting Diode) display device.
- the display unit 170 displays a happiness forecast screen received from the happiness management server 2.
- the speaker 180 is an example of an output unit, and reproduces an audio signal.
- the speaker 180 outputs the happiness forecast received from the happiness management server 2 by voice.
- FIG. 4 is a block diagram illustrating a configuration example of the happiness management server 2 according to the present embodiment.
- the happiness management server 2 includes a control unit 200, a communication unit 210, a happiness history storage unit 220, and a happiness prediction result storage unit 230.
- the control unit 200 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the happiness management server 2 according to various programs.
- the control unit 200 is realized by an electronic circuit such as a CPU or a microprocessor.
- the control unit 200 also functions as an information analysis unit 201, a happiness analysis unit 202, a happiness prediction unit 203, a storage control unit 204, and a happiness information presentation control unit 205.
- The information analysis unit 201 has a function of acquiring and analyzing various information (including user behavior histories) from the external servers 4 (4a to 4i) and the user terminal 1 in order to perform happiness analysis. For example, when an external server 4 provides a Web API (Application Programming Interface) through which another server can query and use its data with the user's permission, the information analysis unit 201 uses the Web API to check at regular intervals (for example, once an hour) whether new data has been added to the external server 4, and if so, acquires and analyzes the added data. Specifically, for example, the information analysis unit 201 acquires user post data (an example of a user behavior history) from the communication server 4c and performs language analysis.
- the information analysis unit 201 acquires user voice data (an example of a user action history) from the voice management server 4e, and performs emotion recognition and language analysis. In addition, the information analysis unit 201 acquires a user life log (an example of a user action history) from the life log management server 4b, and performs a user action analysis.
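A minimal sketch of this polling pattern follows; the `/records` endpoint and the `timestamp` field are hypothetical, since the text does not specify the Web API's shape:

```python
import json
import urllib.request

POLL_INTERVAL_S = 3600  # "at regular intervals (for example, once an hour)"

def select_new_records(records, last_seen_ts):
    """Keep only the records added since the previous poll."""
    return [r for r in records if r["timestamp"] > last_seen_ts]

def poll_once(base_url, last_seen_ts):
    """Fetch records from a hypothetical Web API and filter out the new ones."""
    with urllib.request.urlopen(f"{base_url}/records") as resp:
        records = json.load(resp)
    return select_new_records(records, last_seen_ts)

# The analysis loop would call poll_once every POLL_INTERVAL_S seconds
# and hand each new record to emotion recognition or language analysis.
```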
- The happiness analysis unit 202 determines whether happiness has occurred for the user based on the information analyzed by the information analysis unit 201, and calculates the strength of the happiness when it has occurred. For example, the happiness analysis unit 202 calculates the happiness intensity (for example, mapped to -1 to 1) according to the presence or absence of happiness and the smile level obtained by smile recognition on a captured image. The happiness analysis unit 202 determines that negative happiness has occurred when an angry face is recognized. Further, the happiness analysis unit 202 can analyze the text output by language analysis of the user's mail contents and SNS or blog posts, and determine that happiness has occurred when the user expresses a feeling of joy or delight.
- the happiness analysis unit 202 can calculate the happiness intensity according to the definition.
- the happiness analysis unit 202 can also analyze emotions from the user's voice data, determine whether or not happiness has occurred, and calculate the happiness intensity.
- Happiness analysis is not limited to being performed automatically based on the analyzed information; the user may also be allowed to input the occurrence of happiness and the happiness intensity directly. Furthermore, the happiness analysis may be performed in real time, or may be performed automatically at a later time on stored captured images or audio data.
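The facial-expression mapping described above might be sketched as follows; the expression labels and the use of a confidence value are assumptions, since the text only fixes the sign convention (smiles positive, angry faces negative):

```python
def happiness_from_face(expression, confidence):
    """Map a recognized facial expression to a happiness intensity in [-1, 1].

    `expression` labels and the confidence scale are illustrative; the
    disclosure only says smiles yield positive happiness scaled by smile
    level, and an angry face counts as negative happiness.
    """
    c = min(max(confidence, 0.0), 1.0)
    if expression == "smile":
        return c
    if expression == "angry":
        return -c
    return 0.0  # no happiness detected for other expressions

pos = happiness_from_face("smile", 0.8)
neg = happiness_from_face("angry", 0.6)
```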
- The happiness analysis unit 202 also detects keywords related to the occurrence of happiness when happiness occurs. Keywords related to happiness are detected by machine learning, for example. More specifically, they concern the means by which the happiness occurred (conversation, watching a TV program, e-mail, telephone, etc.), the content of the happiness (the event, person, thing, or situation involved), and the location of the happiness (location information or a location attribute). The happiness analysis unit 202 then outputs the analyzed happiness occurrence date and time, location, happiness intensity, and user ID, together with the associated keywords, to the storage control unit 204 as an analysis result. Note that the happiness analysis unit 202 may also analyze items that the user possesses (or wears) at the time the happiness occurs, and include them as happiness items in the analysis result (that is, in the happiness history).
- The happiness prediction unit 203 predicts the happiness occurring at a specific date and time and place based on the relationship between at least one of the specific date and time, the place, and a keyword associated with them, and the happiness histories (for example, user ID, date and time, location, happiness intensity, keywords, and the like) stored in the happiness history storage unit 220. For example, when a keyword matching a past happiness history (for example, a concert of a given entertainer) is associated with a specific date and time and place (for example, a concert of that entertainer is scheduled to be held there), the happiness prediction unit 203 can predict that happiness will occur.
- Further, based on the schedule information of a plurality of users, the happiness prediction unit 203 can calculate the happiness intensity of each person scheduled to gather at a specific place at a specific date and time from that person's happiness history, and sum these values to calculate the happiness intensity of the place.
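A toy sketch of this aggregation, assuming hypothetical data shapes for the schedule information and the per-user happiness histories (neither shape is specified in the text):

```python
def predict_place_intensity(schedules, histories, date, place):
    """Sum, over every user scheduled at (date, place), the mean of that
    user's past happiness intensities.

    Assumed shapes: `schedules` maps user_id -> set of (date, place) pairs;
    `histories` maps user_id -> list of past intensities in [-1, 1].
    """
    total = 0.0
    for user, slots in schedules.items():
        if (date, place) in slots:
            past = histories.get(user, [])
            if past:
                total += sum(past) / len(past)
    return total

intensity = predict_place_intensity(
    {"u1": {("tomorrow", "T city")}, "u2": {("tomorrow", "T city")},
     "u3": {("tomorrow", "O city")}},
    {"u1": [0.5, 0.7], "u2": [1.0], "u3": [0.2]},
    "tomorrow", "T city",
)
```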
- the happiness prediction unit 203 can perform happiness prediction at a specific date and time by analyzing the happiness occurrence situation.
- happiness occurrence status may be analyzed on a rule basis, or a hidden relationship between happiness and status may be extracted by machine learning or the like.
- The happiness prediction unit 203 can also predict that happiness will occur in the area surrounding a specific person, group, event, or object that propagates happiness.
- the storage control unit 204 accumulates the result analyzed by the happiness analysis unit 202 in the happiness history storage unit 220 as a happiness history. Further, the storage control unit 204 accumulates the result predicted by the happiness prediction unit 203 in the happiness prediction result storage unit 230 as a happiness prediction result.
- The happiness information presentation control unit 205 controls the presentation of happiness information (that is, the happiness forecast) to the user based on the prediction result from the happiness prediction unit 203. For example, when the communication unit 210 receives a request for a happiness forecast for a specific date and time from an external device (for example, the user terminal 1), the happiness information presentation control unit 205 may generate the corresponding happiness forecast on demand. More specifically, the happiness information presentation control unit 205 generates a happiness forecast map image for the specified date and time, and transmits information for displaying the generated image from the communication unit 210 via the network 3 to the request source.
- the communication unit 210 transmits / receives data to / from an external device via wired / wireless.
- For example, the communication unit 210 is connected to the user terminal 1 via the network 3 and transmits information related to the happiness forecast.
- the happiness history storage unit 220 stores the happiness history analyzed by the happiness analysis unit 202.
- The happiness history includes, for example, a user ID, a happiness occurrence time, an end time, happiness factors (means, content, and place, which are examples of associated keywords), a happiness intensity, a happiness ID, and possessed items (also referred to as happiness items).
- The happiness ID is identification information for managing, in common among a plurality of people, happiness occurring in the same "place".
- the happiness prediction result storage unit 230 stores the happiness prediction result predicted by the happiness prediction unit 203.
- the happiness prediction result includes, for example, the place where the occurrence of happiness is predicted, the time, the content of the happiness that occurs, the happiness intensity, and the possessed item.
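One possible shape for a happiness history entry of this kind, with illustrative field names (the text does not prescribe a schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class HappinessRecord:
    """One entry of the happiness history; field names are illustrative."""
    user_id: str
    happiness_id: str          # shared by everyone in the same "place"
    start: str                 # occurrence time, ISO 8601
    end: Optional[str]         # end time, if known
    place: str                 # happiness occurrence location
    intensity: float           # normalized to [-1, 1]
    keywords: List[str] = field(default_factory=list)  # happiness factors
    items: List[str] = field(default_factory=list)     # possessed "happiness items"

sample = HappinessRecord("u1", "h42", "2016-05-20T19:00", None,
                         "T city hall", 0.9, ["artist X concert"])
```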
- The configurations of the user terminal 1 and the happiness management server 2 according to the present embodiment have been described above in detail. Next, the operation processing according to the present embodiment will be described.
- FIG. 5 is a flowchart showing happiness analysis processing according to the present embodiment.
- the happiness management server 2 acquires the voice data of each user from the external server 4 (step S103).
- the happiness management server 2 checks whether or not new data has been added to the external server 4 once every hour, for example, and acquires the new data if it has been added.
- the voice data of each user is collected from, for example, a conversation or a cheer, and is acquired from the voice management server 4e.
- Next, the information analysis unit 201 performs emotion recognition based on the acquired voice data (step S106). For example, the information analysis unit 201 extracts laughter or the tone of voice from the voice data and recognizes the emotion of the speaker (user).
- Next, the happiness analysis unit 202 calculates the happiness intensity of each user (the magnitude of emotions such as "joy" or "happiness") based on the emotion recognition result (for example, normalized to -1 to 1), and extracts the date and time at which the happiness occurred (step S109).
- The information analysis unit 201 also performs language analysis based on the acquired voice data (step S112). For example, the information analysis unit 201 converts the voice data into text and, by linguistic analysis of the text, extracts positive words (for example, "happy", "fun", "thank you") or negative words (for example, "sad", "boring", "dislike").
- Next, the happiness analysis unit 202 calculates the happiness intensity based on the language analysis result, and extracts the happiness occurrence date and time and related keywords (step S115). For example, the happiness analysis unit 202 normalizes the user's happiness intensity to, for example, -1 to 1 based on the extracted positive or negative words.
- the happiness occurrence date and time is extracted from the sound collection date and time information associated with the audio data.
- The related keywords concern happiness occurrence factors and can be extracted, for example, from the text data. For instance, when the voice data "The amusement park was fun! I want to go again!" is acquired, the happiness intensity is calculated from the positive expression "was fun", and "amusement park" is extracted as a keyword related to the occurrence of happiness.
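A toy version of this word-based scoring, with made-up English word lists standing in for real language analysis:

```python
# Illustrative word lists; a real system would use proper language analysis.
POSITIVE = {"fun", "happy", "glad", "best", "thanks"}
NEGATIVE = {"sad", "boring", "dislike"}

def score_text(text):
    """Rough happiness intensity in [-1, 1] from positive/negative word counts."""
    words = [w.strip("!.,?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

score = score_text("The amusement park was fun! I want to go again!")
```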
- the happiness management server 2 acquires the image data of each user from the external server 4 (step S118).
- the image data of each user is a captured image, for example, and is acquired from the image management server 4f.
- the captured image may include tag information associated with the subject.
- Next, the information analysis unit 201 performs image analysis based on the acquired image data (step S121). For example, the information analysis unit 201 extracts the user's own face image from the image data based on the tag information and performs facial expression analysis.
- Next, the happiness analysis unit 202 calculates the happiness intensity based on the facial expression analysis result, and extracts the happiness occurrence place, the date and time, and possessed items (step S124). For example, if the user's facial expression is a smile, the happiness analysis unit 202 calculates the happiness intensity according to the smile level. The happiness occurrence place and the date and time can be extracted from the shooting location information and the shooting date and time included in the metadata of the image data. The possessed items are items that the user possesses (or wears), which may be among the happiness occurrence factors; they can be extracted from object information in the image data or from linked tag information.
- the happiness management server 2 acquires post data to the community site of each user from the external server 4 (step S127).
- The post data of each user consists, for example, of daily short posts, diary entries, comments, or captured images, and is acquired from the communication server 4c.
- Next, the information analysis unit 201 performs language analysis based on the acquired post data (step S130). For example, the information analysis unit 201 performs linguistic analysis on the text included in the post data, and extracts positive words (for example, "happy", "fun", "thank you") or negative words (for example, "sad", "boring", "dislike").
- Next, the happiness analysis unit 202 calculates the happiness intensity based on the language analysis result, and extracts the happiness occurrence place, the date and time, and related keywords (step S133). For example, the happiness analysis unit 202 normalizes the user's happiness intensity to, for example, -1 to 1 based on the extracted positive or negative words. The happiness occurrence place and date and time can be extracted from the position information (the current position detected at the time of posting) and the posting date and time associated with the post data, or from the contents of the post data. The related keywords concern happiness occurrence factors and can be extracted from the contents of the post data. For example, if the post data "The concert of artist X was really the best!" is acquired, the happiness intensity is calculated from the positive word "best", and "artist X concert" is extracted as a keyword related to the occurrence of happiness.
- the happiness analysis unit 202 acquires information on the place where the happiness occurred (or “place” where the happiness occurred) from each data (step S136).
- the location where the happiness has occurred may be acquired from the location information of the user at the happiness occurrence time, for example, based on the user's life log acquired from the life log management server 4b.
- Information on the happiness occurrence location can be extracted from, for example, user schedule information acquired from the schedule management server 4a, performance information acquired from the performance information management server 4h, or program information acquired from the program information providing server 4d. Further, in the captured image, when a subject (person, thing) other than the user himself / herself is tagged, it can be extracted as peripheral information of the person / thing related to the happiness occurrence location.
- the storage control unit 204 performs control to store the information related to the occurrence of happiness analyzed by the happiness analysis unit 202 (user ID, happiness occurrence location, occurrence position information, occurrence date and time, happiness intensity, related keywords, items, and the like) in the happiness history storage unit 220 as a happiness history (step S139).
- FIG. 6 is a flowchart showing the happiness forecast processing for the unspecified majority according to the present embodiment. As shown in FIG. 6, first, the happiness prediction unit 203 of the happiness management server 2 acquires schedule information of all users from the schedule management server 4a (step S153).
- the happiness prediction unit 203 refers to the happiness history of all users stored in the happiness history storage unit 220 (step S156).
- the happiness prediction unit 203 predicts the happiness intensity of all users at a specific time and place (step S159). For example, when outputting the happiness forecast for T city tomorrow, the happiness prediction unit 203 first identifies, based on the schedule information of all users, the users who are scheduled to be in T city tomorrow. Next, the happiness prediction unit 203 calculates the happiness intensity of the identified users for the whole of tomorrow, or for each hour, with reference to their happiness histories. For example, based on the identified users' happiness histories, the happiness prediction unit 203 extracts the happiness intensity of each user's happiness history related to keywords associated with T city tomorrow (for example, event information on events scheduled to be held in T city).
- the happiness prediction unit 203 adds up the happiness intensities of the identified users for the whole of tomorrow, or for each hour, and outputs the sum as the happiness intensity in T city for the whole day or for each hour of tomorrow.
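The identification-and-summation of these steps can be sketched as below; the record layout (one predicted intensity per identified user per hour) is an assumption for illustration.

```python
from collections import defaultdict

# Sketch: sum the predicted happiness intensities of the users identified
# as being in T city tomorrow, per hour. The record layout is assumed.
predictions = [
    {"user": "A", "hour": 19, "intensity": 0.8},
    {"user": "B", "hour": 19, "intensity": 0.5},
    {"user": "A", "hour": 20, "intensity": 0.3},
]

hourly_total = defaultdict(float)
for p in predictions:
    hourly_total[p["hour"]] += p["intensity"]

print(hourly_total[19])  # 1.3  (total happiness intensity at 19:00)
```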
- the happiness prediction unit 203 extracts items related to the occurrence of happiness from the happiness history (step S162).
- the storage control unit 204 controls the prediction result by the happiness prediction unit 203 to be stored in the happiness prediction result storage unit 230 (step S165).
- the happiness information presentation control unit 205 generates a forecast map image based on the happiness prediction result (step S168), and performs control to transmit the generated forecast map image to the user terminal 1 via the communication unit 210 so that it is presented to the user (step S171). Thereby, the user can browse, for example, tomorrow's happiness forecast for T city, and grasp tomorrow's happiness occurrence situation in advance.
- since the forecast map image is generated based on the happiness prediction of all users, it can be called a happiness forecast for the unspecified majority (general public), but the present disclosure is not limited to this.
- Users in a specific community are identified, for example, by extracting similar users whose preferences resemble each other. This will be described in detail below with reference to FIGS. 7 to 19.
- FIG. 7 is a diagram for explaining the happiness forecast for a specific community according to the present embodiment.
- Here, a happiness prediction for a specific community, namely fans of artist X (for example, fan club members), is performed using the happiness histories in which the happiness felt by those fans is accumulated.
- Specifically, the happiness prediction unit 203 acquires, from the happiness history storage unit 220, the happiness histories of the users who belong to the fan club, based on, for example, fan club member data acquired from the fan club management server 4g, and uses them for the happiness prediction.
- Next, the happiness analysis and happiness histories of users included in the specific community of artist X fans will be described with reference to the following figures.
- Suppose user A is a fan of artist X: he regularly checks artist X's television programs and goes to artist X's concerts. Accordingly, user A's schedule includes artist X's concert dates, and it is assumed that captured images of user A include photographs in which he is shown smiling before and after a concert.
- One day, user A posts a topic related to artist X, such as "There is a live show by X! Very happy!", to the communication server 4c on the SNS. The post is acquired by the happiness management server 2, which periodically checks for additional data, and language analysis is performed (see steps S127 to S130 in FIG. 5).
- From the language analysis, the happiness analysis unit 202 learns that user A has reacted positively to a topic related to artist X.
- In this case, the happiness analysis unit 202 calculates the happiness intensity using the happiness intensity definition table shown in FIG. 8.
- In the happiness intensity definition table shown in FIG. 8, happiness intensities corresponding to language expressions are defined in advance. For example: "I want to go but cannot go: happiness intensity -0.2", "slightly happy: happiness intensity 0.2", "I am happy!: happiness intensity 0.5", and "I am very happy! I am happy! Very happy!: happiness intensity 0.8".
- the happiness analysis unit 202 calculates a happiness intensity of 0.8 from the expression "very happy" extracted by the language analysis of user A's post data.
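The lookup against the definition table of FIG. 8 amounts to a simple mapping from language expressions to predefined intensities; a sketch with the entries transcribed from the example above (the neutral fallback for unknown expressions is an assumption):

```python
# Sketch of the FIG. 8 happiness intensity definition table.
HAPPINESS_INTENSITY_TABLE = {
    "I want to go but cannot go": -0.2,
    "slightly happy": 0.2,
    "I am happy!": 0.5,
    "very happy": 0.8,
}

def intensity_for(expression):
    # Unknown expressions fall back to 0.0 (neutral); this default is an
    # assumption, not part of the disclosure.
    return HAPPINESS_INTENSITY_TABLE.get(expression, 0.0)

print(intensity_for("very happy"))  # 0.8
```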
- the happiness analysis unit 202 further acquires the date and time of the post data and the current position of user A at the time of posting, and these are stored in the happiness history storage unit 220 by the storage control unit 204 as a happiness history.
- FIG. 9 shows an example of a happiness history according to the present embodiment. The happiness history based on the analysis result of the post data is shown in the first row of FIG. 9.
- the happiness analysis unit 202 analyzes the facial expression of user A captured in an image, and calculates a happiness intensity (for example, 0.7) from the smile level of user A's face when shown together with artist X, or when artist X appears on screen.
- the happiness analysis unit 202 can acquire information on the program that the user is viewing from the program information providing server 4d based on the imaging time, and can extract performer information and the like.
- the happiness analysis unit 202 further acquires the date and time of the captured image and the current position of user A at the time of imaging, and these are stored in the happiness history storage unit 220 by the storage control unit 204 as a happiness history.
- the happiness history of user A when viewing the television program is shown in the second row of FIG. 9.
- Further, the happiness analysis unit 202 may assign a happiness ID common to the happiness intensity of artist X at the time of recording the television program and the happiness intensity of user A when viewing the same television program, and accumulate them as happiness histories.
- the happiness intensity of artist X at the time of recording can be calculated, for example, by smile recognition on captured images of artist X taken during the recording of the program.
- the happiness history of artist X is shown in the third row of FIG. 9.
- Suppose user A, user B, and user C, who live in T city, learn that the next performance of artist X will be held in O city, far from T city.
- each user makes the following remarks on the SNS, which are acquired by the happiness management server 2 through its periodic check for additional data, and language analysis is performed.
- User A: "I'm so happy to see X's O city show!"
- User B: "X's O city performance makes me happy! I will definitely go!"
- User C: "The O city performance... I want to go, but I can't because of work."
- the information analysis unit 201 acquires schedule information of the artist X fans, program information, and performance information from the schedule management server 4a, the program information providing server 4d, and the performance information management server 4h, respectively (step S203).
- that is, information related to the happiness of artist X fans, namely event information related to artist X, is acquired.
- the information analysis unit 201 analyzes the acquired information. Specifically, for example, information on participation in events related to artist X is extracted from the schedule information, artist X's performance information is extracted from the performance information, and artist X's program appearance information is extracted from the program information.
- the happiness prediction unit 203 refers to the happiness history of each user (each artist X fan) (step S206).
- the happiness prediction unit 203 checks whether the happiness history contains a history related to artist X (step S209), and if so (step S209/Yes), calculates (that is, predicts) the happiness intensity of the user at a specific time and place based on the related history (step S212). For example, if the happiness intensity of user A when viewing a TV program in which artist X previously appeared was "0.8", the happiness prediction unit 203 predicts that the happiness intensity when viewing the next TV program in which artist X appears will also be "0.8".
- Further, the happiness prediction unit 203 obtains artist X's performance schedule data from the performance information, learns that artist X's O city performance is on May 20, 2015, and predicts that happiness is likely to occur at that date and place.
- the happiness prediction unit 203 checks the May 20 schedule information of each artist X fan, such as user A, user B, and user C, and by summing up the predicted happiness of each user on May 20, can calculate the happiness intensity of artist X fans in O city on May 20, when the performance is held.
- each user's plans regarding artist X's May 20 performance are extracted from the schedule information and the posts to the SNS, so that each user's participation status in the performance can be grasped.
- Further, the happiness prediction unit 203 can calculate a predicted happiness intensity value by referring to a happiness coefficient table as shown in FIG. 11, and applying a happiness coefficient corresponding to the user's situation to the happiness intensity recorded when the user viewed a television program in which artist X previously appeared.
- the happiness coefficient shown in FIG. 11 is a coefficient that differs depending on the user's situation (for example, the type of event). For example, "-0.1" is defined for the case where the user cannot attend (cannot participate in the event or watch the program), a coefficient is defined for viewing the television program, "1" for public viewing (a live broadcast shown at a concert cinema), "2" for participation in a concert, "3" for receiving an autograph directly, and so on.
- Since the happiness intensity is normalized to the range -1 to 1, when a calculated predicted value falls outside this range, limiters of "-1" and "1" may be applied as the negative and positive maximums.
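The coefficient-based prediction with the limiter can be sketched as follows, using the coefficients listed for FIG. 11; the situation labels are illustrative names.

```python
# Sketch: multiply a past happiness intensity by the situation-dependent
# happiness coefficient (FIG. 11), then clamp to the normalized range
# -1..1 with the positive and negative limiters described above.
HAPPINESS_COEFFICIENTS = {
    "cannot_attend": -0.1,   # cannot participate or watch the program
    "public_viewing": 1.0,   # live broadcast at a concert cinema
    "concert": 2.0,          # participates in the concert
    "autograph": 3.0,        # receives an autograph directly
}

def predict_intensity(past_intensity, situation):
    value = past_intensity * HAPPINESS_COEFFICIENTS[situation]
    return max(-1.0, min(1.0, value))  # limiter at -1 and 1

# User A's intensity for a past TV appearance was 0.8; attending the
# concert (coefficient 2) would give 1.6, so the limiter clamps it.
print(predict_intensity(0.8, "concert"))  # 1.0
```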
- the happiness prediction unit 203 extracts items related to the predicted happiness occurrence from the happiness history (step S215).
- Items related to the occurrence of happiness are things that may contribute to a happiness occurring. For example, in the case of the concert described above, it is determined, based on the data posted to the SNS, captured images of the audience after the concert, and the like, that there is a song during which the crowd gets excited waving glow sticks at artist X's concerts, and "item: glow stick" is registered.
- An example of such a happiness history is shown in FIG. 12A.
- FIG. 12A shows an example of a happiness history of an artist X's performance in the T city in the past.
- From this history, the happiness prediction unit 203 can judge that happiness is likely to occur at artist X's concerts together with glow sticks. In this case, the happiness prediction unit 203 can extract "glow stick" as an item related to the happiness of artist X fans generated by the concert.
- FIG. 12B is a diagram illustrating an example of a happiness prediction result.
- FIG. 12B shows the predicted happiness intensity value of each user (which may include artist X) from 18:30 to 21:00 on May 20, 2015 (the performance time of artist X). Note that the same happiness ID is assigned to happiness occurring at the same "place", namely the concert.
- Next, the happiness prediction presentation process, that is, the presentation of the happiness prediction result to the user, will be described with reference to FIGS. 13 to 19.
- As a way of presenting the happiness prediction result, for example, it is conceivable to focus on places and present an image representing the happiness intensity of a specific population at each place as a heat map.
- FIG. 13 is a flowchart showing a happiness prediction result presentation process according to the present embodiment.
- the happiness information presentation control unit 205 of the happiness management server 2 selects a happiness forecast population (step S223).
- As the happiness forecast population, for example, artist X fans are selected. Specifically, the happiness information presentation control unit 205 refers to the fan club member data of artist X held by the fan club management server 4g, or to the preferences of each user (for example, preferences estimated from the happiness history, or user information registered in advance), and selects artist X fans as the population of the happiness forecast.
- the happiness information presentation control unit 205 acquires the happiness prediction result of the user corresponding to the population from the happiness prediction result storage unit 230 (step S226).
- the happiness information presentation control unit 205 generates a forecast map image based on the acquired happiness prediction result (step S229).
- a specific example of the forecast map image according to this embodiment will be described later.
- the happiness information presentation control unit 205 performs control so that the generated forecast map is presented to the user on the user terminal 1 (step S232). Thereby, the user can grasp in advance, for example, the happiness intensity of artist X fans at each place during artist X's concert on May 20. Similarly, the happiness information presentation control unit 205 can also generate a heat map image indicating past happiness intensities, using the happiness histories stored in the happiness history storage unit 220.
- FIG. 14 is a diagram showing a screen display example of a happiness forecast according to the present embodiment.
- the happiness forecast for artist X fans at 16:00 on May 20, 2015 is displayed on the screen 30 as a heat map.
- the heat map 301 shown in FIG. 14 represents, for example, with contour lines, the number of fans with a happiness intensity higher than a predetermined threshold (for example, 0.5) gathering at each place (see legend 302).
- FIG. 15 is a diagram showing a screen transition example of the happiness forecast shown in FIG. 14.
- the time slider 303 has been operated by the user, and the happiness forecast for 19:00 (during the performance), after the start of the concert, is displayed.
- Thereby, the user can easily and visually grasp the gathering of artist X fans with high happiness at and around the venue during the concert.
- Specifically, a circle of a size corresponding to a predetermined radius, centered on each venue, is shown in a color corresponding to the number of people with high happiness intensity within that radius of the venue.
- the screen 31 also displays a happiness item image 305.
- Thereby, the user can grasp in advance that happiness will increase (rise) by going to the venue with the glow stick indicated by the happiness item image 305.
- FIG. 16 is a diagram showing a screen transition example of the happiness forecast shown in FIG. 15. On the screen 32 of FIG. 16, the time slider 313 has been operated by the user, and the happiness forecast for 22:00, after the concert, is displayed. As a result, the user can easily and visually grasp the situation in which the crowd of artist X fans with high happiness at and around the venue disperses after the concert ends.
- FIG. 17 is a diagram showing a screen display example of the happiness forecast for artist X fans on the day after the concert.
- On the screen 33 of FIG. 17, happiness of artist X fans is predicted to occur in S city, where there should be no artist X event. This is because some artist X fans are also fans of artist Y, and artist Y performs in S city the day after artist X's concert, so happiness due to artist Y's performance is predicted.
- a happiness item image 335 showing a towel as a happiness item is also displayed on the screen 33.
- Thereby, artist X fans develop happiness through an interest in artist Y, and the artist Y side can acquire new fans.
- Also, artist X fans can know in advance that there is a performance by artist Y in which many fellow fans are interested, as well as the associated happiness items.
- Further, the user can instruct the happiness management server 2 from the user terminal 1 to change the happiness forecast population; for example, it is also possible to display the happiness forecast for the unspecified majority, the happiness forecast for artist X fans, and the happiness forecasts for artist X fans and artist Y fans side by side for comparison.
- The display method of the happiness forecast according to the present embodiment is not limited to the heat map image display according to the number of people with high happiness intensity as described above; a heat map image display corresponding to the total happiness intensity at each place (the sum of the happiness intensities of the members of the specific community scheduled to visit that place) may also be used.
- Alternatively, the happiness forecast display method may map weather icons corresponding to the happiness intensity, as in a weather forecast.
- FIG. 18 is a diagram showing a screen display example when the happiness forecast according to the present embodiment is expressed like a national weather forecast.
- On the screen, a map image is displayed in which weather icons (for example, a sunny icon, a cloudy icon, and a rain icon) corresponding to the happiness intensity are mapped to each region. For example, a happiness intensity of 1 to 0.5 is represented by a sunny icon, 0.5 to -0.5 by a cloudy icon, and -0.5 to -1 by a rain icon (not shown).
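The icon mapping described above is a threshold rule on the happiness intensity; a sketch (the handling of the exact boundary values at 0.5 and -0.5 is an assumption):

```python
# Sketch: map a normalized happiness intensity to a weather icon.
# 1..0.5 -> sunny, 0.5..-0.5 -> cloudy, -0.5..-1 -> rain.
def weather_icon(intensity):
    if intensity >= 0.5:
        return "sunny"
    if intensity > -0.5:
        return "cloudy"
    return "rain"

print(weather_icon(0.8))   # sunny
print(weather_icon(0.0))   # cloudy
print(weather_icon(-0.7))  # rain
```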
- FIG. 19 is a diagram showing a screen display example when the happiness forecast according to the present embodiment is expressed like a weekly weather forecast. A weather icon (for example, a sunny icon, a cloudy icon, or a rain icon) corresponding to the happiness intensity is displayed for each day, and the happiness forecast is presented on a one-week time axis.
- the present embodiment is not limited to this; the happiness forecast for a single day may be displayed with a weather icon for each hour, or a comprehensive happiness forecast for the whole day may be displayed with a single weather icon.
- the display method of the happiness forecast is not limited to the charts described above; the transition of the happiness forecast may simply be explained in sentences. For example, text such as "There is a concert of artist X in T city today, and the happiness intensity within a radius of N km will increase from two hours beforehand." is displayed on the user terminal 1 and presented to the user.
- Any of the screen display examples described above can also be applied to the display of the happiness forecast for the unspecified majority.
- FIG. 20 shows an example of the happiness history of the user A.
- By language analysis of user A's post data to the SNS, a happiness intensity of 0.7 is calculated from an expression such as "exciting", and this is registered in the first row.
- Thereafter, user A writes to the SNS, for example, "I saw the rainbow again. The tension goes up, and I kept pressing the camera shutter.", and the happiness management server 2 acquires the post data and performs language analysis.
- From this, the happiness analysis unit 202 calculates, for example, a happiness intensity of 0.8 based on the expression "the tension goes up", and this is registered in the second row of the happiness history in FIG. 20. At this time, the camera is extracted and registered as a happiness item.
- FIG. 21 is a flowchart showing happiness forecast processing for individual users according to the present embodiment.
- the happiness management server 2 acquires user schedule information and rainbow prediction information from the schedule management server 4a and the rainbow information management server 4i, respectively (step S303).
- the rainbow information management server 4i realizes a service that predicts when and where a rainbow will occur based on weather information and the like, accumulates the prediction information, and provides it to the user as needed.
- the happiness prediction unit 203 predicts the happiness intensity of the user in the case where a rainbow occurs, based on the rainbow occurrence prediction information for the place and time user A is scheduled to visit (step S306). For example, the happiness prediction unit 203 learns from the user's schedule information that user A is scheduled to go to Y city around 15:00 today, and, if it infers by referring to user A's happiness history that "rainbow" is related to user A's happiness, it refers to the rainbow occurrence information for Y city around 15:00 today. Then, when the possibility of a rainbow occurring is high, the happiness prediction unit 203 predicts that happiness is likely to occur when user A sees the rainbow, and calculates a predicted happiness intensity value.
- the predicted happiness intensity value may be calculated using a happiness coefficient table as shown in FIG. 11, as in the above embodiment, but this embodiment is not limited to this.
- For example, the average of the happiness intensities at times when a rainbow was seen in the past, estimated from the happiness history, may be calculated as the predicted happiness intensity value.
- the predicted happiness intensity value is not limited to the average value of the past happiness intensity, and may be, for example, a median value or a happiness intensity value of the latest happiness history.
- Further, the happiness prediction unit 203 may predict the happiness intensity by multiplying the average happiness intensity based on the past happiness history by a weight corresponding to the rainbow occurrence probability. The happiness prediction unit 203 may also use a weight of "1" when the rainbow occurrence probability exceeds a predetermined threshold, for example 50%.
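The probability weighting described here can be sketched as follows; averaging the past intensities and using weight "1" above the threshold are taken from the text above, while the exact function shape is an illustrative assumption.

```python
# Sketch: weight the average past happiness intensity by the rainbow
# occurrence probability; above the threshold (e.g. 50%) use weight 1.
def predicted_rainbow_happiness(past_intensities, rainbow_probability,
                                threshold=0.5):
    average = sum(past_intensities) / len(past_intensities)
    weight = 1.0 if rainbow_probability > threshold else rainbow_probability
    return average * weight

# Past intensities 0.7 and 0.8 when user A saw a rainbow; a 60% chance
# today exceeds the threshold, so the average is used unweighted.
print(predicted_rainbow_happiness([0.7, 0.8], 0.6))  # 0.75
```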
- happiness prediction may be performed periodically (for example, every hour) in each of the prediction processes described above including the present embodiment, and the prediction result may be continuously updated.
- the prediction may be performed after obtaining the necessary data from the external server 4 when there is a forecast request.
- the happiness prediction unit 203 refers to the happiness history of the user A and extracts a happiness item (here, “camera”) related to the occurrence of happiness (step S309).
- the storage control unit 204 accumulates the happiness prediction results in the happiness prediction result storage unit 230 (step S312).
- the happiness information presentation control unit 205 generates a forecast map image based on the happiness prediction result (step S315).
- a specific example of the forecast map image according to this embodiment will be described later.
- the happiness information presentation control unit 205 controls the user terminal 1 to present the generated forecast map to the user (step S318).
- the forecast map also includes happiness items; by viewing today's happiness forecast for himself, the user can know in advance that happiness will occur around 15:00 if he has a camera with him.
- FIG. 22 is a diagram showing a screen display example when the happiness forecast for individual users according to the present embodiment is expressed as a weather forecast for each hour of the day.
- On the screen 40, a happiness forecast for user A, for example a happiness forecast image 401 for every two hours, is displayed using weather icons.
- the weather icons are represented by, for example, happiness intensity 1 to 0.5 as clear icons, 0.5 to -0.5 as cloudy icons, and -0.5 to -1 as rain icons (not shown).
- a happiness item image 402 is also displayed on the screen 40.
- Thereby, the user can spend the day carrying the happiness item (a camera, in the example shown in FIG. 22) so as to obtain happiness (for example, encountering a rainbow in Y city, which he is scheduled to visit at 15:00, and taking pictures with the camera will produce stronger happiness).
- FIG. 23 is a diagram showing a screen display example in the case where a comprehensive daily happiness forecast for individual users according to the present embodiment is expressed like a weather forecast.
- a happiness weather forecast image 411 and a happiness item image 412 of the user A are displayed on the screen 41.
- the happiness weather forecast image 411 simply displays a corresponding weather icon as a comprehensive happiness forecast for the day.
- FIG. 24 is a diagram showing a screen display example for displaying a happiness forecast for individual users according to the present embodiment like a biorhythm.
- the screen 42 displays a graph 421 indicating a change of happiness for each time and a text 422 regarding the forecast contents.
- Thereby, the user can intuitively grasp in advance what kinds of emotions he is likely to experience throughout the day.
- both the biorhythm graph 421 and the text 422 regarding the forecast contents are displayed, but only one of them may be displayed.
- a computer-readable storage medium storing the computer program is also provided.
- Although the embodiment described above has a system configuration including the happiness management server 2 and the user terminal 1, the present disclosure is not limited to this; the user terminal 1 may have the functions of the happiness management server 2 and perform the above-described happiness analysis, accumulation of the happiness history, and happiness prediction.
- (1) An information processing system including: a storage unit that stores an emotion history in which the level of a specific emotion calculated based on a user's behavior history is associated with the date and time and the position at which the specific emotion occurred, and with a related keyword; and a control unit that predicts the level of the specific emotion at a specific date and time and place, based on the relationship between the emotion history stored in the storage unit and at least one of the specific date and time, the place, and a keyword associated with that date and time or place.
- (2) The information processing system according to (1), wherein the control unit identifies users located around a specific place at a specific date and time based on the schedules of a plurality of users, extracts, based on the emotion histories of the identified users, the emotion levels of the emotion histories related to a keyword related to the specific date and time and place, and predicts the level of the specific emotion at the specific date and time.
- (3) The information processing system according to (2), wherein the control unit predicts the level of the specific emotion of a specific user at a specific date and time and place based on the future schedule of the specific user, the level of the specific emotion of the specific user stored as the emotion history, and the keyword.
- (4) The information processing system according to (2), wherein the control unit extracts, from the emotion histories accumulated in the storage unit, similar users whose preferences are similar to those of a specific user, identifies users located around a specific place at a specific date and time based on the schedules of the similar users, extracts, based on the emotion histories of the identified users, the emotion levels of the emotion histories related to a keyword related to the specific date and time and place, and predicts the level of the specific emotion at the specific date and time.
- (5) The information processing system according to any one of (2) to (4), wherein the happiness history as the emotion history accumulated in the storage unit includes an item possessed by the user in association with the happiness intensity as the level of the specific emotion, and the control unit predicts the happiness intensity of the specific user based on the future schedule of the specific user and the happiness intensity, the keyword, and the item stored as the happiness history, and identifies an item to be recommended.
- (6) The information processing system according to any one of (1) to (5), wherein the control unit calculates the level of the specific emotion based on at least one of the user's facial expression, an emotion recognition result based on the user's voice, a result of language analysis of the user's voice, and content posted by the user to social media.
- (7) The information processing system according to any one of the above, further including a communication unit that receives, from an external device, a request for an emotion forecast map for a specific date and time and place, wherein the control unit predicts the level of the specific emotion at the specific date and time and place from the relationship between the specific date and time, a keyword associated with that date and time, and the emotion history stored in the storage unit, generates an emotion forecast map based on the level of the specific emotion, and performs control to transmit the generated emotion forecast map to the external device via the communication unit.
- (8) The information processing system according to (7), wherein the control unit identifies users located around a specific place at a specific date and time based on the schedules of a plurality of users, and predicts the level of the specific emotion based on the levels of the specific emotion included in the emotion histories of the identified users and the specific date and time and place.
- (9) The information processing system according to (8), wherein the communication unit receives, from the external device, a request for an emotion forecast map for a specific date and time and place for a specific user, and the control unit predicts the level of the specific emotion of the specific user based on the future schedule of the specific user, the level of the specific emotion of the specific user accumulated as the emotion history, and the keyword.
- (10) The information processing system according to (8), wherein the communication unit receives, from the external device, a request for an emotion forecast map for a specific date and time and place for similar users whose preferences are similar to those of a specific user, and the control unit extracts, from the emotion histories stored in the storage unit, similar users whose preferences are similar to those of the specific user, identifies users located around a specific place at a specific date and time based on the schedules of the similar users, predicts the level of the specific emotion at the specific date and time and place based on the relationship between the emotion histories of the identified users and a keyword related to the specific date and time and place, generates an emotion forecast map accordingly, and performs control to transmit the emotion forecast map directed to the similar users to the external device via the communication unit.
- (11) An information processing method including, by a processor: storing an emotion history in which the level of a specific emotion calculated based on a user's behavior history is associated with the date and time and the position at which the specific emotion occurred, and with a related keyword; and predicting the level of the specific emotion at a specific date and time and place, based on the relationship between the emotion history stored in the storage unit and at least one of the specific date and time, the place, and a keyword associated with that date and time or place.
Description
1. Overview of an information processing system according to an embodiment of the present disclosure
2. Configuration
2-1. Overall configuration
2-2. Configuration of the user terminal
2-3. Configuration of the happiness management server
3. Operation processing
3-1. Happiness analysis processing
3-2. Happiness forecast processing for the general public
3-3. Happiness forecast processing for a specific community
3-4. Happiness forecast processing for individuals
4. Conclusion
In the information processing system according to the present embodiment, the level of a specific emotion calculated on the basis of a user's behavior history is accumulated together with information such as the time and place at which the emotion occurred and related keywords, making it possible to predict the level of the specific emotion at a specific time and place. In addition, by presenting the predicted level of the specific emotion to the user as an "emotion forecast," the user can, much as with a weather forecast, grasp in advance which emotions he or she is likely to experience that day, and how the emotions at each location change over time.
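The accumulate-then-predict pipeline described above can be sketched as follows. This is a minimal illustration, not code from the publication: the record fields and the simple averaging rule are assumptions standing in for the unspecified prediction logic.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EmotionRecord:
    # Illustrative fields: an emotion level tied to when and where it
    # occurred and to related keywords, as in the emotion history above.
    user_id: str
    level: float              # e.g. happiness intensity in [0.0, 1.0]
    occurred_at: datetime
    place: str
    keywords: tuple

class EmotionHistoryStore:
    """Sketch of the storage unit plus a naive prediction rule:
    average the levels of history entries that share the queried
    place or keyword."""

    def __init__(self):
        self.records = []

    def add(self, record):
        self.records.append(record)

    def predict_level(self, place, keyword):
        # Relate the queried place/keyword to the accumulated history.
        matches = [r.level for r in self.records
                   if r.place == place or keyword in r.keywords]
        return sum(matches) / len(matches) if matches else 0.0
```

For example, two fans whose histories tie "concert" happiness to O City would make `predict_level("O City", "concert")` return their average intensity.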
<2-1. Overall configuration>
FIG. 2 is a diagram illustrating the overall configuration of the information processing system according to the present embodiment. As shown in FIG. 2, the information processing system according to the present embodiment includes a user terminal 1 and a happiness management server 2, which are connected via a network 3. When performing happiness analysis or prediction, the happiness management server 2 acquires information as necessary from external servers 4, such as a schedule management server 4a, a life-log management server 4b, a communication server 4c, a program information providing server 4d, a voice management server 4e, an image management server 4f, a fan club operation server 4g, a performance information management server 4h, and a rainbow information management server 4i. The external servers 4 (4a to 4i) specifically illustrated in FIG. 2 are merely examples, and the present embodiment is not limited thereto.
FIG. 3 is a block diagram showing a configuration example of the user terminal 1 according to the present embodiment. As shown in FIG. 3, the user terminal 1 includes a control unit 100, a communication unit 110, a position information acquisition unit 120, a camera 130, a microphone 140, an operation input unit 150, a storage unit 160, a display unit 170, and a speaker 180.
FIG. 4 is a block diagram showing a configuration example of the happiness management server 2 according to the present embodiment. As shown in FIG. 4, the happiness management server 2 includes a control unit 200, a communication unit 210, a happiness history storage unit 220, and a happiness prediction result storage unit 230.
<3-1. Happiness analysis processing>
FIG. 5 is a flowchart showing the happiness analysis processing according to the present embodiment. As shown in FIG. 5, the happiness management server 2 first acquires voice data of each user from the external server 4 (step S103). As described above, the happiness management server 2 checks, for example once an hour, whether new data has been added to the external server 4, and acquires any newly added data. The voice data of each user is, for example, recorded conversation or cheering, and is acquired from the voice management server 4e.
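The hourly polling step can be sketched as below. This is illustrative only: `fetch_added_since` and `process` are hypothetical stand-ins for the external-server API and the analysis step, which the publication does not specify at this level.

```python
import time

def poll_for_new_voice_data(fetch_added_since, process,
                            interval_sec=3600, cycles=1, clock=time.time):
    """Check an external source for data added since the last check and
    hand each new item to the analysis step; sleep between cycles."""
    last_checked = clock()
    for i in range(cycles):
        if i:
            time.sleep(interval_sec)  # e.g. once an hour, as described above
        new_items = fetch_added_since(last_checked)
        last_checked = clock()
        for item in new_items:
            process(item)  # e.g. run happiness analysis on the voice clip
```

A single cycle with an injected fake source is enough to exercise the acquire-then-process flow without waiting for the interval.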
FIG. 6 is a flowchart showing the happiness forecast processing for the general public according to the present embodiment. As shown in FIG. 6, the happiness prediction unit 203 of the happiness management server 2 first acquires schedule information of all users from the schedule management server 4a (step S153).
In the embodiment described above, the forecast map image is generated based on the happiness predictions of all users, so it can be called a happiness forecast for the general public (an unspecified large number of users); however, the present disclosure is not limited thereto. For example, by limiting the target of happiness prediction, a happiness forecast for a specific community can be realized. The users of a specific community are identified, for example, by extracting similar users whose preferences resemble one another. This is described in detail below with reference to FIGS. 7 to 19.
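The extraction of similar users by preference can be sketched as follows. The publication does not fix a similarity metric; Jaccard overlap of preference keyword sets is an assumed choice here, and the profiles are illustrative.

```python
def jaccard(a, b):
    """Overlap of two preference keyword sets, in [0.0, 1.0]."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def extract_similar_users(profiles, target, threshold=0.5):
    """Return users whose preference sets are similar to the target's,
    e.g. to form the 'fans of artist X' community described above."""
    base = profiles[target]
    return sorted(user for user, prefs in profiles.items()
                  if user != target and jaccard(base, prefs) >= threshold)
```

With profiles `{"A": {"artist X", "live"}, "B": {"artist X", "live"}, "C": {"cooking"}}`, `extract_similar_users(profiles, "A")` yields `["B"]`.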
When performing a happiness forecast for a specific community, the happiness prediction unit 203 is assumed to acquire, from the happiness history storage unit 220, the happiness histories of users who belong to a fan club, based on fan club membership data acquired from, for example, the fan club operation server 4g, and to use them for happiness prediction. An example of happiness analysis and happiness histories of users belonging to the specific community of fans of artist X is described below with reference to FIGS. 8 and 9.
Since artist X is popular, there are also users B and C for whom, like user A, happiness highly correlated with artist X accumulates; for example, happiness that arose from watching the same television program in which artist X appeared is given the same happiness ID. The happiness histories of users B and C are shown in the fourth and fifth rows of FIG. 9.
User A: "X's concert in O City, I'm so happy! I'm definitely going!"
User B: "X's concert in O City, how exciting! I'll definitely go!"
User C: "X's concert in O City... I'd love to go, but I can't because of work."
Next, the happiness prediction processing for a specific community (here, fans of artist X) is described with reference to FIG. 10.
Next, the presentation of happiness prediction results to the user is described with reference to FIGS. 13 to 19. For example, focusing on location, the happiness prediction results may be presented to the user as an image in which the happiness intensity of a specific population at each location is expressed as a heat map.
Next, a display example of the happiness prediction according to the present embodiment is described in detail. FIG. 14 is a diagram showing a screen display example of the happiness forecast according to the present embodiment. In the illustrated example, a heat map of the happiness forecast for fans of artist X at 16:00 on May 20, 2015 is displayed on a screen 30. The heat map 301 shown in FIG. 14 expresses, like contour lines, the number of fans whose happiness intensity exceeds a predetermined threshold (for example, 0.5) gathering at each location (see legend 302). For example, if a concert by artist X is to be held in O City from 18:30 on May 20 and public viewings of the concert (live broadcasts at movie theaters, stadiums, and the like) are scheduled nationwide, fans of artist X can be expected to be traveling to the venues from several hours before the concert starts, with happiness arising from anticipation of the concert. The users' planned movements can be acquired from their schedule information. Therefore, as shown in FIG. 14, at 16:00 before the concert begins, a large number of users with high happiness values are predicted to be moving toward and gathering at each venue, and by looking at the screen 30, a user can grasp the excitement around the venues before the concert. The user can also operate a time slider 303 included in the screen 30 to change the time period for which the prediction is displayed.
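The contour-style display can be approximated by counting, per map cell, the users whose predicted intensity clears the threshold (0.5 in the example above). This is a sketch; the grid size and the (lat, lon) coordinate scheme are illustrative choices, not taken from the publication.

```python
from collections import Counter

def heatmap_counts(predictions, threshold=0.5, cell_deg=1.0):
    """predictions: iterable of (lat, lon, intensity) triples.
    Returns a Counter mapping grid cells to the number of users whose
    intensity exceeds the threshold, ready to be drawn as contours."""
    counts = Counter()
    for lat, lon, intensity in predictions:
        if intensity > threshold:
            cell = (int(lat // cell_deg), int(lon // cell_deg))
            counts[cell] += 1
    return counts
```

A renderer would then map each cell count to a contour level or color band for the heat map image.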
Furthermore, it is also possible to further limit the target of happiness prediction (that is, the population of the happiness forecast) and provide a happiness forecast for an individual user. This is described in detail below with reference to FIGS. 20 to 24.
Next, a display example of the personal happiness prediction according to the present embodiment is described in detail. Like the display method for the specific-community happiness prediction described with reference to FIGS. 14 to 19, the personal happiness prediction can be displayed as a heat map or in the manner of a weather forecast. This is described in detail below with reference to FIGS. 22 to 24.
As described above, the information processing system according to the embodiment of the present disclosure makes it possible to predict the level of a specific emotion at a specific time and place.
(1)
An information processing system including:
a storage unit configured to accumulate an emotion history in which the level of a specific emotion calculated based on a user's behavior history, the date and time and position at which the specific emotion occurred, and a related keyword are associated with one another; and
a control unit configured to predict the level of the specific emotion at a specific date and time and place from the relationship between at least one of the specific date and time, the place, and a keyword associated with the date and time or the place, and the emotion history accumulated in the storage unit.
(2)
The information processing system according to (1), wherein the control unit identifies users located around a specific place at a specific date and time based on schedules of a plurality of users, extracts, based on the emotion histories of the identified users, the emotion levels of emotion histories related to keywords associated with the specific date and time and place, and predicts the level of the specific emotion at the specific date and time and place.
(3)
The information processing system according to (2), wherein the control unit predicts the level of the specific emotion of a specific user at a specific date and time and place based on the specific user's future schedule, the level of the specific emotion of the specific user stored as the emotion history, and the keyword.
(4)
The information processing system according to (2), wherein the control unit extracts, from the emotion histories accumulated in the storage unit, similar users whose preferences are similar to those of a specific user, identifies users located around a specific place at a specific date and time based on the schedules of the similar users, extracts, based on the emotion histories of the identified users, the emotion levels of emotion histories related to keywords associated with the specific date and time and place, and predicts the level of the specific emotion at the specific date and time and place.
(5)
The information processing system according to any one of (2) to (4), wherein the happiness history accumulated in the storage unit as the emotion history also includes items held by the user in association with the happiness intensity serving as the level of the specific emotion, and the control unit predicts the happiness intensity of a specific user based on the specific user's future schedule, the happiness intensity of the specific user stored as the happiness history, the keyword, and the items, and also identifies items to recommend to the user.
(6)
The information processing system according to any one of (1) to (5), wherein the control unit calculates the level of the specific emotion based on at least one of the user's facial expression, an emotion recognition result based on the user's voice, a result of converting the user's voice into text and linguistically analyzing it, and content posted to social media by the user.
(7)
The information processing system according to any one of (1) to (6), further including a communication unit configured to receive, from an external device, a request for an emotion forecast map for a specific date and time and place,
wherein the control unit predicts the level of the specific emotion at the specific date and time and place from the relationship between the specific date and time, the place, and a keyword associated with the date and time and place, and the emotion history accumulated in the storage unit, generates an emotion forecast map based on the predicted level, and performs control to transmit the generated emotion forecast map to the external device via the communication unit.
(8)
The information processing system according to (7), wherein the control unit identifies users located around a place at a specific date and time based on schedules of a plurality of users, predicts the level of the specific emotion of a specific user at the specific date and time and place from the relationship between the levels of the specific emotion included in the emotion histories of the identified users and keywords associated with the specific date and time and place, generates an emotion forecast map according to the predicted level, and performs control to transmit the emotion forecast map to the external device via the communication unit.
(9)
The information processing system according to (8), wherein the communication unit receives, from the external device, a request for an emotion forecast map for a specific date and time and place of a specific user,
and the control unit predicts the level of the specific emotion of the specific user based on the specific user's future schedule, the level of the specific emotion of the specific user accumulated as the emotion history, and the keyword, generates an emotion forecast map according to the predicted level, and performs control to transmit the emotion forecast map directed to the specific user to the external device via the communication unit.
(10)
The information processing system according to (8), wherein the communication unit receives, from the external device, a request for an emotion forecast map for a specific date and time and place of similar users whose preferences are similar to those of a specific user,
and the control unit extracts, from the emotion histories accumulated in the storage unit, the similar users whose preferences are similar to those of the specific user, identifies users located around a specific place at a specific date and time based on the schedules of the similar users, predicts the level of the specific emotion at the specific date and time and place from the relationship between the emotion histories of the identified users and keywords associated with the specific date and time and place, generates an emotion forecast map according to the predicted level, and performs control to transmit the emotion forecast map directed to the similar users to the external device via the communication unit.
(11)
An information processing method including, by a processor:
accumulating, in a storage unit, an emotion history in which the level of a specific emotion calculated based on a user's behavior history, the date and time and position at which the specific emotion occurred, and a related keyword are associated with one another; and
predicting the level of the specific emotion at a specific date and time and place from the relationship between at least one of the specific date and time, the place, and a keyword associated with the date and time or the place, and the emotion history accumulated in the storage unit.
100 Control unit
110 Communication unit
120 Position information acquisition unit
130 Camera
140 Microphone
150 Operation input unit
160 Storage unit
170 Display unit
180 Speaker
2 Happiness management server
200 Control unit
201 Information analysis unit
202 Happiness analysis unit
203 Happiness prediction unit
204 Storage control unit
205 Happiness information presentation control unit
210 Communication unit
220 Happiness history storage unit
230 Happiness prediction result storage unit
Claims (11)
- An information processing system comprising: a storage unit configured to accumulate an emotion history in which the level of a specific emotion calculated based on a user's behavior history, the date and time and position at which the specific emotion occurred, and a related keyword are associated with one another; and a control unit configured to predict the level of the specific emotion at a specific date and time and place from the relationship between at least one of the specific date and time, the place, and a keyword associated with the date and time or the place, and the emotion history accumulated in the storage unit.
- The information processing system according to claim 1, wherein the control unit identifies users located around a specific place at a specific date and time based on schedules of a plurality of users, extracts, based on the emotion histories of the identified users, the emotion levels of emotion histories related to keywords associated with the specific date and time and place, and predicts the level of the specific emotion at the specific date and time and place.
- The information processing system according to claim 2, wherein the control unit predicts the level of the specific emotion of a specific user at a specific date and time and place based on the specific user's future schedule, the level of the specific emotion of the specific user stored as the emotion history, and the keyword.
- The information processing system according to claim 2, wherein the control unit extracts, from the emotion histories accumulated in the storage unit, similar users whose preferences are similar to those of a specific user, identifies users located around a specific place at a specific date and time based on the schedules of the similar users, extracts, based on the emotion histories of the identified users, the emotion levels of emotion histories related to keywords associated with the specific date and time and place, and predicts the level of the specific emotion at the specific date and time and place.
- The information processing system according to claim 2, wherein the happiness history accumulated in the storage unit as the emotion history also includes items held by the user in association with the happiness intensity serving as the level of the specific emotion, and the control unit predicts the happiness intensity of a specific user based on the specific user's future schedule, the happiness intensity of the specific user stored as the happiness history, the keyword, and the items, and also identifies items to recommend to the user.
- The information processing system according to claim 1, wherein the control unit calculates the level of the specific emotion based on at least one of the user's facial expression, an emotion recognition result based on the user's voice, a result of converting the user's voice into text and linguistically analyzing it, and content posted to social media by the user.
- The information processing system according to claim 1, further comprising a communication unit configured to receive, from an external device, a request for an emotion forecast map for a specific date and time and place, wherein the control unit predicts the level of the specific emotion at the specific date and time and place from the relationship between the specific date and time, the place, and a keyword associated with the date and time and place, and the emotion history accumulated in the storage unit, generates an emotion forecast map based on the predicted level, and performs control to transmit the generated emotion forecast map to the external device via the communication unit.
- The information processing system according to claim 7, wherein the control unit identifies users located around a place at a specific date and time based on schedules of a plurality of users, predicts the level of the specific emotion of a specific user at the specific date and time and place from the relationship between the levels of the specific emotion included in the emotion histories of the identified users and keywords associated with the specific date and time and place, generates an emotion forecast map according to the predicted level, and performs control to transmit the emotion forecast map to the external device via the communication unit.
- The information processing system according to claim 8, wherein the communication unit receives, from the external device, a request for an emotion forecast map for a specific date and time and place of a specific user, and the control unit predicts the level of the specific emotion of the specific user based on the specific user's future schedule, the level of the specific emotion of the specific user accumulated as the emotion history, and the keyword, generates an emotion forecast map according to the predicted level, and performs control to transmit the emotion forecast map directed to the specific user to the external device via the communication unit.
- The information processing system according to claim 8, wherein the communication unit receives, from the external device, a request for an emotion forecast map for a specific date and time and place of similar users whose preferences are similar to those of a specific user, and the control unit extracts, from the emotion histories accumulated in the storage unit, the similar users, identifies users located around a specific place at a specific date and time based on the schedules of the similar users, predicts the level of the specific emotion at the specific date and time and place from the relationship between the emotion histories of the identified users and keywords associated with the specific date and time and place, generates an emotion forecast map according to the predicted level, and performs control to transmit the emotion forecast map directed to the similar users to the external device via the communication unit.
- An information processing method comprising, by a processor: accumulating, in a storage unit, an emotion history in which the level of a specific emotion calculated based on a user's behavior history, the date and time and position at which the specific emotion occurred, and a related keyword are associated with one another; and predicting the level of the specific emotion at a specific date and time and place from the relationship between at least one of the specific date and time, the place, and a keyword associated with the date and time or the place, and the emotion history accumulated in the storage unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017535261A JP6747444B2 (ja) | 2015-08-18 | 2016-05-19 | Information processing system, information processing method, and program |
US15/747,469 US11227296B2 (en) | 2015-08-18 | 2016-05-19 | Information processing system and information processing method |
CN201680046954.3A CN107924545B (zh) | 2015-08-18 | 2016-05-19 | 信息处理系统和信息处理方法 |
US17/454,382 US11887135B2 (en) | 2015-08-18 | 2021-11-10 | Information processing system and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-161067 | 2015-08-18 | ||
JP2015161067 | 2015-08-18 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/747,469 A-371-Of-International US11227296B2 (en) | 2015-08-18 | 2016-05-19 | Information processing system and information processing method |
US17/454,382 Continuation US11887135B2 (en) | 2015-08-18 | 2021-11-10 | Information processing system and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017029848A1 true WO2017029848A1 (ja) | 2017-02-23 |
Family
ID=58051656
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/064837 WO2017029848A1 (ja) | 2015-08-18 | 2016-05-19 | Information processing system and information processing method |
Country Status (4)
Country | Link |
---|---|
US (2) | US11227296B2 (ja) |
JP (2) | JP6747444B2 (ja) |
CN (1) | CN107924545B (ja) |
WO (1) | WO2017029848A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11954941B2 (en) * | 2020-01-14 | 2024-04-09 | EMC IP Holding Company LLC | Facial recognition IOT application for multi-stream video using forecasting tools |
US11272888B1 (en) | 2021-06-22 | 2022-03-15 | Nft Ip Holdings Llc | Devices and systems that measure, quantify, compare, package, and capture human content in databases |
JP7365384B2 (ja) * | 2021-10-15 | 2023-10-19 | Sumitomo Mitsui Banking Corporation | Information processing apparatus, method, and program |
JP7474008B1 (ja) | 2023-12-27 | 2024-04-24 | Exevita Co., Ltd. | Information processing apparatus, information processing method, and program |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0981632A (ja) | 1995-09-13 | 1997-03-28 | Toshiba Corp | Information disclosure device |
JP4264717B2 (ja) * | 2003-08-13 | 2009-05-20 | Sony Corporation | Apparatus and method for searching and reproducing imaging information, apparatus and method for searching and reproducing content, and emotion search apparatus and method |
BRPI0918937A2 (pt) * | 2009-01-07 | 2016-10-11 | Koninkl Philips Electronics Nv | Lighting management system, method for implementing the lighting management system, and executive module for use in a lighting management system |
JP5520585B2 (ja) * | 2009-12-04 | 2014-06-11 | Sony Computer Entertainment Inc. | Information processing apparatus |
US9026476B2 (en) | 2011-05-09 | 2015-05-05 | Anurag Bist | System and method for personalized media rating and related emotional profile analytics |
JP6178800B2 (ja) * | 2011-12-16 | 2017-08-09 | Koninklijke Philips N.V. | History log of user activities and associated emotional states |
KR101917070B1 (ko) * | 2012-06-20 | 2018-11-08 | LG Electronics Inc. | Mobile terminal, server, system, and method for controlling the mobile terminal and the server |
JP6049397B2 (ja) | 2012-10-23 | 2016-12-21 | GREE, Inc. | Server device, event management method, and program |
US20140149177A1 (en) * | 2012-11-23 | 2014-05-29 | Ari M. Frank | Responding to uncertainty of a user regarding an experience by presenting a prior experience |
JP2014146264A (ja) | 2013-01-30 | 2014-08-14 | Nikon Corp | Restaurant evaluation system and restaurant evaluation program |
WO2015107681A1 (ja) | 2014-01-17 | 2015-07-23 | Nintendo Co., Ltd. | Information processing system, information processing server, information processing program, and information providing method |
US10262264B2 (en) * | 2014-06-09 | 2019-04-16 | Cognitive Scale, Inc. | Method for performing dataset operations within a cognitive environment |
US9978362B2 (en) * | 2014-09-02 | 2018-05-22 | Microsoft Technology Licensing, Llc | Facet recommendations from sentiment-bearing content |
US20160232131A1 (en) * | 2015-02-11 | 2016-08-11 | Google Inc. | Methods, systems, and media for producing sensory outputs correlated with relevant information |
-
2016
- 2016-05-19 US US15/747,469 patent/US11227296B2/en active Active
- 2016-05-19 WO PCT/JP2016/064837 patent/WO2017029848A1/ja active Application Filing
- 2016-05-19 CN CN201680046954.3A patent/CN107924545B/zh active Active
- 2016-05-19 JP JP2017535261A patent/JP6747444B2/ja active Active
-
2020
- 2020-07-17 JP JP2020123310A patent/JP6988961B2/ja active Active
-
2021
- 2021-11-10 US US17/454,382 patent/US11887135B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012509534A (ja) * | 2008-11-19 | 2012-04-19 | Alcatel-Lucent | Method and apparatus for recording data representing emotions felt by a person at a localizable position, and associated server |
Non-Patent Citations (2)
Title |
---|
KIYOHISA TAGUCHI: "Visualization for Spatiotemporal Distribution of People's Rich Emotions", IPSJ SIG NOTES HUMAN COMPUTER INTERACTION, vol. 2014 -HC, no. 36, 6 March 2014 (2014-03-06), pages 1 - 8, Retrieved from the Internet <URL:https://ipsj.ixsq.nii.ac.jp/ej/?action=repository_action_common_download&item_id=99235&item_no=1&attribute_id=1&file_no=1> [retrieved on 20160815] * |
YUSUKE KAJIWARA: "A Mood Prediction System for Preventing Diseases Using Biological Information and Weather Information", THE TRANSACTIONS OF THE INSTITUTE OF ELECTRICAL ENGINEERS OF JAPAN . C, vol. 135, no. 6, 1 June 2015 (2015-06-01), pages 570 - 579, ISSN: 0385-4221 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102029760B1 (ko) * | 2018-10-17 | 2019-10-08 | Chonnam National University Industry-Academic Cooperation Foundation | Event detection system and method using user emotion analysis |
JP2020067853A (ja) * | 2018-10-25 | 2020-04-30 | Mazda Motor Corporation | Virtual currency management device and virtual currency management method |
JP7141020B2 (ja) | 2018-10-25 | 2022-09-22 | Mazda Motor Corporation | Virtual currency management device and virtual currency management method |
Also Published As
Publication number | Publication date |
---|---|
US11227296B2 (en) | 2022-01-18 |
JPWO2017029848A1 (ja) | 2018-05-31 |
US20220067758A1 (en) | 2022-03-03 |
JP2020177696A (ja) | 2020-10-29 |
JP6747444B2 (ja) | 2020-08-26 |
CN107924545A (zh) | 2018-04-17 |
US20180218379A1 (en) | 2018-08-02 |
JP6988961B2 (ja) | 2022-01-05 |
US11887135B2 (en) | 2024-01-30 |
CN107924545B (zh) | 2021-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6988961B2 (ja) | Information processing system and information processing method | |
US11502975B2 (en) | Automatic suggestions and other content for messaging applications | |
US10891342B2 (en) | Content data determination, transmission and storage for local devices | |
CN113112306B (zh) | Prioritization of messages | |
US10554870B2 (en) | Wearable apparatus and methods for processing image data | |
US8849931B2 (en) | Linking context-based information to text messages | |
RU2610944C2 (ru) | Хронологический журнал действий пользователя и ассоциированных эмоциональных состояний | |
US11336594B2 (en) | Information processing system and information processing method | |
US10645460B2 (en) | Real-time script for live broadcast | |
US20200004291A1 (en) | Wearable apparatus and methods for processing audio signals | |
US10587717B2 (en) | Notification content | |
CN102947826A (zh) | Context-based information aggregation system | |
JP6677167B2 (ja) | Matching system, information processing apparatus, matching method, and program | |
US20210144118A1 (en) | Notification targeting | |
US20170118162A1 (en) | Notification Classification | |
CN111247782B (zh) | Method and system for automatically creating instant ad-hoc calendar events | |
US20140280629A1 (en) | Method and system for optimizing composer prompts for posting in a social networking system | |
US11095945B2 (en) | Information processing device, method, and program | |
US20190303967A1 (en) | System and method to characterize a user of a handheld device | |
US10349135B2 (en) | Method and program product for automatic human identification and censorship of submitted media | |
WO2015142292A1 (en) | Methods and systems for determining similarity between network user profile data and facilitating co-location of network users | |
Kour | Digital detoxification: a content analysis of user generated videos uploaded on youtube by facebook quitters | |
CN112650930B (zh) | Information processing method and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16836840 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017535261 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15747469 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16836840 Country of ref document: EP Kind code of ref document: A1 |