CN103488669B - Information processing apparatus, information processing method and program - Google Patents

Information processing apparatus, information processing method and program

Info

Publication number
CN103488669B
CN103488669B (application CN201310222794.1A)
Authority
CN
China
Prior art keywords
experience
user
information
extracted
users
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310222794.1A
Other languages
Chinese (zh)
Other versions
CN103488669A (en)
Inventor
高村成一
馆野启
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Publication of CN103488669A
Application granted
Publication of CN103488669B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/38 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/2457 Query processing with adaptation to user needs
    • G06F16/24575 Query processing with adaptation to user needs using context
    • G06F16/26 Visual data mining; Browsing structured data
    • G06F16/29 Geographical information databases
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/903 Querying
    • G06F16/9038 Presentation of query results
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01 Social networking

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Remote Sensing (AREA)
  • Library & Information Science (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Information Transfer Between Computers (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present disclosure relates to an information processing apparatus, an information processing method, and a program. The information processing apparatus includes: an experience extraction unit for extracting, from text information input by a user, experience information including information related to a time or a position; and a user extraction unit for extracting a user group in which commonality is found in the experience information by comparing the experience information of one or more users extracted by the experience extraction unit.

Description

Information processing apparatus, information processing method, and program
Technical field
The present disclosure relates to an information processing apparatus, an information processing method, and a program.
Background
Recently, with the development of network environments, SNS (social networking services) and websites in diary form (so-called blogs) have become widespread. In this way, text information indicating the various experiences of many users is posted on the Internet. From such text information, the past experiences, ongoing experiences, and planned experiences of each user can be understood. In addition, as disclosed in Japanese Laid-Open Patent No. 2008-003655, a user's behavior pattern can be detected based on information obtained from sensors.
Summary
However, although there may be user groups that would like to share an experience with other users, there are many situations in which the opportunity is lost, that is, sharing the experience becomes impossible, because of a mismatch in time or position.
Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of supporting the sharing of an experience by a user group.
According to an embodiment of the present disclosure, there is provided an information processing apparatus including: an experience extraction unit for extracting, from text information input by a user, experience information including information related to a time or a position; and a user extraction unit for extracting a user group in which commonality is found in the experience information by comparing the experience information of one or more users extracted by the experience extraction unit.
According to an embodiment of the present disclosure, there is provided an information processing method including: extracting, from text information input by a user, experience information including information related to a time or a position; and extracting a user group in which commonality is found in the experience information by comparing the extracted experience information of one or more users.
According to an embodiment of the present disclosure, there is provided a program for causing a computer to function as an information processing apparatus including: an experience extraction unit for extracting, from text information input by a user, experience information including information related to a time or a position; and a user extraction unit for extracting a user group in which commonality is found in the experience information by comparing the experience information of one or more users extracted by the experience extraction unit.
According to the embodiments of the present disclosure described above, the sharing of an experience by a user group can be supported.
Brief description of the drawings
Fig. 1 is an explanatory diagram illustrating the configuration of an information processing system according to an embodiment of the present disclosure;
Fig. 2 is a functional block diagram illustrating the configuration of a server apparatus SV;
Fig. 3 is an explanatory diagram illustrating the detailed configuration of an experience extraction unit;
Fig. 4 is an explanatory diagram illustrating the content of specific processing performed by the experience extraction unit;
Fig. 5 is an explanatory diagram illustrating the configuration of a feature template;
Fig. 6 is an explanatory diagram illustrating a specific example of experience position extraction;
Fig. 7 is an explanatory diagram illustrating a specific example of experience scene extraction;
Fig. 8 is an explanatory diagram illustrating a specific example of an experience information database;
Fig. 9 is a flowchart illustrating the operation of the server apparatus SV according to the present embodiment;
Fig. 10 is an explanatory diagram illustrating a specific example of an experience location map;
Fig. 11 is an explanatory diagram illustrating a specific example of an experience time map;
Fig. 12 is an explanatory diagram illustrating a specific example of an experience category map;
Fig. 13 is an explanatory diagram illustrating a specific example of an experience scene map;
Fig. 14 is an explanatory diagram illustrating a specific example of experience time matching;
Fig. 15 is an explanatory diagram illustrating a specific example of experience location matching; and
Fig. 16 is an explanatory diagram illustrating the hardware configuration of the server apparatus SV.
Embodiment
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, structural elements having substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
In addition, in this specification and the drawings, there are cases in which a plurality of structural elements having substantially the same functional configuration are distinguished from each other by appending different letters to the end of the same reference numeral. However, in cases where it is not particularly necessary to distinguish such structural elements from each other, only the same reference numeral is assigned.
In addition, the present disclosure will be described in the following order.
1. Basic configuration of the information processing system
2. Configuration of the server apparatus
3. Operation of the server apparatus
4. Specific examples of operation
(First embodiment)
(Second embodiment)
(Third embodiment)
(Fourth embodiment)
(Fifth embodiment)
(Supplement)
5. Hardware configuration
6. Conclusion
<1. Basic configuration of the information processing system>
The technology according to the embodiments of the present disclosure can be realized in the various modes described in detail below as examples. In addition, the server apparatus (SV) according to each embodiment, which forms part of the information processing system, includes: A. an experience extraction unit (132) for extracting, from text information input by a user, experience information including information related to a time or a position; and B. a user extraction unit (135) for extracting a user group in which commonality is found in the experience information by comparing the experience information of one or more users extracted by the experience extraction unit.
Below, first, the basic configuration of the information processing system, which includes the server apparatus according to the present embodiment as an example of the information processing apparatus, will be explained with reference to Fig. 1.
Fig. 1 is an explanatory diagram illustrating the configuration of the information processing system according to an embodiment of the present disclosure. As shown in Fig. 1, the information processing system according to an embodiment of the present disclosure mainly includes a plurality of information terminals CL and a server apparatus SV. The system configuration introduced here is only an example, and the technology according to the present embodiment can be applied to the various system configurations available now and in the future.
The information terminal CL is an example of a device used by a user. As the information terminal CL, for example, a mobile phone, a smartphone, a digital still camera, a digital video camera, a personal computer, a tablet terminal, a car navigation system, a portable game device, a health care appliance (including a pedometer (registered trademark)), and medical equipment are assumed. Meanwhile, as the server apparatus SV, for example, a home server and a cloud computing system are assumed.
Naturally, the system configurations to which the technology according to the present embodiment is applicable are not limited to the example in Fig. 1, but for convenience of explanation, the explanation is given under the assumption of a plurality of information terminals CL and a server apparatus SV connected by a wired and/or wireless network. Thus, a configuration in which information can be exchanged between the information terminals CL and the server apparatus SV is assumed. However, among the various functions possessed by the information providing system, the functions to be possessed by the information terminals CL and the functions to be possessed by the server apparatus SV may be freely designed. For example, it is desirable to design them in consideration of the computing capability and communication speed of the information terminals CL.
<2. Configuration of the server apparatus>
The overall configuration of the information processing system according to the present embodiment has been described above. Next, the configuration of the server apparatus SV will be described with reference to Figs. 2 to 8.
Fig. 2 is a functional block diagram illustrating the configuration of the server apparatus SV. As shown in Fig. 2, the server apparatus SV includes a text information acquisition unit 131, an experience extraction unit 132, an experience information storage unit 133, an experience search unit 134, a user extraction unit 135, a display generation unit 136, and an experience recommendation unit 137.
(Text information acquisition unit)
The text information acquisition unit 131 acquires text information input by a user. For example, the text information acquisition unit 131 may be an input unit with which the user inputs text, or an information collection device for acquiring text information from a social networking service or an application. Here, for convenience of description, the explanation is given under the assumption that the text information acquisition unit 131 is an input unit such as a software keyboard.
(Experience extraction unit)
The text information acquired by the text information acquisition unit 131 is input to the experience extraction unit 132. At this time, the experience extraction unit 132 may receive an input of time information together with the input of the text information. When the text information is input, the experience extraction unit 132 analyzes the input text information and extracts, from the text information, information related to the user's experience. For example, the experience information represents information including the type of the experience (such as an experience category), the position of the experience, and the time of the experience.
Here, the functional configuration of the experience extraction unit 132 will be described in detail with reference to Fig. 3. As shown in Fig. 3, the experience extraction unit 132 mainly includes a type feature amount extraction unit 151, an experience category determination unit 152, and an experience category model storage unit 153. In addition, the experience extraction unit 132 includes a position feature amount extraction unit 154, an experience position extraction unit 155, and an experience position model storage unit 156. Further, the experience extraction unit 132 includes a time feature amount extraction unit 157, an experience time extraction unit 158, and an experience time model storage unit 159.
When text information is input to the experience extraction unit 132, the text information is input to the type feature amount extraction unit 151, the position feature amount extraction unit 154, and the time feature amount extraction unit 157.
The type feature amount extraction unit 151 extracts feature amounts related to the experience category (hereinafter referred to as "type feature amounts") from the input text information. The type feature amounts extracted by the type feature amount extraction unit 151 are input to the experience category determination unit 152. The experience category determination unit 152 determines the experience category from the input type feature amounts using a learning model stored in the experience category model storage unit 153.
In addition, the position feature amount extraction unit 154 extracts feature amounts related to the position of the experience (hereinafter referred to as "position feature amounts") from the input text information. The position feature amounts extracted by the position feature amount extraction unit 154 are input to the experience position extraction unit 155. The experience position extraction unit 155 determines the position of the experience from the input position feature amounts using a learning model stored in the experience position model storage unit 156.
In addition, the time feature amount extraction unit 157 extracts feature amounts related to the time of the experience (hereinafter referred to as "time feature amounts") from the input text information. The time feature amounts extracted by the time feature amount extraction unit 157 are input to the experience time extraction unit 158. The experience time extraction unit 158 determines the time of the experience from the input time feature amounts using a learning model stored in the experience time model storage unit 159.
Here, with reference to Fig. 4, the content of the processing performed by the experience extraction unit 132 will be supplementarily explained using a music experience as an example. Fig. 4 is an explanatory diagram illustrating the content of specific processing performed by the experience extraction unit 132. Although the explanation is given using a music experience as an example for convenience, the technical scope of the present embodiment is not limited thereto.
As shown in Fig. 4, in the case of a music experience, possible examples of the experience category include "listening to music (listen)", "watching a music video on TV/in a movie/on DVD (watch)", "buying a track/CD (buy)", "participating in a performance or concert (live)", and "singing/playing/composing a song (play)". The experience extraction unit 132 determines these experience categories using the functions of the type feature amount extraction unit 151 and the experience category determination unit 152.
For example, in the case of determining the experience category "listen", first, the type feature amount extraction unit 151 extracts type feature amounts related to the experience category "listen" by a method such as morphological analysis, n-grams, or maximum substrings. Next, the experience category determination unit 152 determines from the type feature amounts whether the text corresponds to the experience category "listen" by a method such as SVM or logistic regression. The determination result of the experience category determination unit 152 is output as information indicating the experience category. Similarly, determination results are obtained for the experience categories "watch", "buy", "live", and "play".
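The publication itself contains no source code; purely as a rough, non-authoritative sketch of the kind of processing described for units 151 to 153, the following Python example builds character n-gram features and a logistic-regression category decision with scikit-learn. The training sentences, labels, and the choice of a single multiclass model (rather than one binary decision per category, as the text describes) are assumptions made for illustration only.

# Hypothetical sketch of experience-category determination (cf. units 151-153).
# Character n-gram features stand in for the morpheme/n-gram/maximum-substring
# feature amounts; logistic regression stands in for the SVM/logistic-regression
# determination step. Training data and labels are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "I listened to song A on the train",        # listen
    "Watched the music video of song B on TV",  # watch
    "Bought the new CD of artist C",            # buy
    "Went to artist D's concert at A station",  # live
    "I want to sing song A at karaoke",         # play
]
train_labels = ["listen", "watch", "buy", "live", "play"]

# One binary decision per category, as in the text, could also be used;
# a single multiclass model keeps the sketch short.
model = make_pipeline(
    CountVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # type feature amounts
    LogisticRegression(max_iter=1000),                        # category determination
)
model.fit(train_texts, train_labels)

print(model.predict(["I want to sing song A in chorus at karaoke in A station"]))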
In addition, experience position extraction is realized by the functions of the position feature amount extraction unit 154 and the experience position extraction unit 155. First, the position feature amount extraction unit 154 performs morphological analysis on the input text information, and the result is input to the experience position extraction unit 155. Next, based on the morphological analysis result, the experience position extraction unit 155 extracts the experience position using a method such as CRF (conditional random fields). For example, the experience position extraction unit 155 extracts the experience position as shown in Fig. 6 using the feature template shown in Fig. 5 (in the example of Fig. 6, "Kyoto station (KYOTO EKI)" is extracted).
In addition, experience time extraction is realized by the functions of the time feature amount extraction unit 157 and the experience time extraction unit 158. Similar to the experience position extraction described above, experience time extraction is realized by using morphological analysis and a sequence labeling method such as CRF. As expressions of the experience time, expressions of various granularities, such as "now", "past", "future", "morning", "evening", and "night", may be employed. Here, there are cases where some or all of the experience category, the experience position, and the experience time cannot necessarily be obtained.
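Again only as an illustration, and not as the method of the publication, the following sketch shows a window-style feature function in the spirit of the feature template of Fig. 5, together with CRF sequence labeling of place and time spans using the third-party sklearn-crfsuite package. The tokens, labels, and features are invented; a real system would operate on morphological-analysis output and a properly trained model.

# Hypothetical sketch of experience-position/time extraction with a feature
# template and CRF sequence labeling (cf. units 154-158 and Figs. 5-6).
import sklearn_crfsuite

def token_features(tokens, i):
    """Window features around position i, in the spirit of a feature template."""
    feats = {"word": tokens[i], "is_title": tokens[i].istitle()}
    feats["prev"] = tokens[i - 1] if i > 0 else "<BOS>"
    feats["next"] = tokens[i + 1] if i < len(tokens) - 1 else "<EOS>"
    return feats

# Toy training example: "I want to sing at Kyoto station tomorrow"
sentences = [["I", "want", "to", "sing", "at", "Kyoto", "station", "tomorrow"]]
labels = [["O", "O", "O", "O", "O", "B-PLACE", "I-PLACE", "B-TIME"]]

X = [[token_features(s, i) for i in range(len(s))] for s in sentences]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, labels)

test = ["Let", "us", "meet", "at", "Kyoto", "station", "tomorrow"]
print(crf.predict([[token_features(test, i) for i in range(len(test))]]))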
Supplement
In addition, although an example in which the experience extraction unit 132 extracts a geographical position such as "A station" as the experience position has been described above, the experience information extracted by the experience extraction unit 132 is not limited to this example. For example, the experience extraction unit 132 may extract information related to a non-geographical experience scene such as "train" or "concert".
Fig. 7 is an explanatory diagram illustrating a specific example of experience scene extraction. For example, in the case of analyzing the text information associated with id "1", the experience extraction unit 132 can extract "live performance", which is included in the text information, as the experience scene. In the following explanation, information processing using experience information other than the experience position is described, but very similar information processing can be realized by using the information related to the experience position.
(Experience information storage unit)
The experience information storage unit 133 stores an experience information database including the experience information (such as the experience category, the experience position, and the experience time) extracted by the experience extraction unit 132. Here, a specific example of the experience information database will be described with reference to Fig. 8.
Fig. 8 is an explanatory diagram illustrating a specific example of the experience information database. In the example shown in Fig. 8, the experience information database includes an id for identifying each piece of experience information, the posting time and date of the text information, the user, the experience category, the experience target, the experience time, the experience position, and the text information. For example, id "1" and the text information "I want to sing song A in chorus in karaoke in A station" are associated with the experience category "sing", the experience target "song A", the experience time "future", and the experience position "A station", which are extracted as experience information by analyzing the text information.
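For illustration only, the record layout of Fig. 8 can be pictured as the following in-memory structure; the field names follow the description, while the dataclass itself and the posting timestamp value are assumptions not taken from the publication.

# Hypothetical in-memory form of the experience information database of Fig. 8.
# Field names follow the description; values reproduce the id "1" example.
from dataclasses import dataclass

@dataclass
class ExperienceRecord:
    id: int
    posted_at: str         # posting time and date of the text information
    user: str
    category: str          # experience category
    target: str            # experience target
    time: str              # experience time
    position: str          # experience position
    text: str

experience_db = [
    ExperienceRecord(
        id=1,
        posted_at="2012-06-13 10:00",   # illustrative value, not from the publication
        user="A",
        category="sing",
        target="song A",
        time="future",
        position="A station",
        text="I want to sing song A in chorus in karaoke in A station",
    ),
]

print(experience_db[0].category, experience_db[0].position)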
(Experience search unit)
The experience search unit 134 searches for experience information in the experience information database stored in the experience information storage unit 133. For example, when at least one condition among the user, the experience category, the experience target, the experience position, and the experience time is specified, the experience search unit 134 searches for experience information corresponding to the specified condition.
(User extraction unit)
Based on the search result of the experience search unit 134, the user extraction unit 135 extracts a user group corresponding to a plurality of entries in which at least one of the experience category, the experience target, the experience position, and the experience time is the same. For example, the user extraction unit 135 extracts users A and B corresponding to ids "1" and "2" in the experience information database shown in Fig. 8, for which the experience category, the experience target, and the experience time are the same.
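A minimal sketch of this kind of matching, assuming an in-memory list of entries and a commonality key of (category, target, time), is given below; the records and the key choice are illustrative rather than taken from the publication.

# Hypothetical sketch of user-group extraction: entries whose experience
# category, target, and time all match are grouped, and groups containing
# more than one user are returned (cf. users A and B for ids "1" and "2").
from collections import defaultdict

entries = [
    {"id": 1, "user": "A", "category": "sing", "target": "song A",
     "time": "future", "position": "A station"},
    {"id": 2, "user": "B", "category": "sing", "target": "song A",
     "time": "future", "position": "B station"},
    {"id": 3, "user": "C", "category": "game", "target": "game B",
     "time": "now", "position": "C station"},
]

groups = defaultdict(set)
for e in entries:
    key = (e["category"], e["target"], e["time"])   # commonality key
    groups[key].add(e["user"])

user_groups = [users for users in groups.values() if len(users) > 1]
print(user_groups)   # e.g. [{'A', 'B'}]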
(Display generation unit)
The display generation unit 136 generates a display for indicating the relation between the pieces of experience information of the user group extracted by the user extraction unit. Specific examples of the display generated by the display generation unit 136 will be described in detail in "4. Specific examples of operation".
(Experience recommendation unit)
The experience recommendation unit 137 recommends, to the user group extracted by the user extraction unit, a time or a position for sharing the experience. With such a configuration, the user group visits a specific position at the recommended time, or visits the recommended position at a specific time, so that the user group can share the experience. In addition, the experience recommendation unit 137 may perform the recommendation according to the degree of closeness of the users included in the user group. For example, in a case where the degree of closeness between users is high, it is relatively easy for the users to adjust the time and position with other users seeking the same experience; however, in a case where the degree of closeness between users is low, such adjustment is considered difficult. Therefore, the experience recommendation unit 137 may perform the recommendation in a case where the degree of closeness between users is low.
<3. Operation of the server apparatus>
The configuration of the server apparatus SV according to the present embodiment has been described above. Next, the operation of the server apparatus SV according to the present embodiment will be described with reference to Fig. 9.
Fig. 9 is a flowchart illustrating the operation of the server apparatus SV according to the present embodiment. As shown in Fig. 9, first, when the text information acquisition unit 131 acquires text information (S310), the experience extraction unit 132 analyzes the text information and extracts, from the text information, experience information related to the user's experience (S320). Then, the experience information storage unit 133 stores the experience information database including the experience information extracted by the experience extraction unit 132 (S330).
Thereafter, when the experience search unit 134 searches for experience information in the experience information database (S340), the user extraction unit 135 extracts, based on the search result of the experience search unit 134, a user group corresponding to a plurality of entries in which at least one of the experience category, the experience target, the experience position, and the experience time is the same.
Then, the display generation unit 136 generates a display indicating the relation between the pieces of experience information of the user group extracted by the user extraction unit, that is, visualizes the relation of the user group (for example, the similarity of the experience information) (S360). In addition, the experience recommendation unit 137 recommends a time or a position for sharing the experience to the user group extracted by the user extraction unit (S370). The display generated by the display generation unit 136 and the recommendation content of the experience recommendation unit 137 can be sent to the information terminal CL of each user included in the user group.
<4. Specific examples of operation>
The operation of the server apparatus SV according to the present embodiment has been described with reference to Fig. 9. Next, specific examples of the operation of the server apparatus SV according to the present embodiment will be described with reference to Figs. 10 to 15. In the following, the explanation is given under the assumption that the experience information storage unit 133 stores the experience information database shown in Fig. 8.
(First embodiment)
In the experience information database, id "1" is associated with the experience information of user A, who wants to sing song A, and id "2" is associated with the experience information of user B, who wants to sing the same song A. Therefore, the user extraction unit 135 extracts users A and B, who have a common experience type, as a user group.
However, the experience position of user A is "A station" and the experience position of user B is "B station", and if things remain as they are, users A and B cannot sing song A together. Thus, for example, as shown in Fig. 10, the display generation unit 136 generates an experience location map for indicating the relation between the experience positions of users A and B. By providing the experience location map to users A and B, users A and B can find, close by, other users who have the same goal.
Thereafter, for example, by communicating with each other and agreeing on an experience position, users A and B can share the experience of singing song A. Alternatively, in a case where the experience recommendation unit 137 recommends an experience position (for example, a karaoke venue at A station, a karaoke venue at B station, or a karaoke venue at an intermediate position), users A and B can share the experience by visiting the recommended experience position.
Here, the experience recommendation unit 137 may determine the level of congestion at each time or position, and recommend the experience position using the determination result of the congestion level. For example, in a case where it is determined, based on the experience information stored in the experience information database, that many users are using the karaoke venue at B station at the current time, the experience recommendation unit 137 may recommend the karaoke venue at A station as the experience position. With such a configuration, the sharing of an experience by the user group can be fully supported.
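As a hedged illustration of this congestion-aware recommendation, and not the publication's own implementation, the following sketch counts matching database entries per candidate position and recommends the least congested one; the candidate names, entries, and the simple counting heuristic are assumptions.

# Hypothetical sketch of the experience recommendation unit 137: among candidate
# positions for a user group, recommend the one with the lowest congestion,
# estimated here simply by counting matching entries in the experience database.
def recommend_position(candidates, entries, time):
    """Return the candidate position with the fewest users at the given time."""
    def congestion(position):
        return sum(1 for e in entries
                   if e["position"] == position and e["time"] == time)
    return min(candidates, key=congestion)

entries = [
    {"user": "X", "position": "karaoke at B station", "time": "now"},
    {"user": "Y", "position": "karaoke at B station", "time": "now"},
    {"user": "Z", "position": "karaoke at A station", "time": "now"},
]

print(recommend_position(
    ["karaoke at A station", "karaoke at B station"], entries, "now"))
# -> "karaoke at A station" (the less congested candidate)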
(Second embodiment)
In the experience information database, id "3" is associated with the experience information of user C, who wants to play game B against someone at C station, and id "4" is associated with the experience information of user D, who wants to play game B against someone at the same C station. Therefore, the user extraction unit 135 extracts users C and D, who have a common experience category, as a user group.
However, the experience time of user C is "now" and the experience time of user D is "future", and if things remain as they are, users C and D cannot play game B against each other at C station. Therefore, the display generation unit 136 generates an experience time map for indicating the relation between the experience times of users C and D. By providing this experience time map to users C and D, users C and D can find other users who have the same goal.
Thereafter, for example, by communicating with each other and agreeing on an experience time, users C and D can share the experience of playing game B against each other at C station. Alternatively, in a case where the experience recommendation unit 137 recommends an experience time (for example, the afternoon of March 17), users C and D can share the experience by getting together at the recommended time.
Here, a configuration example of the experience time map will be described with reference to Fig. 11. Fig. 11 is an explanatory diagram illustrating a configuration example of the experience time map generated by the display generation unit 136 for each experience category. As shown in Fig. 11, the display generation unit 136 arranges, in a matrix, the experience times of the users clustered for each experience time on the experience time map. With such a configuration, the time zones in which users can get together and the time zones in which they cannot can be visualized, and an opportunity for encounter can be provided to user groups that could not have met each other without this system.
(Third embodiment)
Although specific examples of experience sharing have been described above, an example of supporting the avoidance of experience sharing will be described as a third embodiment.
In the experience information database, id "5" is associated with the experience information of user E, who jogged at position D, and id "6" is associated with the experience information of user F, who jogged at the same position D. User E failed to have a comfortable jogging experience because of congestion. Meanwhile, position D is not congested in the time zone in which user F jogs, and therefore user F succeeded in having a comfortable jogging experience.
In this case, for example, the display generation unit 136 may generate, for jogging at position D, the experience time map described with reference to Fig. 11. By referring to the generated experience time map, user E can find a time zone with fewer runners, and can therefore avoid congestion on the next jog.
(Fourth embodiment)
In the experience information database, id "7" is associated with the experience information of user G, who wants to play jazz at A station, and id "8" is associated with the experience information of user H, who wants to listen to jazz at A station. Here, although the experience categories of users G and H are different, there is a seeds-and-needs relation between "playing music" and "listening to music". Therefore, by performing matching based on seeds-and-needs pairs of experience categories established in advance, the user extraction unit 135 extracts a user group including users G and H.
In addition, the display generation unit 136 may generate an experience category map, for example, as shown in Fig. 12. With such a configuration, users G and H can find other users with whom they have a seeds-and-needs relation, and can therefore achieve both user G's goal and user H's goal.
In addition, although the pair of "playing music" and "listening to music" has been described above as an example of a seeds-and-needs pair, seeds-and-needs pairs are not limited to this example. For example, the present embodiment is applicable to various seeds-and-needs pairs, such as the pair of "selling something" and "buying something" and the pair of "looking for a part-time worker" and "applying for a part-time job".
(Fifth embodiment)
As described with reference to Fig. 7, the experience extraction unit 132 can extract information related to non-geographical experience scenes such as "train" and "concert". Therefore, the user extraction unit 135 can extract a user group having a common experience scene as described above, and the display generation unit 136 can generate a display for visualizing the user group extraction result.
For example, as shown in Fig. 13, the display generation unit 136 can generate an experience scene map by arranging user groups corresponding to non-geographical items such as "home", "workplace", and "out", and to other items indicating user states such as "at work", "before sleeping", and "on the move". With such a configuration, search, recommendation, and matching of users with similar scenes can be realized, so that a communication effect with the high degree of empathy unique to users sharing similar scenes can be produced.
(Supplement)
Here, there are cases where the user's text information is ambiguous and it is difficult to specify a specific experience position or experience time from the text information. For example, in the text information "I want to have an experience of XXX with someone next time", the experience time is ambiguous, and it is difficult to specify a specific experience time.
Therefore, the user extraction unit 135 may convert the experience time extracted from the text information into a specific period according to the level of detail of the time expression, and extract a user group that can be matched within the converted period. For example, as shown in Fig. 14, the user extraction unit 135 may convert the experience time "next time", extracted from the text information "I want to have an experience of XXX with someone next time", into a period extending a predetermined time before and after the current time. In addition, the user extraction unit 135 may convert the experience time "around 20:00 tomorrow" into a period within a predetermined range around 20:00 tomorrow. Similarly, the user extraction unit 135 may convert the experience time "tomorrow night" into a period of corresponding length.
With such a configuration, sufficient matching can be realized even in a case where the experience time is ambiguous, and the matching result or recommendation content can be reported to the user group.
In addition, the user extraction unit 135 performs similar processing in a case where the experience position is ambiguous. More specifically, for example, as shown in Fig. 15, the user extraction unit 135 may convert the experience position extracted from the text information into a specific range according to the level of detail of the position expression, and extract a user group matched within the converted range.
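The conversion of vague time expressions into periods could look, for illustration only, like the following sketch; the expression-to-period mapping and the overlap test are invented, and only the idea of widening the period according to the level of detail follows the description. Ambiguous positions could be handled analogously by converting them into ranges and testing for range overlap.

# Hypothetical sketch of converting ambiguous time expressions into concrete
# periods according to their level of detail (cf. Fig. 14), then matching users
# whose periods overlap. The expression-to-width mapping is invented.
from datetime import datetime, timedelta

WIDTHS = {                       # half-width of the period for each granularity
    "next time": timedelta(days=3),
    "around 20:00 tomorrow": timedelta(hours=1),
    "tomorrow night": timedelta(hours=3),
}

def to_period(expression, now):
    """Return a (start, end) period for a vague time expression."""
    if expression == "next time":
        center = now
    elif expression == "around 20:00 tomorrow":
        center = (now + timedelta(days=1)).replace(hour=20, minute=0)
    else:  # "tomorrow night"
        center = (now + timedelta(days=1)).replace(hour=21, minute=0)
    width = WIDTHS[expression]
    return center - width, center + width

def overlaps(p1, p2):
    return p1[0] <= p2[1] and p2[0] <= p1[1]

now = datetime(2012, 6, 13, 10, 0)
period_a = to_period("next time", now)
period_b = to_period("around 20:00 tomorrow", now)
print(overlaps(period_a, period_b))   # True: tomorrow 20:00 falls within "next time"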
<5. Hardware configuration>
The embodiments of the present disclosure have been described above. The information processing by the server apparatus SV described above is realized by cooperation between software and the hardware of the server apparatus SV described below.
Fig. 16 is an explanatory diagram illustrating the hardware configuration of the server apparatus SV. As shown in Fig. 16, the server apparatus SV includes a CPU (central processing unit) 201, a ROM (read-only memory) 202, a RAM (random access memory) 203, an input device 208, an output device 210, a storage device 211, a drive 212, and a communication device 215.
The CPU 201 functions as an arithmetic processing unit and a control device, and controls the overall operation in the server apparatus SV according to various programs. The CPU 201 may also be a microprocessor. The ROM 202 stores programs and arithmetic parameters used by the CPU 201. The RAM 203 temporarily stores programs used in the execution of the CPU 201 and parameters that change appropriately during that execution. These are interconnected by a host bus including a CPU bus.
The input device 208 includes an input unit with which the user inputs information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 201. By operating the input device 208, the user of the server apparatus SV can input various data into the server apparatus SV and instruct the server apparatus SV to perform processing operations.
The output device 210 includes, for example, a display device such as a liquid crystal display (LCD) device, an OLED (organic light-emitting diode) device, or a lamp. In addition, the output device 210 includes an audio output device such as a speaker or headphones. For example, the display device displays a captured image or a generated image. Meanwhile, the audio output device converts audio data or the like into sound and outputs it.
The storage device 211 is a data storage device configured as an example of the storage unit of the server apparatus SV according to the present embodiment. The storage device 211 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, and a deleting device that deletes data recorded on the storage medium. The storage device 211 stores programs executed by the CPU 201 and various data.
The drive 212 is a storage medium reader/writer, and is built into the server apparatus SV or externally attached to it. The drive 212 reads information recorded on an attached removable storage medium 24, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, and outputs it to the RAM 203. The drive 212 can also write information to the removable storage medium 24.
The communication device 215 is, for example, a communication interface including a communication device for connecting to the network 12. The communication device 215 may be a wired communication device that performs communication by wire, a communication device supporting a wireless LAN (local area network), or a communication device supporting LTE (Long Term Evolution).
The network 12 is a wired or wireless transmission path for information transmitted from devices connected to the network 12. For example, the network 12 may include public line networks such as the Internet, telephone line networks, and satellite communication networks; various LANs (local area networks), including Ethernet (registered trademark); and WANs (wide area networks). The network 12 may also include dedicated line networks such as an IP-VPN (Internet Protocol-Virtual Private Network).
<6. Conclusion>
As described above, the server apparatus SV according to the present embodiment extracts a user group in which commonality of experience information is found by comparing the experience information of a plurality of users extracted from text information. In this way, by providing an opportunity for encounter to user groups that could not have met each other without this support, the sharing of an experience by a user group can be supported.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
For example, although an example in which a user group is extracted based on the experience information obtained by analyzing text information has been described above, the present disclosure is not limited to this example. For example, the server apparatus SV may further include a behavior extraction unit that extracts behavior pattern information of the user based on sensor information obtained from sensors provided in the information terminal CL, and the user extraction unit 135 may extract a user group by comparing information groups including the experience information and the behavior pattern information. As the sensors provided in the information terminal CL, a motion sensor, a position sensor, and the like are provided. For example, the behavior extraction unit extracts behavior pattern information indicating that the user is jogging based on the sensor information obtained from the motion sensor, adds the position information obtained from the position sensor, and can thereby determine where the user jogs.
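Purely as an illustrative sketch of such a behavior extraction unit, and not as the publication's method, the following example labels an activity from accelerometer magnitudes and attaches a position reading; the thresholds, field names, and sample values are assumptions.

# Hypothetical sketch of a behavior extraction unit: a crude motion-sensor
# heuristic labels the activity, and the position-sensor reading is attached so
# that "where the user jogs" can be determined. Threshold values are invented.
def classify_motion(acceleration_magnitudes):
    """Very rough activity label from accelerometer magnitudes (m/s^2)."""
    mean = sum(acceleration_magnitudes) / len(acceleration_magnitudes)
    if mean > 13.0:
        return "jogging"
    if mean > 10.5:
        return "walking"
    return "still"

def behavior_record(acceleration_magnitudes, latitude, longitude, timestamp):
    return {
        "pattern": classify_motion(acceleration_magnitudes),
        "position": (latitude, longitude),
        "time": timestamp,
    }

sample = behavior_record(
    acceleration_magnitudes=[14.2, 13.8, 15.1, 13.5],
    latitude=35.0116, longitude=135.7681,   # illustrative coordinates
    timestamp="2012-06-13T07:30:00",
)
print(sample)   # {'pattern': 'jogging', 'position': (...), 'time': ...}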
Furthermore, the steps of the processing performed by the server apparatus SV in this specification do not necessarily have to be processed in time series in the order disclosed in the flowchart. For example, the steps of the processing performed by the server apparatus SV may be processed in an order different from the order disclosed in the flowchart, or may be processed in parallel.
Furthermore, a computer program can be created that causes hardware such as the CPU 201, the ROM 202, and the RAM 203 included in the server apparatus SV to exhibit functions equivalent to those of each of the configurations of the server apparatus SV described above. A storage medium storing the computer program is also provided.
Additionally, the present technology may also be configured as below.
(1) An information processing apparatus including:
an experience extraction unit for extracting, from text information input by a user, experience information including information related to a time or a position; and
a user extraction unit for extracting a user group in which commonality is found in the experience information by comparing the experience information of one or more users extracted by the experience extraction unit.
(2) The information processing apparatus according to (1), further including:
a display generation unit for generating a display for indicating the relation of the experience information of the user group extracted by the user extraction unit.
(3) The information processing apparatus according to (1) or (2), further including:
an experience recommendation unit for recommending, to the user group extracted by the user extraction unit, a time or a position for sharing the experience.
(4) The information processing apparatus according to (3), wherein the experience recommendation unit determines a level of congestion at a specific time or at a specific position based on the experience information of the one or more users, and recommends the time or the position for sharing the experience using a determination result of the congestion level.
(5) The information processing apparatus according to (1) or (2), further including:
an experience recommendation unit for recommending, to at least a part of the one or more users included in the user group extracted by the user extraction unit, a time or a position for avoiding sharing the experience with another user.
(6) The information processing apparatus according to any one of (1) to (5), further including:
a behavior extraction unit for extracting behavior pattern information of a user from sensor information,
wherein the user extraction unit extracts the user group by comparing information groups including the experience information of the one or more users extracted by the experience extraction unit and the behavior pattern information of the one or more users extracted by the behavior extraction unit.
(7) The information processing apparatus according to any one of (1) to (6),
wherein the experience information further includes information related to an experience category, and
wherein the user extraction unit refers to the information related to the experience category, and, in a case where commonality is found between a target provided as the experience of a first user and a target requested to be provided as the experience of a second user, extracts a user group including the first user and the second user.
(8) The information processing apparatus according to any one of (1) to (7),
wherein the experience information further includes information related to a non-geographical experience scene, and
wherein the user extraction unit refers to the information related to the non-geographical experience scene, and extracts a user group in which commonality is found in the non-geographical experience scene.
(9) The information processing apparatus according to any one of (1) to (8), further including:
an experience information storage unit for storing an experience information database including the experience information extracted by the experience extraction unit; and
an experience search unit for searching for experience information in the experience information database stored in the experience information storage unit and supplying the search result to the user extraction unit for the comparison.
(10) An information processing method including:
extracting, from text information input by a user, experience information including information related to a time or a position; and
extracting a user group in which commonality is found in the experience information by comparing the extracted experience information of one or more users.
(11) A program for causing a computer to function as an information processing apparatus including:
an experience extraction unit for extracting, from text information input by a user, experience information including information related to a time or a position; and
a user extraction unit for extracting a user group in which commonality is found in the experience information by comparing the experience information of one or more users extracted by the experience extraction unit.
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2012-133785 filed in the Japan Patent Office on June 13, 2012, the entire content of which is hereby incorporated by reference.

Claims (10)

1. An information processing apparatus, comprising:
circuitry configured to:
extract first experience information from text information of a user obtained from social media, the first experience information indicating a future experience of the user;
extract second experience information from text information of other users, the second experience information indicating future, current, or past experiences of the other users; and
extract, from the other users, based on the first experience information and the second experience information, those users having a commonality with the user,
wherein each of the first experience information and the second experience information includes information related to a non-geographic experience scene, and
wherein the extracted users are extracted from the other users based on the information related to the non-geographic experience scene and are those users having a commonality in the non-geographic experience scene.
2. The information processing apparatus according to claim 1, wherein the circuitry is further configured to:
generate a display for indicating a relation of the second experience information of the extracted users.
3. The information processing apparatus according to claim 1, wherein the circuitry is further configured to:
recommend, to the extracted users, a time or a position for sharing an experience.
4. The information processing apparatus according to claim 3, wherein the circuitry is further configured to: determine a level of congestion at a specific time or at a specific position based on the second experience information of the extracted users, and recommend the time or the position for sharing the experience using a determination result of the congestion level.
5. The information processing apparatus according to claim 1, wherein the circuitry is further configured to:
recommend, to at least a part of the other users included in the extracted users, a time or a position for avoiding sharing an experience with another user.
6. The information processing apparatus according to claim 1, wherein the circuitry is further configured to:
extract behavior pattern information of a user from sensor information, and
extract the users by comparing the extracted second experience information with the extracted behavior pattern information.
7. The information processing apparatus according to claim 1,
wherein each of the first experience information and the second experience information includes information related to an experience category, and
wherein the extracted users are extracted from the other users based on the information related to the experience category, and, in a case where commonality is found between a target provided as the experience of a first user and a target requested to be provided as the experience of a second user, the first user and the second user are extracted as users having the commonality.
8. The information processing apparatus according to claim 1,
wherein the extracted users include those of the other users having second experience information that matches the extracted first experience information.
9. The information processing apparatus according to claim 1, wherein the second experience information indicates future experiences of the other users.
10. An information processing method, comprising:
extracting first experience information from text information of a user obtained from social media, the first experience information indicating a future experience of the user;
extracting second experience information from text information of other users, the second experience information indicating future, current, or past experiences of the other users; and
extracting, from the other users, based on the first experience information and the second experience information, those users having a commonality with the user,
wherein each of the first experience information and the second experience information includes information related to a non-geographic experience scene, and
wherein the extracted users are extracted from the other users based on the information related to the non-geographic experience scene and are those users having a commonality in the non-geographic experience scene.
CN201310222794.1A 2012-06-13 2013-06-06 Information processing apparatus, information processing method and program Active CN103488669B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012133785A JP6127388B2 (en) 2012-06-13 2012-06-13 Information processing apparatus, information processing method, and program
JP2012-133785 2012-06-13

Publications (2)

Publication Number Publication Date
CN103488669A CN103488669A (en) 2014-01-01
CN103488669B true CN103488669B (en) 2017-09-05

Family

ID=49756892

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310222794.1A Active CN103488669B (en) 2012-06-13 2013-06-06 Information processing apparatus, information processing method and program

Country Status (3)

Country Link
US (4) US9135368B2 (en)
JP (1) JP6127388B2 (en)
CN (1) CN103488669B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6306333B2 (en) * 2013-12-04 2018-04-04 株式会社 ミックウェア Information processing apparatus, information processing method, and program
KR102301476B1 (en) * 2014-05-16 2021-09-14 삼성전자주식회사 Electronic device and method for notification in internet service
JP6347680B2 (en) * 2014-06-26 2018-06-27 シャープ株式会社 System for managing action logs
JP6565886B2 (en) * 2016-12-07 2019-08-28 トヨタ自動車株式会社 Information providing apparatus and information providing program
JP7060014B2 (en) * 2017-04-21 2022-04-26 ソニーグループ株式会社 Information processing equipment, information processing methods and programs
JP6501029B1 (en) * 2018-07-26 2019-04-17 オムロン株式会社 INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM
CN109508413A (en) * 2018-10-22 2019-03-22 中国银行股份有限公司 Job scheduling methods of exhibiting and device, storage medium and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1755663A (en) * 2004-04-07 2006-04-05 索尼株式会社 Information-processing apparatus, information-processing methods and programs
CN101069184A (en) * 2005-09-28 2007-11-07 索尼株式会社 Information processing device, method, and program
WO2010111537A2 (en) * 2009-03-27 2010-09-30 T-Mobile Usa, Inc. Providing event data to a group of contacts

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4548624B2 (en) * 1998-06-30 2010-09-22 雅信 鯨田 Device for meeting / exchange / contact / communication support
JP2000222466A (en) * 1999-02-02 2000-08-11 Japan Metals & Chem Co Ltd Business management support system
JP2003087845A (en) * 2001-09-13 2003-03-20 Sanyo Electric Co Ltd Information providing server, information providing method, and information acquisition method
JP2007108806A (en) * 2005-09-16 2007-04-26 Dowango:Kk User matching server, user matching method, user matching program
JP5041202B2 (en) 2006-06-20 2012-10-03 ソニー株式会社 Information processing apparatus, information processing method, and program
JP2008165761A (en) * 2006-12-06 2008-07-17 Mediocritas Inc Information retrieval system
US8312380B2 (en) * 2008-04-04 2012-11-13 Yahoo! Inc. Local map chat
JP2010267105A (en) * 2009-05-15 2010-11-25 Yahoo Japan Corp Device, method and program for searching action history
US8601021B2 (en) * 2010-01-19 2013-12-03 Electronics And Telecommunications Research Institute Experience information processing apparatus and method for social networking service
US9361624B2 (en) * 2011-03-23 2016-06-07 Ipar, Llc Method and system for predicting association item affinities using second order user item associations


Also Published As

Publication number Publication date
US9507840B2 (en) 2016-11-29
US9135368B2 (en) 2015-09-15
JP6127388B2 (en) 2017-05-17
US20170046443A1 (en) 2017-02-16
US10042943B2 (en) 2018-08-07
US20130339376A1 (en) 2013-12-19
JP2013257761A (en) 2013-12-26
US20180300417A1 (en) 2018-10-18
CN103488669A (en) 2014-01-01
US20150331922A1 (en) 2015-11-19
US10824687B2 (en) 2020-11-03

Similar Documents

Publication Publication Date Title
CN103488669B (en) Message processing device, information processing method and program
US20240346072A1 (en) Systems, methods and apparatus for generating musicrecommendations based on combining song and user influencers with channel rule characterizations
CN103403705B (en) Loading a mobile computing device with media files
US9390091B2 (en) Method and apparatus for providing multimedia summaries for content information
US8838641B2 (en) Content recommendation system, content recommendation method, content recommendation device, and information storage medium
Braunhofer et al. Location-aware music recommendation
US9471688B2 (en) Personalized targeting of media stations
US20140074269A1 (en) Method for Recommending Musical Entities to a User
US20210357450A1 (en) Analyzing captured sound and seeking a match for temporal and geographic presentation and navigation of linked cultural, artistic and historic content
CN106462623A (en) Content item usage based song recommendation
JP2010250528A (en) Feeling matching device, feeling matching method, and program
US11095945B2 (en) Information processing device, method, and program
CN111339349A (en) Singing bill recommendation method
JP2018507503A (en) Music providing method and music providing system
Behrendt Telephones, music and history: From the invention era to the early smartphone days
CN106649480A (en) Method for generating music listand server
Åman et al. Enriching user experiences with location-sensitive music services
Areias et al. TourismShare
US20210110846A1 (en) Information processing apparatus, information processing method, and program
JP2016048437A (en) Information processor, information processing method and program
Suzuki et al. What Songs We Listen to Together: Automatic Music Selection for Groups

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant