WO2011114634A1 - Data processing apparatus and data processing method - Google Patents
Data processing apparatus and data processing method
- Publication number
- WO2011114634A1 (PCT/JP2011/001211, JP2011001211W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- unit
- information
- social information
- output
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/587—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
Definitions
- The present invention relates to a data processing apparatus and the like that output accumulated content data such as still images and moving pictures, and more particularly to a data processing apparatus and a data processing method that output data using social information.
- Imaging devices such as consumer digital still cameras, video cameras, and cameras incorporated in mobile phones have become increasingly sophisticated.
- Such an imaging device supports, for example, high-quality continuous shooting, as well as high-quality recording of photos and videos.
- For example, the imaging device automatically attaches meta information such as position information of the shooting location or the name of a person appearing in a photograph or video.
- The imaging device also has, for example, a network connection function, and can directly upload a captured photograph from the main unit to a server on the network.
- A wide variety of such high-performance imaging devices are available at low cost.
- The photographer, for example, e-mails photo data or video data directly to the other party, or uploads the photo data or video data to a server on the network.
- The present invention solves the above-mentioned conventional problems, and its object is to provide a data processing apparatus and a data processing method capable of realizing data sharing with family members and acquaintances while suppressing the burden of operation on the user.
- A data processing apparatus according to the present invention is a data processing apparatus that outputs content data, and includes a data output determination unit that determines whether to output the content data using social information, the social information including a closeness indicating the degree of intimacy between predetermined users and information associating the content data with the closeness, and a data output unit that outputs the content data when the data output determination unit determines that the content data is to be output. The data output determination unit refers to the social information and, when the closeness is equal to or higher than a predetermined threshold, determines to output the content data associated with that closeness.
- With this configuration, the relationship between the user who owns the social information and other users is determined using the closeness values held in the social information, and whether the content data can be output is decided based on the determination result.
- As a result, content data related to users having an intimate relationship can be output for browsing and the like, making it possible to realize data sharing with family members and acquaintances while suppressing the burden of operation on the user.
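- As a rough illustration of this determination, the following Python sketch (not part of the patent; the data layout and the 0.5 threshold are example assumptions drawn from the description) shows how a closeness value looked up from social information can gate whether content data is output.

```python
# Minimal sketch, assuming a simple user -> closeness mapping; not the patented implementation.
CLOSENESS_THRESHOLD = 0.5  # example threshold for a closeness in the range 0 to less than 1

closeness_by_user = {      # hypothetical social information of the owner "Mike"
    "Alice": 0.83,
    "Tom": 0.53,
    "James": 0.42,
}

def may_output(associated_user: str) -> bool:
    """Return True if content data associated with this user may be output."""
    closeness = closeness_by_user.get(associated_user)
    if closeness is None:                  # unknown user: withhold the data
        return False
    return closeness >= CLOSENESS_THRESHOLD

print(may_output("Alice"))   # True  -> may be shown or shared
print(may_output("James"))   # False -> withheld
```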
- The data processing apparatus further includes a data relationship extraction unit that extracts information indicating relationships between items of content data, and the data output determination unit further determines to output content data having a predetermined relationship indicated by the information extracted by the data relationship extraction unit.
- With this configuration, whether content data can be output is also determined according to whether the items of content data have a predetermined relationship to one another. For this reason, it becomes possible, for example, to permit browsing and provision only of content data captured on a predetermined day, based on the shooting date and time of the content data, without burdening the user with operations such as selecting the desired target data from a large number of items. Content data captured on a specific day can thus be output for viewing and the like, making it possible to realize data sharing with family members and acquaintances while suppressing the burden of operation on the user.
- The data processing apparatus further includes a recognition dictionary storage unit that stores a recognition dictionary for recognizing objects included in the content data, and an object analysis unit that extracts an object included in the content data using the recognition dictionary. The data output determination unit determines to output the content data to an external device associated with a user who, among the users associated with the social information, is related to the object extracted by the object analysis unit and whose closeness is equal to or higher than a predetermined threshold.
- With this configuration, a user who is associated with an object included in the content data and whose closeness is equal to or higher than a predetermined threshold is selected based on the social information, and it is determined whether the content data can be provided to the external device associated with that user. Therefore, the process of sharing content data can be automated while suppressing the operation load on the user.
- In addition, a user having an intimate relationship with the extracted specific person can be handled as a transmission destination candidate, as a user related to the face object. Therefore, even when no user having an intimate relationship appears as an object in the content data, the content data can still be transmitted to a user who has an intimate relationship.
- The data processing apparatus further includes a communication unit that communicates with the external device via a communication network, and the data output unit outputs the content data by transmitting it to the external device via the communication unit.
- The data processing apparatus further includes a data conversion unit that converts the content data into an arbitrary format. The data output determination unit further determines, according to the closeness, that the content data is to be converted and output, and the data output unit causes the data conversion unit to convert the content data according to the determination result and outputs the converted content data.
- With this configuration, the content data is converted and output according to the closeness of the user who is the provision destination candidate. For example, for grandparents with a high closeness, the data can be converted so that the grandchild's face is enlarged into a large picture, while for a person who is not very close it can be converted into a small, notice-sized image before transmission. In this way, appropriate content data can be converted into an appropriate format and sent to users having an intimate relationship, without imposing on the user the operational burden of instructing the conversion each time according to the relationship with the transmission partner or the partner's characteristics.
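- The following sketch illustrates one possible form of such closeness-dependent conversion; it is an assumption for illustration, Pillow is used only as an example image library, and the closeness boundaries are not taken from the patent.

```python
from PIL import Image  # example library choice, not specified by the document

def convert_for_recipient(image_path: str, closeness: float) -> Image.Image:
    """Convert an image into a size that depends on the recipient's closeness."""
    img = Image.open(image_path)
    if closeness >= 0.8:
        return img                                            # very close: full-size picture
    if closeness >= 0.5:
        return img.resize((img.width // 2, img.height // 2))  # moderately close: half size
    return img.resize((160, 120))                             # distant: small, notice-sized image
```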
- The data processing apparatus further includes a companion history management unit that acquires, as companion history information, a history of the near field communication performed with the external device via the communication unit. The data output determination unit further determines to output the content data to an external device that has performed the near field communication indicated by the companion history information and that is associated with a user, among the users associated with the social information, whose closeness is equal to or higher than a predetermined threshold.
- With this configuration, the content data is output to an external device that has performed near field communication and that is associated with a user, among the users associated with the social information, whose closeness is equal to or higher than the predetermined threshold. For this reason, for example, by identifying the device of an acquaintance who was nearby when the content data was photographed during an actual trip or the like, it becomes possible to transmit to that acquaintance's device only the photographs taken in the time period spent together. As a result, data matching the actual action history can be transmitted to users having an intimate relationship, without imposing on the user the operational burden of selecting the data photographed in the same time period.
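- A rough sketch of this idea is shown below; the timestamps and data layout are entirely hypothetical, and only the device ID reuses an example from the description.

```python
from datetime import datetime

# companion history: device_id -> intervals during which that device was detected nearby
companion_history = {
    "DSC-Z": [(datetime(2011, 3, 1, 10, 0), datetime(2011, 3, 1, 14, 0))],
}

photos = [
    {"name": "C-1", "taken": datetime(2011, 3, 1, 11, 30)},
    {"name": "C-2", "taken": datetime(2011, 3, 1, 18, 45)},
]

def photos_to_send(target_device: str) -> list:
    """Select only photos taken while the target device was nearby."""
    intervals = companion_history.get(target_device, [])
    return [p["name"] for p in photos
            if any(start <= p["taken"] <= end for start, end in intervals)]

print(photos_to_send("DSC-Z"))  # ['C-1'] under these assumed timestamps
```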
- The data processing apparatus further includes a social information update unit that acquires, via the communication unit, state information indicating whether the external device can receive the content data, and updates the social information so that it includes the state information. The data output determination unit further determines whether to output the content data using the state information included in the latest social information updated by the social information update unit.
- With this configuration, the determination is made using social information that reflects the latest state information indicating whether the external device can receive the content data, so the content data can be transmitted to external devices that are actually able to receive it, and data sharing that more accurately reflects the relationship between users can be achieved.
- The data processing apparatus further includes a social information update unit that acquires the latest social information from the external device via the communication unit and updates the social information. The data output determination unit further determines whether to output the content data using the latest social information updated by the social information update unit.
- The data processing apparatus further includes an application unit that executes an application providing a communication function with the external device via the communication unit, a data storage unit that stores the processing history of the application execution as history data, and a social information update unit that updates the closeness included in the social information using at least one of the communication partner information included in the history data, the cumulative number of communications, the access frequency, the access frequency increase/decrease tendency, and the text of the transmitted and received data. The data output determination unit further determines whether to output the content data using the latest social information updated by the social information update unit.
- With this configuration, the closeness included in the social information is updated using at least one of the communication partner information, the cumulative number of communications, the access frequency, the access frequency increase/decrease tendency, and the text of the transmitted and received data, so the latest state is reflected in the social information.
- The data processing apparatus further includes a sensor that detects peripheral information indicating the position of the data processing apparatus, a data storage unit that stores history data of the detection results of the sensor, and a social information update unit that acquires history data of the peripheral information of the external device via the communication unit and updates the social information of the external device so that it includes the acquired history data. The social information update unit further compares the history data stored in the data storage unit with the acquired history data of the external device, and updates the closeness included in the social information using at least one of the relative distance between the position information of the data processing apparatus and that of the external device, the area information, and the accompanying frequency change tendency. The data output determination unit further determines whether to output the content data using the latest social information updated by the social information update unit.
- With this configuration, the closeness included in the social information is updated using at least one of the relative distance between the position information of the data processing apparatus and that of the external device, the area information, and the accompanying frequency change tendency, so the latest state is reflected. Whether to provide data is then determined using the social information including the updated closeness. For this reason, for example, it is possible to identify the device of an acquaintance who was nearby when the data was captured during an actual trip or the like, and to transmit to that acquaintance's device only the photographs of the places or time periods spent together. As a result, data matching the actual action history can be transmitted to users having an intimate relationship, without imposing on the user the operational burden of selecting the data photographed in the same time period.
- The data processing apparatus further includes a social information management unit that, according to the closeness, acquires data of the recognition dictionary used by the object analysis unit from the external device and updates the recognition dictionary, and the object analysis unit extracts an object included in the content data using the recognition dictionary updated by the social information management unit.
- With this configuration, data of the recognition dictionary is acquired from the external device according to the closeness, the recognition dictionary is updated, and objects included in the content data are extracted using the latest recognition dictionary. That is, when the recognition dictionary used to analyze objects included in content data, or the meta information associated with the recognition dictionary, is updated, edited, stored, or provided externally, the relationship with the user who owns the recognition dictionary and meta information and requests the processing is determined using the closeness between users represented by the social information. Therefore, for example, for a user who wishes to edit the recognition dictionary, editing can be permitted only when a closeness exceeding a predetermined threshold is set, so that editing by malicious users can be avoided and only editing by close users can be permitted.
- In addition, since recognition dictionaries can be shared among users who are in close relationships, the user operation load required to train a recognition dictionary can be reduced compared to the case where each device learns its recognition dictionary individually.
- The data output determination unit further determines to preferentially output, among two or more items of meta information, the meta information associated with the higher closeness included in the social information, and the data output unit outputs the content data and the meta information determined to be output preferentially.
- The present invention can be realized not only as such a data processing apparatus but also as a data processing method whose steps correspond to the processing performed by the characteristic processing units of the data processing apparatus.
- The present invention can also be realized as one or more integrated circuits including those processing units.
- The present invention can also be realized as a program that causes a computer to execute the steps included in the above data processing method. Needless to say, such a program can be distributed via a recording medium such as a CD-ROM or a transmission medium such as the Internet.
- According to the data processing apparatus of the present invention, it is possible to realize data sharing with family members and acquaintances while suppressing the burden of operation on the user.
- FIG. 1 is a block diagram showing the configuration of a data processing apparatus according to a first embodiment of the present invention.
- FIG. 2 is a diagram showing an example of meta information of content data in the first embodiment of the present invention.
- FIG. 3 is a diagram showing an example of social information in the first embodiment of the present invention.
- FIG. 4 is a diagram showing a node concept of social information in the first embodiment of the present invention.
- FIG. 5 is a diagram showing an example of the priority determination threshold value table according to the first embodiment of the present invention.
- FIG. 6 is a flow chart showing the flow of data processing in the first embodiment of the present invention.
- FIG. 7 is a flowchart showing a flow of data relationship extraction processing according to the first embodiment of the present invention.
- FIG. 8 is a diagram showing an example of a relation group table according to Embodiment 1 of the present invention.
- FIG. 9 is a flow chart showing a flow of data output determination processing in the first embodiment of the present invention.
- FIG. 10A is a diagram showing an example of the internal data list in the first embodiment of the present invention.
- FIG. 10B is a diagram showing an example of the internal data list in the first embodiment of the present invention.
- FIG. 11 is a block diagram showing the configuration of a data processing apparatus according to a second embodiment of the present invention.
- FIG. 12 is a diagram showing an example of meta information of content data in the second embodiment of the present invention.
- FIG. 13 is a diagram showing an image of content data in the second embodiment of the present invention.
- FIG. 14 is a flowchart showing a flow of data relationship extraction processing according to the second embodiment of the present invention.
- FIG. 15 is a flowchart showing a flow of object analysis processing in the second embodiment of the present invention.
- FIG. 16 is a diagram showing an image of content data in the second embodiment of the present invention.
- FIG. 17 is a diagram showing an example of the priority level determined from the object recognition result and the intimacy degree of social information in the second embodiment of the present invention.
- FIG. 18 is a flowchart showing a flow of data output determination processing in the second embodiment of the present invention.
- FIG. 19 is a diagram showing an example of the attribute correction table according to the second embodiment of the present invention.
- FIG. 20 is a diagram showing an example of social information in the second embodiment of the present invention.
- FIG. 21 is a diagram showing a node concept of social information in the second embodiment of the present invention.
- FIG. 22 is a diagram showing an example of the internal data list in the second embodiment of the present invention.
- FIG. 23 is a diagram showing an example of a device attribute table according to the second embodiment of the present invention.
- FIG. 24A is a diagram showing an example of a screen output result in the second embodiment of the present invention.
- FIG. 24B is a diagram showing an example of the screen output result in the second embodiment of the present invention.
- FIG. 25 is a block diagram showing a configuration of a data processing device in the third embodiment of the present invention.
- FIG. 26 is a flowchart showing a flow of data output determination processing in the third embodiment of the present invention.
- FIG. 27A is a diagram showing an example of a sub attribute correction table according to Embodiment 3 of the present invention.
- FIG. 27B is a diagram showing an example of a sub attribute correction table according to Embodiment 3 of the present invention.
- FIG. 28 is a diagram showing an example of social information in the third embodiment of the present invention.
- FIG. 29 is a diagram showing an example of the internal data list of the data output determination process in the third embodiment of the present invention.
- FIG. 30A is a diagram showing an example of a data conversion table according to Embodiment 3 of the present invention.
- FIG. 30B is a diagram showing an example of a data conversion table according to Embodiment 3 of the present invention.
- FIG. 31 is a diagram showing an example of the data output result in the third embodiment of the present invention.
- FIG. 32 is a block diagram showing a configuration of a data processing device in the fourth embodiment of the present invention.
- FIG. 33 is a diagram showing an example of accumulation of content data in the fourth embodiment of the present invention.
- FIG. 34 is a diagram showing an example of social information in the fourth embodiment of the present invention.
- FIG. 35 is a flowchart showing a flow of companion history management processing in the fourth embodiment of the present invention.
- FIG. 36 is a diagram showing an example of companion history data according to Embodiment 4 of the present invention.
- FIG. 37 is a flowchart showing a flow of data output determination processing in the fourth embodiment of the present invention.
- FIG. 38 is a diagram showing an example of imaging time zone distribution according to Embodiment 4 of the present invention.
- FIG. 39 is a block diagram showing a configuration of a data processing device in the fifth embodiment of the present invention.
- FIG. 40 is a diagram showing an example of accumulation of content data in the fifth embodiment of the present invention.
- FIG. 41 is a flowchart showing a flow of history saving processing in the fifth embodiment of the present invention.
- FIG. 42 is a diagram showing an example of movement of each device in the fifth embodiment of the present invention.
- FIG. 43 is a diagram showing an example of position information history data for each device according to the fifth embodiment of the present invention.
- FIG. 44 is a flowchart showing a flow of data output determination processing in the fifth embodiment of the present invention.
- FIG. 45A is a diagram showing an example of a social information update threshold according to Embodiment 5 of the present invention.
- FIG. 45B is a diagram showing an example of the social information update threshold in the fifth embodiment of the present invention.
- FIG. 46 is a diagram showing an example of the data output threshold according to the fifth embodiment of the present invention.
- FIG. 47 is a diagram showing an example of relative position information calculation result in Embodiment 5 of the present invention.
- FIG. 48 is a block diagram showing another configuration of the data processing apparatus according to the fifth embodiment of the present invention.
- FIG. 49 is a block diagram showing a configuration of a data processing apparatus according to a sixth embodiment of the present invention.
- FIG. 50 is a flowchart showing a flow of social information management processing according to the sixth embodiment of the present invention.
- FIG. 51 is a diagram showing an example of user information in the sixth embodiment of the present invention.
- FIG. 52 is a diagram showing an example of the screen output result in the sixth embodiment of the present invention.
- FIG. 53 is a block diagram showing the minimum configuration of the data processing apparatus according to the embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of data processing apparatus 100 in the first embodiment of the present invention.
- The data processing apparatus 100 includes an input unit 101, a data storage unit 102, a content data storage unit 103, an application unit 104, a data relationship extraction unit 105, a social information storage unit 106, a data output determination unit 107, and an output unit 108 (configuration of the first embodiment).
- The data processing apparatus 100 is, for example, a video recorder or home server into which an external storage medium storing content data such as image data can be inserted, or a digital still camera, digital video camera, or the like capable of capturing content data such as still images and moving images.
- The input unit 101 acquires the content data to be processed through an input device mounted on the data processing apparatus 100 (for example, a reader for an external storage medium or a built-in camera module) and transfers the content data to the data storage unit 102.
- The data storage unit 102 stores the content data transferred from the input unit 101 in the content data storage unit 103, a storage medium such as a hard disk or flash memory, in a format that can be read out again.
- The application unit 104 has various functions to be provided to the user of the data processing apparatus 100 (for example, a content viewer display function, a slide show reproduction function, and a print output function) and provides each function according to instructions from the user. In doing so, the application unit 104 reads, at an arbitrary timing, the content data stored in the content data storage unit 103 by the data storage unit 102 and performs the desired processing.
- the data relationship extraction unit 105 reads the content data stored in the content data storage unit 103 by the data storage unit 102, and extracts information indicating the relationship between the stored content data as a relationship output result.
- The social information storage unit 106 stores social information about users who own or use the data processing apparatus 100, and about users who are related to such users even if they do not directly use the data processing apparatus 100.
- the social information is information including closeness indicating a degree of closeness between predetermined users, and information for associating content data and closeness.
- the data output determination unit 107 refers to the social information, and determines that the content data associated with the intimacy degree is to be output when the intimacy degree is equal to or more than a predetermined threshold.
- More specifically, the data output determination unit 107 determines to output content data that is associated with a closeness equal to or higher than the predetermined threshold and that has the predetermined relationship indicated by the information extracted by the data relationship extraction unit 105.
- the predetermined threshold is, for example, 0.5 when the intimacy degree takes a value of 0 to 1.
- the data output determination unit 107 requests the data relationship extraction unit 105 to output the relationship output result of the target content data. Then, using the relationship output result and the social information stored in the social information storage unit 106, the data output determination unit 107 determines whether or not the content data can be output. Then, the data output determination unit 107 returns the determination result to the application unit 104.
- the application unit 104 instructs the output unit 108 to output a screen display or the like for the content data for which data output is possible. That is, the application unit 104 has a function as a data output unit that outputs the content data to the output unit 108 when the data output determination unit 107 determines that the content data is to be output.
- FIG. 2 is a diagram showing an example of meta information of content data in the first embodiment of the present invention.
- In addition, the data storage unit 102 creates and holds a database of meta information summarizing the content data stored in the content data storage unit 103, which can be referred to and rewritten as required by the respective processing units.
- The meta information includes, for example, a data name (also referred to as an object identifier, data path, or the like) that enables access to each data item, an extension indicating the file format type, a type indicating the content data type (in this embodiment, only “Image”, meaning a still image, is used for simplicity of explanation), a device ID uniquely assigned to the device that generated the content data (in this embodiment, represented for simplicity by a short character string such as “DSC-X” that distinguishes the devices), and a shooting date and time indicating when the content data was generated.
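- As an illustrative sketch only (the field values below, such as the extension and the timestamp, are assumptions), the meta information described above could be modeled as follows:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ContentMeta:
    data_name: str      # object identifier / data path, e.g. "C-1"
    extension: str      # file format type
    content_type: str   # content data type; only "Image" (still image) is used here
    device_id: str      # device that generated the data, e.g. "DSC-X"
    shot_at: datetime   # shooting date and time

meta = ContentMeta("C-1", "JPG", "Image", "DSC-X", datetime(2011, 3, 1, 10, 0))
```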
- FIG. 3 is a diagram showing an example of social information in the first embodiment of the present invention.
- The social information shown in FIG. 3 is stored in the social information storage unit 106 and holds, for each identification ID, a name representing the name or nickname of a user who owns or uses the data processing apparatus 100, a contact address such as an e-mail address for each user, an owned device ID uniquely assigned to a device owned by each user, and a closeness representing the intimacy with the social information owner.
- the owned device ID is information for associating content data with closeness.
- The user “Mike” with identification ID “0” is the social information owner who owns the data processing apparatus 100, so his closeness value is “-” (no numerical value is required). The closeness between the user “Mike” and each other user is managed as a numerical value from 0 to less than 1.
- Here, the closeness has been described as a value normalized to the range from 0 to less than 1, but the management method is not limited to this; a point system that increases without an upper limit may be used, or the granularity may be coarsened into several levels such as A to E.
- In this example only “Mike” is shown as a social information owner, but there may be a plurality of social information owners, who can be expressed by the same management method.
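- A minimal sketch of such social information entries is given below; the identification IDs other than “0” and the exact record layout are assumptions, while the names, device IDs, and closeness values follow the examples in FIG. 3.

```python
social_information = [
    {"id": 0, "name": "Mike",  "owned_device_id": "DSC-X", "closeness": None},  # owner
    {"id": 1, "name": "Alice", "owned_device_id": "DSC-Z", "closeness": 0.83},
    {"id": 2, "name": "Paul",  "owned_device_id": "CM-P",  "closeness": 0.51},
    {"id": 3, "name": "James", "owned_device_id": "DSC-Y", "closeness": 0.42},
]

def closeness_of(device_id: str):
    """Look up the closeness via the owned device ID, which links content data to closeness."""
    for entry in social_information:
        if entry["owned_device_id"] == device_id:
            return entry["closeness"]
    return None  # no entry: unknown user
```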
- FIG. 4 is a diagram showing a node concept of social information in the first embodiment of the present invention.
- FIG. 5 is a diagram showing an example of the priority determination threshold value table according to the first embodiment of the present invention.
- the social information expresses the intimacy with each user as a numerical value of intimacy, centering on the social information owner “Mike”.
- The higher the value (closer to 1), the closer the relationship; the lower the value (closer to 0), the more distant the relationship.
- FIG. 5 is a priority determination threshold value table internally held by the data output determination unit 107, and using this threshold value, the priority level of data output for each user is calculated from the closeness.
- The priority levels are configured in four stages, A to C and Z. In FIG. 4, “Alice” and “Julia” both have a closeness of 0.83 and priority level “A”, while “Tom” and “Paul” have priority level “B” with closenesses of 0.53 and 0.51, respectively.
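- The mapping of FIG. 5 could be sketched as below; the exact cut-off values are assumptions chosen so that the closeness values quoted in this description (0.83, 0.53, 0.51, 0.42) fall into the levels stated here.

```python
def priority_level(closeness: float) -> str:
    """Map a closeness value to a priority level, mimicking the threshold table of FIG. 5."""
    if closeness >= 0.8:
        return "A"   # e.g. Alice and Julia at 0.83
    if closeness >= 0.5:
        return "B"   # e.g. Tom (0.53) and Paul (0.51)
    if closeness >= 0.3:
        return "C"   # e.g. James (0.42); the 0.3 boundary is an assumption
    return "Z"

assert priority_level(0.83) == "A" and priority_level(0.51) == "B" and priority_level(0.42) == "C"
```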
- FIG. 6 is a flow chart showing the flow of data processing in the first embodiment of the present invention.
- FIG. 7 is a flowchart showing a flow of data relationship extraction processing according to the first embodiment of the present invention.
- FIG. 8 is a diagram showing an example of a relation group table according to Embodiment 1 of the present invention.
- FIG. 9 is a flow chart showing a flow of data output determination processing in the first embodiment of the present invention.
- FIGS 10A and 10B are diagrams showing an example of the internal data list in the first embodiment of the present invention.
- Using FIG. 6, data processing will be described in which the application unit 104 of the data processing apparatus 100 displays a slide show on the screen as a screen saver, selecting and displaying, from the content data stored by the data storage unit 102, content shot by users who have an intimate relationship.
- First, the application unit 104 activates one of its functions (here, the slide show reproduction function) at an arbitrary timing, for example when there has been no input operation from the user for 10 minutes or more, and confirms whether the data storage unit 102 holds at least one item of target data valid for executing that function (S101).
- If no target data exists in step S101 (N in S101), the data processing ends.
- If target data exists (Y in S101), the data relationship extraction unit 105 acquires the entity of the target data (S102).
- the data relationship extraction unit 105 performs processing of extracting the relationship between the target data (S103). This process will be described later with reference to FIGS. 7 to 8.
- Next, the data output determination unit 107 determines whether data output is permitted, according to the relationship output result extracted by the data relationship extraction unit 105 and the social information stored in the social information storage unit 106 (S104). This process will be described later with reference to FIGS. 9 to 10B.
- the data output determination unit 107 stores, as a processing queue, the result of the determination on the relationship group indicated by the relationship output result extracted by the data relationship extraction unit 105 (S105).
- Next, the data output determination unit 107 confirms whether the data output determination has been completed for all the relation groups (S106).
- If it has not been completed, the data output determination unit 107 changes the target relation group (S107) and repeats the processing from step S104.
- If it has been completed, the data output determination unit 107 notifies the application unit 104 of the end of the determination, and the application unit 104 executes the stored processing queue (S108).
- the application unit 104 executes slide show reproduction using content data permitted to output data.
- the data relationship extraction unit 105 determines whether there is an item that can be a common element in the meta information of the target content data shown in FIG. 2 (S201).
- If the data relationship extraction unit 105 determines that there is a common element (Y in S201), it registers the extracted common element (for example, the content type or the device ID; here, only the shooting date and time is used for simplicity) as a common attribute in the relation group table held internally by the data relationship extraction unit 105 (S202).
- the data relationship extraction unit 105 adds information indicating content data corresponding to each common attribute to the relationship group table (S203).
- FIG. 8 shows a relation group table, one of the relation extraction results output by the data relationship extraction unit 105; among the content data shown in FIG. 2, three groups are formed with the shooting date as the common attribute.
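- A sketch of this grouping step is shown below; the data names and dates are hypothetical, and the relation group table format is simplified.

```python
from collections import defaultdict
from datetime import date

items = [  # hypothetical meta information: data name and shooting date
    {"name": "P-1", "shot_on": date(2011, 3, 1)},
    {"name": "P-2", "shot_on": date(2011, 3, 1)},
    {"name": "P-3", "shot_on": date(2011, 3, 2)},
]

def build_relation_groups(items):
    """Group content data whose meta information shares the shooting date as a common attribute."""
    by_date = defaultdict(list)
    for item in items:
        by_date[item["shot_on"]].append(item["name"])
    return {f"G{i + 1}": {"common_attribute": str(day), "data": names}
            for i, (day, names) in enumerate(sorted(by_date.items()))}

print(build_relation_groups(items))
# e.g. {'G1': {'common_attribute': '2011-03-01', 'data': ['P-1', 'P-2']}, 'G2': {...}}
```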
- the data output determination unit 107 reads data information from the target relationship group table shown in FIG. 8 (S301). Furthermore, the data output determination unit 107 reads the social information stored in the social information storage unit 106 (S302).
- Next, the data output determination unit 107 compares the meta information (see FIG. 2) of the content data referenced from the data information with the read social information (S303), and determines whether the meta information matches the social information (S304).
- If the data output determination unit 107 determines that the meta information and the social information do not match (N in S304), the data output determination process ends.
- If they match (Y in S304), the data output determination unit 107 determines the priority level from the closeness according to the priority determination threshold table shown in FIG. 5 (S305).
- Next, the data output determination unit 107 determines whether the determined priority level satisfies a predetermined condition (here, that the priority level is A or B), that is, whether the closeness is equal to or higher than a predetermined threshold (S306). If it determines that the condition is satisfied (Y in S306), the data output determination unit 107 registers the information in the internal data list it holds (S307). On the other hand, if it determines that the priority level does not satisfy the condition (N in S306), it proceeds to the next step (S308) without registering the information in the internal data list.
- Next, the data output determination unit 107 determines whether the determination process has been completed for all the data described in the relation group table (S308). If it determines that the process has not been completed yet (N in S308), it changes the target data and repeats the processing from step S305 (S309). On the other hand, if it determines that all the data has been processed (Y in S308), the determination process ends.
- Here, in relation to step S305, the priority level of each user referenced using the device ID in FIG. 2 is described for reference.
- the user "Mike” who is the owner of the device ID "DSC-X" is a social information owner as shown in FIG. 3, and the priority level is not assigned as "-".
- the user "James” who is the owner of the device ID “DSC-Y” has a familiarity level of 0.42, the priority level is “C”, and the user “Alice” who is the owner of the device ID “DSC-Z”. Is a closeness level 0.83 and the priority level is “A”, and the user “Paul” who is the owner of the device ID “CM-P” is a closeness degree 0.51 and the priority level “B”.
- the owner of the device ID “DSC-V” has no corresponding information in the social information, and becomes an unknown user for the social information owner “Mike”, and the priority level is not given and becomes “unknown”. .
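- Continuing the sketches above (this reuses closeness_of() and priority_level() and is therefore only an illustration of steps S305 to S307, not the patented code), the internal data list could be built like this:

```python
def build_internal_data_list(group_items):
    """Register only data whose owner's priority level is "A" or "B" (steps S305-S307)."""
    internal_data_list = []
    for item in group_items:                 # item: {"name": ..., "device_id": ...}
        closeness = closeness_of(item["device_id"])
        if closeness is None:                # owner's own device or an unknown user
            continue
        if priority_level(closeness) in ("A", "B"):
            internal_data_list.append({"name": item["name"], "device_id": item["device_id"]})
    return internal_data_list
```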
- FIGS 10A and 10B are diagrams showing an example of the internal data list in the first embodiment of the present invention.
- FIG. 10A shows an example of the internal data list output as a result of determining the data group with group ID (GID) “G1” in the relation group table shown in FIG. 8 according to the processing of steps S306 to S307.
- The internal data list output by the data output determination unit 107 is returned to the application unit 104, and when the application unit 104 of the data processing apparatus 100 displays the slide show on the screen as a screen saver, it controls the displayed content using this internal data list.
- As described above, the relationship between the user who is the social information owner and other users is determined using the closeness values in the social information, and whether the content data can be output is decided based on the determination result. For this reason, it becomes possible, for example, to permit browsing and provision of the target data only when the closeness exceeds the predetermined threshold, without burdening the user with operations such as selecting the desired target data from a large number of items.
- content data related to the user having an intimate relationship can be output for browsing or the like.
- Furthermore, whether content data can be output is determined according to whether the items of content data have a predetermined relationship to one another. For this reason, it becomes possible, for example, to permit browsing and provision only of content data captured on a predetermined day, based on the shooting date and time of the content data, again without burdening the user with operations such as selecting the desired target data from a large number of items. Content data captured on a specific day can thus be output for viewing and the like.
- FIG. 11 is a block diagram showing the configuration of a data processing apparatus 100A according to the second embodiment of the present invention.
- the same components as in FIG. 1 will be assigned the same reference numerals and descriptions thereof will be omitted.
- The data processing apparatus 100A includes a recognition dictionary storage unit 109, an object analysis unit 110, and a communication unit 111 in addition to the components shown in FIG. 1. Further, the data processing apparatus 100A is connected to an external data processing apparatus 100B and a data processing apparatus 100C via a network 200 (configuration of the second embodiment).
- The data processing apparatus 100A is, for example, a video recorder or home server into which an external storage medium storing image data can be inserted and which can store a plurality of read image data items, or a digital still camera or digital video camera that can capture and store a plurality of still images and moving images.
- the recognition dictionary storage unit 109 is a memory that stores a recognition dictionary for recognizing an object included in content data.
- the object analysis unit 110 extracts an object included in the content data using the recognition dictionary.
- the communication unit 111 communicates with the data processing apparatus 100B and the data processing apparatus 100C, which are external devices, via the network 200, which is a communication network.
- The data output determination unit 107 determines to output the content data to an external device associated with a user who, among the users associated with the social information, is related to the object extracted by the object analysis unit 110 and whose closeness is equal to or higher than a predetermined threshold.
- the predetermined threshold is, for example, 0.5 when the intimacy degree takes a value of 0 to 1.
- the user associated with an object includes not only the user indicated by the object but also a user whose intimacy with the user is equal to or greater than a predetermined value (for example, 0.95).
- the application unit 104 also has a function as a data output unit that outputs content data to the external device via the communication unit 111.
- FIG. 12 is a diagram showing an example of meta information of content data in the second embodiment of the present invention.
- FIG. 13 is a diagram showing an image of content data in the second embodiment of the present invention.
- Here, it is assumed that the data storage unit 102 has stored, in the content data storage unit 103, four still image photographs taken with the device of device ID “DSC-X” as the latest content data.
- The data with data name “C-1” is a “landscape photo”, “C-2” is a “portrait photo” in which only the single person “Mike” appears, “C-3” is a “group photo” in which “Mike” appears at the far right together with a few other people, and “C-4” is a “portrait photo” of “Mike”'s friend “Alice”.
- Data processing will be described below in which the application unit 104 of the data processing apparatus 100A transfers the latest content data stored by the data storage unit 102 to the external data processing apparatus 100B or 100C, selecting and transferring, from among that content data, items whose objects include a user having an intimate relationship with the owner or a user having an intimate relationship with such a user.
- FIG. 14 is a flowchart showing a flow of data relationship extraction processing according to the second embodiment of the present invention.
- The process flow of the data relationship extraction unit 105 shown in FIG. 14 includes object analysis processing (S401) that extracts objects included in the target data, such as a person, the characters of a signboard, a car, an animal, or a natural object such as a mountain or a tree.
- Since the processing from step S402 onward is the same as the processing from step S201 of FIG. 7, its description is omitted.
- FIG. 15 is a flowchart showing a flow of object analysis processing in the second embodiment of the present invention.
- The data relationship extraction unit 105 of the data processing apparatus 100A reads, for analysis, the entity of the target data accumulated by the data storage unit 102 at an arbitrary timing, for example when the target data input through the input unit 101 is stored by the data storage unit 102, or a predetermined time after that storage.
- the data relationship extraction unit 105 instructs the object analysis unit 110 to analyze the object, and the object analysis unit 110 internally develops the read data (S501).
- Next, the object analysis unit 110 determines whether an object is present in the target data (S502). If no object is present (N in S502), the object analysis unit 110 ends the object analysis process.
- If the object analysis unit 110 determines that an object is present in the target data (Y in S502), it evaluates, using the recognition dictionary, the similarity category or the like to which the extracted object belongs (S503).
- Next, the object analysis unit 110 determines whether the evaluation result indicates that the object can be identified as an object registered in the recognition dictionary, such as a specific person (S504). If it determines that identification is possible (Y in S504), it adds the evaluation result (the attribute type such as being a person, the name of the specific person estimated to be similar, the degree of similarity, and so on) to the meta information (S505).
- On the other hand, if the object analysis unit 110 determines that the evaluation result indicates that identification is impossible (N in S504), it adds an indication that the object could not be identified to the meta information (S506).
- Next, the object analysis unit 110 determines whether all objects included in the expanded target data have been processed (S507). If it determines that not all objects have been processed yet (N in S507), it changes the target object and repeats the processing from step S503 (S508).
- If the object analysis unit 110 determines that all objects have been processed (Y in S507), it notifies the data relationship extraction unit 105 that the object analysis processing has been completed, and the data relationship extraction unit 105 repeats this processing until no unprocessed target data remains.
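- The loop of steps S502 to S508 could be sketched as follows; the feature vectors, the toy similarity function, and the 0.6 threshold are placeholders for the real recognition dictionary evaluation.

```python
recognition_dictionary = {            # person name -> reference feature (illustrative only)
    "Mike": [0.9, 0.1],
    "James": [0.2, 0.8],
}

def similarity(a, b) -> float:
    """Toy similarity in [0, 1]; stands in for the dictionary-based evaluation."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def analyze_objects(detected_features, threshold=0.6):
    """Attach a recognition result ("person" or "unknown") to each detected object."""
    meta = []
    for object_id, feature in enumerate(detected_features, start=1):
        name, score = max(((n, similarity(feature, ref))
                           for n, ref in recognition_dictionary.items()),
                          key=lambda pair: pair[1])
        if score >= threshold:                     # identifiable (S504 Y -> S505)
            meta.append({"object_id": object_id, "person": name, "similarity": round(score, 2)})
        else:                                      # not identifiable (S504 N -> S506)
            meta.append({"object_id": object_id, "person": "unknown"})
    return meta

print(analyze_objects([[0.85, 0.15], [0.0, 0.0]]))
# [{'object_id': 1, 'person': 'Mike', 'similarity': 0.95}, {'object_id': 2, 'person': 'unknown'}]
```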
- FIG. 16 is a diagram showing an image of content data in the second embodiment of the present invention.
- FIG. 17 is a diagram showing an example of the priority level determined from the object recognition result and the intimacy degree of social information in the second embodiment of the present invention.
- The object analysis unit 110 sequentially analyzes the objects included in the target data while assigning object IDs, and, as shown in (a) of FIG. 17, the recognition result of the objects included in each item of target data is added to the meta information.
- For example, the object with object ID (MID) “1” is determined to be “James” with a similarity of “71%”, the object with object ID “4” to be “Kevin” with a similarity of “54%”, and the object with object ID “5” to be “Mike” with a similarity of “88%”.
- For the objects with object IDs “2” and “3”, it is determined that there is no person who can be judged similar according to the recognition dictionary, and in this case “unknown” is written in step S506.
- FIG. 18 is a flowchart showing a flow of data output determination processing in the second embodiment of the present invention.
- FIG. 19 is a diagram showing an example of the attribute correction table according to the second embodiment of the present invention.
- FIG. 20 is a diagram showing an example of social information in the second embodiment of the present invention.
- FIG. 21 is a diagram showing a node concept of social information in the second embodiment of the present invention.
- FIG. 22 is a diagram showing an example of the internal data list in the second embodiment of the present invention.
- FIG. 23 is a diagram showing an example of a device attribute table according to the second embodiment of the present invention.
- FIGS. 24A and 24B are diagrams showing an example of the screen output result in the second embodiment of the present invention.
- Here, the data output determination unit 107 internally holds an attribute correction table used to correct, based on attribute information, the closeness of the social information stored in the social information storage unit 106.
- For example, depending on the attribute, the closeness is corrected by “+0.20” in some cases and by “+0.10” in others.
- FIG. 20 shows attribute information added to the social information shown in FIG. 3 of the first embodiment.
- This attribute information can be registered, for example, through a “social information setting tool” or the like provided among the various functions of the application unit 104 of the data processing apparatus 100A, which rewrites the social information stored in the social information storage unit 106; the user can change and save it by directly specifying the relationships between persons.
- Alternatively, by holding a key holder that carries social information in a portable form over the input unit 101 of the data processing apparatus 100A, the carried information may be reflected, via the application unit 104, in the attribute information of the social information stored in the social information storage unit 106.
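- A minimal sketch of such a correction is given below; the attribute names are placeholders, and only the correction amounts +0.20 and +0.10 are taken from the description of FIG. 19.

```python
from typing import Optional

ATTRIBUTE_CORRECTION = {   # hypothetical attribute names with the example correction amounts
    "attribute_a": 0.20,
    "attribute_b": 0.10,
}

def corrected_closeness(closeness: float, attribute: Optional[str]) -> float:
    """Apply the attribute correction table (S603-S604), keeping the value below 1."""
    correction = ATTRIBUTE_CORRECTION.get(attribute, 0.0)
    return min(closeness + correction, 0.99)

print(round(corrected_closeness(0.42, "attribute_b"), 2))  # 0.52
```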
- First, the data output determination unit 107 reads data information from the relation group table output by the data relationship extraction unit 105 (S601), and further reads the social information stored in the social information storage unit 106 (S602).
- Next, the data output determination unit 107 determines whether to apply correction based on the attribute correction table (S603). If the correction is applied (Y in S603), the closeness is corrected according to the attribute correction table (S604); if it is not applied (N in S603), step S604 is skipped.
- FIG. 21 is a conceptual diagram showing how the user nodes and closeness values spreading out from the social information owner “Mike” are corrected by the attribute correction table.
- The data output determination unit 107 further compares the meta information of the content data referenced from the data information with the social information (S605), and determines whether the object analysis result, such as a similar person, matches the social information (S606). If it determines that the object analysis result does not match the social information (N in S606), the data output determination process ends.
- When the data output determination unit 107 determines that the object analysis result matches the social information (Y in S606), it determines a priority level from the closeness in the social information according to the priority determination threshold table (S607). That is, as shown in (b) of FIG. 17, a priority level is assigned to each extracted object.
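- The priority determination of step S607 can be pictured as a threshold lookup. The sketch below assumes a hypothetical priority determination threshold table in which the levels "A", "B", and "Z" correspond to descending closeness ranges; the cut-off values are placeholders, not the actual table of the embodiment.

```python
# Hypothetical priority determination threshold table (S607): closeness is
# mapped to a priority level by comparing it against descending thresholds.
PRIORITY_THRESHOLDS = [           # (minimum closeness, priority level)
    (0.75, "A"),
    (0.40, "B"),
    (0.00, "Z"),
]


def priority_level(closeness):
    """Return the priority level for a given closeness value."""
    for minimum, level in PRIORITY_THRESHOLDS:
        if closeness >= minimum:
            return level
    return "Z"


if __name__ == "__main__":
    # Each extraction object is given a priority level, as in (b) of FIG. 17.
    for user, closeness in {"James": 0.45, "Julia": 0.94, "Kevin": 0.06}.items():
        print(user, priority_level(closeness))
```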
- Next, the data output determination unit 107 determines whether the target data contains a plurality of extracted objects (S608). When it determines that the target data contains a plurality of extracted objects (Y in S608), it designates the social information owner, or the user with the highest closeness, as the attention area (S609).
- When the data output determination unit 107 determines that the target data does not contain a plurality of extracted objects (N in S608), it designates the valid user with the highest closeness as the attention area (S610).
- The method of selecting the attention area is not limited to this; the extracted object with the largest area ratio may be selected, or the one with the highest degree of similarity to a person extracted by the recognition dictionary may be selected.
- Next, the data output determination unit 107 adds the similar persons extracted in each relationship group to the internal data list as "characters", and determines whether each similar person extracted by the recognition dictionary satisfies a prescribed condition on the priority level (here, whether the person appearing as a subject has priority level "A" or "B", that is, whether the closeness is equal to or higher than a predetermined threshold) (S611).
- When the prescribed condition is not satisfied, step S612 is skipped.
- Next, the data output determination unit 107 determines whether there is a user whose closeness to the attention area specified in step S609 or S610 is equal to or higher than a predetermined value (here, 0.95 or more) (S613). When it determines that there is such a user (Y in S613), the user is registered in the internal data list as a "data provision candidate" linked to the attention area (S614). When it determines that there is no user whose closeness is equal to or higher than the predetermined value (N in S613), step S614 is skipped.
- Next, the data output determination unit 107 determines whether the data output determination process has been completed for all the data (S615). When it determines that the process has not been completed (N in S615), it changes the target data within the related group (S616) and repeats the processing from step S607. When it determines that the process has been completed for all the data (Y in S615), the data output determination process ends.
- By changing the related group table in steps S106 to S107 of FIG. 6 described above, the data output determination unit 107 outputs the internal data list as shown in FIG. 22.
- C-4" belonging to the related group "2" has only "Alice” as a character, and is further designated as a focused area.
- "Alice" has a priority level of "A" and is therefore selected as a data provision candidate.
- If social information based on "Alice" is also available, it is easy to imagine that the number of data provision candidates can be increased when the closeness among the users extracted from it is equal to or greater than a predetermined value.
- The application unit 104 of the data processing apparatus 100A transfers the latest content data stored in the data storage unit 102 to the external data processing apparatus 100B or 100C. At the time of this transfer, the application unit 104 of the data processing apparatus 100A controls the data transmission destination using the internal data list, whereby, among the content data stored in the data storage unit 102, content data can be selected and transferred to a user who appears as a subject and has an intimate relationship, or to a user closely related to such a user.
- In step S108 of FIG. 6, the application unit 104 of the data processing apparatus 100A transmits, to each user designated as a data provision candidate in the internal data list shown in FIG. 22, the content data belonging to the related group, sending it to the data processing apparatus 100B or 100C linked to each user.
- For example, the data names "C-1", "C-2", and "C-3" of the related group "1" are transmitted as a series of related data to the data provision candidates "James" and "Julia".
- The selection of a series of related data groups depends on the processing of the data relationship extraction unit 105. In the present embodiment, the shooting date and time is used as an example, but the criterion for relationship extraction is not limited to this; if the object analysis unit 110 can make determinations such as "athletic event" or "travel" by scene discrimination of the content data, events may be grouped as a unit, for example.
- FIG. 23 is a diagram showing an example of a device attribute table according to the second embodiment of the present invention.
- (a) of FIG. 23 is a device attribute table stored by the social information storage unit 106 as information related to the social information; for each device, the device ID, the device type represented by a general name, and various function attributes (here, the photographing function and the data receiving function) can be referred to. (b) of FIG. 23 is an example in which the user (device owner) who owns each device, obtained by referring to the social information, is mapped to the device ID.
- For example, the data processing apparatus 100B owned by the user "James" has the device ID "DSC-Y" and, being a digital still camera, can photograph but does not have the data receiving function; depending on the capability of the data processing apparatus 100B, data transmission processing to it may therefore not be possible.
- the user "Alice” owns a digital still camera (device ID “DSC-Z”) that can not receive data, and a home server (device ID "HS-A” that can not shoot but can receive data. If yes, then select a device that is more suitable for data transmission among the plurality of devices and complete the transmission.
- FIGS. 24A and 24B are diagrams showing an example of the screen output result in the second embodiment of the present invention.
- In the above description, the data processing apparatus 100A automatically selects the transmission destination device as a data provision candidate and executes the transmission.
- Alternatively, the application unit 104 may perform control to display the screen 2400 on the output unit 108.
- The preview area 2401 displays a list of images of the image data belonging to the selected related group "1", and the preview area selection tab 2402 allows switching to the image data of the related group "2", which is not currently displayed.
- In the character introduction area 2403, the characters included in the image data selected in the related group "1" are displayed using user icons (or avatars imitating the users, etc.), based on the internal data list shown in FIG. 22.
- In the data transmission candidate display area 2404, the users currently scheduled to receive the transmission are displayed using user icons (or avatars, etc.).
- The user visually confirms the data transmission content and the data transmission candidates, and then presses the transmission execution button 2405 via the input unit 101 (touch operation, mouse operation, voice input operation, etc.).
- "Kevin", who did not satisfy the predetermined condition described in the present embodiment, is not automatically selected as a data transmission candidate.
- However, "Kevin" can also be adopted as a simultaneous data transmission candidate by moving the "Kevin" icon from the character introduction area 2403 of the screen 2400 to the data transmission candidate display area 2404 by an operation such as drag and drop.
- the data transmission process can be canceled by pressing the cancel button 2406.
- FIG. 24B shows an example of a data transmission content selection screen by another procedure for the data transmission process using the closeness of the social information.
- From an address book that manages the persons who are transmission partners, the data group in which a selected user appears as a subject, or the data group of users who are highly related to or interested in the selected user, is looked up in reverse, and the obtained data group is displayed as the related group of content data to be transmitted.
- Pressing the transmission execution button 2412 executes data transmission to the data transmission candidate "Julia".
- the data transmission process can be canceled by pressing the cancel button 2413.
- As described above, according to the configuration of the second embodiment, the relationship between the user associated with an object included in the data and the user who is a data provision destination candidate is determined using the closeness between users represented by the social information. Therefore, for example, when the data includes a specific face object, control can be performed so that a user (grandparent: Julia) having a close relationship with the extracted specific person (grandchild: Mike) is selected as a transmission destination candidate. Thus, without imposing the operation burden of instructing transmission to the grandparents, the data can be transmitted to the grandparents as users having an intimate relationship, even when no grandparent's face object is included in the data.
- Moreover, even when the data does not include a specific face object, whether the data can be provided is determined in units of the extracted relationship groups. For this reason, photo data taken before and after a photo that includes a face, such as a landscape photo taken during a trip, and photos that convey the context of an event, which could not be conveyed by transmitting only photo data including a specific face object, can also be transmitted at the same time.
- control can be performed to transmit content data for which data provision is permitted to the data processing device 100B or the data processing device 100C, which is an external device.
- As for the selection conditions of the data provision candidates, the description has used priority levels "A" to "B" and a closeness to the subject of 0.95 or more, but the selection conditions are not limited to these.
- all users estimated as subjects may be candidates for providing data, or conditions using other indicators may be used.
- The implementation means is not limited to this; it is also possible to select data provision candidates for a plurality of social information owners.
- For example, users whose closeness to both social information owners is equal to or higher than a predetermined priority level, or users for whom the sum of the closeness values calculated from the plurality of social information owners exceeds a predetermined value, may be selected as data provision candidates.
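- A brief sketch of the "summed closeness" variant just mentioned is shown below; the threshold value of 1.2 and the closeness values are illustrative assumptions.

```python
# Sketch of selecting data provision candidates across several social
# information owners by summing each user's closeness values.

def candidates_by_summed_closeness(closeness_per_owner, threshold=1.2):
    """Return users whose summed closeness over all owners reaches the threshold."""
    totals = {}
    for owner_view in closeness_per_owner.values():
        for user, closeness in owner_view.items():
            totals[user] = totals.get(user, 0.0) + closeness
    return [user for user, total in totals.items() if total >= threshold]


if __name__ == "__main__":
    views = {"Mike":  {"Julia": 0.95, "Kevin": 0.06},
             "Alice": {"Julia": 0.40, "Kevin": 0.30}}
    print(candidates_by_summed_closeness(views))   # -> ['Julia']
```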
- Alternatively, a person may be treated as an appearing object (character) only when the object appears in the data of the related group a defined number of times or more, or a predetermined condition on the closeness in the social information (for example, 0.30 or more) may be used. Furthermore, even a user whose current closeness is low may be selected as a data provision candidate if the user satisfies a predetermined condition within the related group of a certain event (for example, appearing in 50% or more of all the images in the related group).
- The data sharing means is not limited to this. The data processing apparatus may hold the image data to be shared and allow it to be browsed from the outside in response to external requests (a Web server system) and transmit its public location information (URL: Uniform Resource Locator); or the data may first be uploaded to an external service such as an SNS (Social Networking Service) or a photo sharing service, and its public location information and invitation information (URL and login information for the service) may be transmitted.
- FIG. 25 is a block diagram showing the configuration of a data processing apparatus 100D according to the third embodiment of the present invention.
- the same components as in FIG. 1 and FIG. 11 will be assigned the same reference numerals and descriptions thereof will be omitted.
- the data processing apparatus 100D has a data conversion unit 112 in addition to the components shown in FIG. 11 (configuration of the third embodiment).
- The data processing apparatus 100D is, for example, a video recorder or home server into which an external storage medium storing image data can be inserted and which can store a plurality of read image data, or a digital still camera or digital video camera that can capture and store a plurality of still images and moving images.
- FIG. 26 is a flowchart showing a flow of data output determination processing in the third embodiment of the present invention.
- FIG. 27A and FIG. 27B are diagrams showing an example of the sub attribute correction table according to the third embodiment of the present invention.
- FIG. 28 is a diagram showing an example of social information in the third embodiment of the present invention.
- FIG. 29 is a diagram showing an example of the internal data list of the data output determination process in the third embodiment of the present invention.
- FIGS. 30A and 30B are diagrams showing an example of the data conversion table in the third embodiment of the present invention.
- FIG. 31 is a diagram showing an example of the data output result in the third embodiment of the present invention.
- the data conversion unit 112 shown in FIG. 25 converts content data into an arbitrary format. Specifically, the data conversion unit 112 has a role of converting or processing data to be output according to an instruction of the internal data list output by the data output determination unit 107.
- The data output determination unit 107 determines that the data conversion unit 112 is to convert the content data according to the closeness and output the converted content data.
- The application unit 104 also has a function as a data output unit that outputs the content data converted by the data conversion unit 112 according to the determination result of the data output determination unit 107.
- In step S108 of FIG. 6 described above, the application unit 104 of the data processing apparatus 100D transfers the latest content data stored in the data storage unit 102 to the external data processing apparatus 100B or 100C. The data processing described below selects, from among the latest content data stored in the data storage unit 102, the data in which a user having an intimate relationship, or a user closely related to such a user, appears as a subject, and changes the size of the data to be transferred according to the closeness to the user to whom the data is transferred.
- Steps S701 to S703 of the data output determination process by the data output determination unit 107 are the same as steps S601 to S603 shown in FIG. 18.
- In step S704, the data output determination unit 107 corrects the social information stored in the social information storage unit 106 using the sub attribute correction tables shown in FIGS. 27A and 27B, in addition to the attribute correction table described for step S604.
- the closeness of each user is corrected based on the sub attribute information.
- For example, the user "Tom" has "parent" designated as sub attribute information with respect to the social information owner "Mike". Therefore, according to the sub-attribute correction table of FIG. 27A, the previous closeness of "0.73" (see FIG. 20) is corrected by +0.10 to "0.83".
- Similarly, the user "Alice", for whom "school" is specified as sub attribute information, is corrected by +0.05 to "0.99" as shown in FIG. 27B (as with the user "Julia", the maximum value of closeness here is 0.99).
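- A minimal sketch of this sub-attribute correction follows the worked example above: "parent" adds +0.10 and "school" adds +0.05, and the result is clamped to the maximum closeness of 0.99. The starting closeness used for "Alice" below is an illustrative assumption.

```python
# Sketch of the sub-attribute correction of step S704 (values from the worked
# example in the text; the correction is clamped to the maximum of 0.99).

SUB_ATTRIBUTE_CORRECTION = {"parent": +0.10, "school": +0.05}
MAX_CLOSENESS = 0.99


def apply_sub_attribute(closeness, sub_attribute):
    """Return the closeness corrected by the sub attribute information."""
    delta = SUB_ATTRIBUTE_CORRECTION.get(sub_attribute, 0.0)
    return round(min(MAX_CLOSENESS, closeness + delta), 2)


if __name__ == "__main__":
    print(apply_sub_attribute(0.73, "parent"))   # Tom:   0.73 -> 0.83
    print(apply_sub_attribute(0.95, "school"))   # Alice: clamped to 0.99 (illustrative start)
```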
- Steps S705 to S706 are the same as steps S605 to S616 shown in FIG. 18.
- However, because the closeness is corrected according to the sub attribute in step S704, the user "Tom (Mike's father)", whose priority level has changed to "A", is newly registered as a data provision candidate related to the attention area "Mike" of the related group "1".
- Next, the data output determination unit 107 determines whether each of the users who are data provision candidates in the internal data list satisfies a predetermined condition (for example, a priority level of "A" or "B") (S707).
- When the data output determination unit 107 determines that the predetermined condition is satisfied (Y in S707), it determines the data conversion rate for the content data to be output, based on a data conversion table such as those shown in FIGS. 30A and 30B (S708), and adds the information to the internal data list (S709).
- FIG. 30A is an example in which the data conversion rate is changed based on the priority level, and FIG. 30B is an example in which the data conversion rate is changed in consideration of the sub attribute information in addition to the priority level.
- For example, the user "James", who has priority level "B" and no sub attribute information, has a data conversion rate of "50%"; the user "Julia", who has priority level "A" and whose sub attribute information is "grandmother", has a data conversion rate of "150%"; the user "Tom", who has priority level "A" and whose sub attribute information is "parent", has a data conversion rate of "75%"; and the user "Alice", who has priority level "A" and whose sub attribute information "school" has no corresponding entry in the data conversion table, has a data conversion rate of "100%".
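- A sketch of how the data conversion rate of step S708 might be looked up and applied is shown below. The rates (50%, 150%, 75%, default 100%) are those quoted in the example above; which of the two tables takes precedence, and the simple image-size scaling, are assumptions of this sketch.

```python
# Sketch of determining the data conversion rate from the priority level and
# the sub attribute information, and applying it to an image size.

RATE_BY_PRIORITY = {"A": 100, "B": 50}            # FIG. 30A style (percent)
RATE_BY_SUB_ATTRIBUTE = {"grandmother": 150,       # FIG. 30B style (percent)
                         "parent": 75}


def conversion_rate(priority, sub_attribute=None):
    """Return the data conversion rate in percent; sub-attribute entries win (assumption)."""
    if sub_attribute in RATE_BY_SUB_ATTRIBUTE:
        return RATE_BY_SUB_ATTRIBUTE[sub_attribute]
    return RATE_BY_PRIORITY.get(priority, 100)     # default: keep original size


def convert_size(width, height, rate_percent):
    """Scale an image size by the data conversion rate."""
    factor = rate_percent / 100.0
    return int(width * factor), int(height * factor)


if __name__ == "__main__":
    for user, priority, sub in [("James", "B", None),
                                ("Julia", "A", "grandmother"),
                                ("Tom",   "A", "parent"),
                                ("Alice", "A", "school")]:
        rate = conversion_rate(priority, sub)
        print(user, rate, convert_size(4000, 3000, rate))
```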
- In the processing queue execution process shown in step S108 of FIG. 6, the data conversion unit 112 of the data processing apparatus 100D converts the size and other properties of the data to be output for each user, according to the data conversion rate instructions in the internal data list output by the data output determination unit 107. Thereafter, the application unit 104 of the data processing apparatus 100D transfers the data converted by the data conversion unit 112 to each user.
- The basis for deciding the data conversion rate is not limited to this; the data conversion rate may also be determined based on the profile of the provision destination user (for example, age, gender, residence area, handicap information, screen size of the owned device, and capabilities such as communication functions).
- In the data conversion, it is assumed that the size and resolution of each piece of data are converted using the data conversion rate referenced from the priority level or the sub attribute information, or that processing such as zooming is performed to partially cut out the data.
- The data conversion method is not limited to this; at least one still image selected as data to be provided may be laid out within a single still image (that is, at least one still image is arranged in regions into which one still image is divided), so that the data originally to be provided is newly generated as at least one piece of summary data.
- The summary data newly generated in this way may be converted as summary data common to at least one user who is a provision destination candidate, but the conversion to summary data is not limited to generating only one piece.
- For still images, at least one still image that satisfies a condition may be automatically selected by referring to the closeness or profile of the user who is the provision destination candidate, and further to the closeness of other users having an intimate relationship with that user, and the conversion may be performed so that at least one piece of summary data different for each provision destination candidate is generated as a still image.
- The conversion of data has been described using a still image as an example, but the data is not limited to a still image. When the data is a moving image, the image size or resolution of the entire moving image may be converted at a predetermined data conversion rate, or scenes in which a face area is identified by the recognition dictionary may be preferentially kept at high image quality while the other scenes are converted to a lower resolution and frame rate according to a predetermined data compression rate.
- Alternatively, only scenes in which a face area satisfying a certain condition appears may be reconstructed as a digest version, and the conversion may generate such summary data.
- The summary data generated as a moving image may be summary data common to the users who are provision destination candidates, but at least one moving image scene satisfying a certain condition may also be automatically selected by referring to the closeness or profile of each provision destination candidate, and the conversion may generate at least one piece of summary data, as a moving image, that differs for each provision destination candidate.
- In this way, the data processing apparatus 100D may provide summary data that matches the tastes and preferences indicated by the profile of each user who is a provision destination candidate.
- The contents included in the summary data (such as the subjects and the order of appearance scenes) may be presented so as to convey only the essence of the data (for example, a digest of a grandchild's activities), an overall outline (for example, the atmosphere of an entire wedding or an overview of a trip), a time-series change seen from the other user's viewpoint (for example, a cousin's growth record since the previous meeting or a friend's travel record), or a presentation adapted to the other user's preferences through effects (for example, a favorite template such as a news style, displaying characters and expressions in large size, or a slide show with a slow switching speed).
- Furthermore, the data processing apparatus 100D may form group information in consideration of the relationships among the other users in addition to the closeness of the user who is the provision destination candidate, and may generate at least one piece of summary data in advance based on the information on the data and the group information. The user can thereby manage, view, and control at least one piece of summary data in units of group information.
- For example, the data processing apparatus 100D can generate in advance first summary data for users such as family and relatives, second summary data for users such as close friends, and third summary data for users with whom a certain distance is kept (for example, superiors, peers, or juniors at the user's company, or fellow members of a circle activity).
- The first summary data may include, for example, private content such as a grandchild's expressions or the appearance of the home.
- The second summary data may include content data to be shared that was generated in the course of spending the same event together, such as a home party or a trip.
- The third summary data may include content centered on events that can be made public and that does not include private content such as pictures of family members or close friends.
- The data processing apparatus 100D can provide, to each user who is a provision destination candidate, summary data suitable for the group information in which that user is included, from among the plurality of pieces of summary data generated in advance in this way.
- As described above, according to the configuration of the third embodiment, the selection of the data and the data conversion rate are changed according to the closeness and the profile of the user who is the provision destination candidate. For this reason, for example, for "grandparents with poor eyesight" who have a high closeness, the data can be selected or converted so that the grandchild's face appears large in the picture, while for people who are not very close it can be converted into a small picture suitable for a notification and then transmitted. Thereby, without imposing the operation burden of instructing the selection and conversion of the data each time according to the relationship with, and the characteristics of, the other party, appropriate data is converted into an appropriate format and can be sent to users having an intimate relationship.
- FIG. 32 is a block diagram showing the configuration of a data processing apparatus 100E according to a fourth embodiment of the present invention.
- the same components as in FIG. 1 and FIG. 11 will be assigned the same reference numerals and descriptions thereof will be omitted.
- The data processing apparatus 100E has an accompanying history management unit 113 and a history data storage unit 114 in addition to the components shown in FIG. 11 (configuration of the fourth embodiment).
- The data processing apparatus 100E is, for example, a video recorder or home server into which an external storage medium storing image data can be inserted and which can store a plurality of read image data, or a digital still camera or digital video camera that can capture and store a plurality of still images and moving images.
- FIG. 33 is a diagram showing an example of accumulation of content data in the fourth embodiment of the present invention.
- FIG. 34 is a diagram showing an example of social information in the fourth embodiment of the present invention.
- FIG. 35 is a flowchart showing a flow of companion history management processing in the fourth embodiment of the present invention.
- FIG. 36 is a diagram showing an example of accompanying history information in the fourth embodiment of the present invention.
- FIG. 37 is a flowchart showing a flow of data output determination processing in the fourth embodiment of the present invention.
- FIG. 38 is a diagram showing an example of imaging time zone distribution according to Embodiment 4 of the present invention.
- The accompanying history management unit 113 illustrated in FIG. 32 acquires, as accompanying history information, the history of short distance communication performed through the communication unit 111 with the data processing apparatus 100B or 100C, which is an external device.
- The data output determination unit 107 determines that content data is to be output to an external device that has performed the near field communication indicated by the accompanying history information and that is associated, among the users in the social information, with a user whose closeness is equal to or higher than a predetermined threshold.
- the predetermined threshold is, for example, 0.5 when the intimacy degree takes a value of 0 to 1.
- the history data storage unit 114 is a memory that stores the accompanying history information.
- The companion history management unit 113 is included in the data processing apparatus 100E, and is also incorporated in the data processing apparatus 100B and the data processing apparatus 100C connected via the communication unit 111 and the network 200.
- The network 200 is a short distance wireless network forming a local ad hoc communication network, and it is assumed that the data processing apparatuses 100E, 100B, and 100C communicate with each other and that the companion history management unit 113 of each of them manages their mutual presence as companion history information. The application unit 104 of the data processing apparatus 100E subsequently transfers the content data stored in the data storage unit 102 to the external data processing apparatus 100B or 100C.
- The data storage unit 102 stores 10 still photographs taken with the device ID "DSC-X" as the latest content data.
- To simplify the description, the relationship groups extracted by the data relationship extraction unit 105 of the data processing apparatus 100E are assumed to be grouped based on the shooting dates and times of "C-1" to "C-8".
- The closeness is set with the user "Mike" as the social information owner, and the address of each user and the owned device IDs are assumed to be registered in advance.
- First, the companion history management unit 113 of the data processing apparatus 100E checks whether a predetermined timer cycle, for example 10 minutes, has elapsed (S801).
- When the accompanying history management unit 113 determines that the timer cycle has elapsed (Y in S801), it searches, via the communication unit 111, for an external data processing apparatus connected to the network 200 (here, the data processing apparatus 100B or 100C).
- Next, the companion history management unit 113 determines whether a device is present on the network 200 (S803). When it determines that a device is present on the network 200 (Y in S803), it registers, for example, the device ID of that device as device information in the history data storage unit 114 as companion history data (S804).
- When the companion history management unit 113 determines in step S801 that the timer cycle has not elapsed, or determines in step S803 that no device is present, the process ends (N in S801 and N in S803).
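- The following is a minimal sketch of one pass of this companion history management loop. The function discover_devices() is a hypothetical stand-in for the near field scan performed through the communication unit 111, and the hourly grouping mirrors the one-hour units of FIG. 36.

```python
# Sketch of one pass of the accompanying-history management (S801, S803, S804):
# at a fixed timer cycle, nearby devices are scanned and their device IDs are
# recorded for the current one-hour time slot.
import time
from collections import defaultdict

TIMER_CYCLE_SECONDS = 10 * 60            # "for example, 10 minutes"
companion_history = defaultdict(set)     # hour slot -> device IDs seen nearby


def discover_devices():
    """Hypothetical stand-in for the near field scan on the network 200."""
    return ["DSC-Y", "CM-P"]             # illustrative result


def companion_history_cycle(now=None):
    """Register the devices currently in the vicinity as companion history data."""
    now = now if now is not None else time.time()
    slot = time.strftime("%Y-%m-%d %H:00", time.localtime(now))
    devices = discover_devices()         # search the network via the communication unit
    if not devices:                      # N in S803: nothing to register
        return
    companion_history[slot].update(devices)   # S804: store as companion history data


if __name__ == "__main__":
    companion_history_cycle()
    print(dict(companion_history))
```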
- FIG. 36 shows an example of companion history data registered by the companion history management unit 113 in step S804.
- The horizontal axis of the figure represents, in units of one hour, each time zone from 8:00 to 20:00 on the shooting date of the stored content data (here, October 1, 2002).
- the vertical axis represents the device ID of the device detected by the accompanying history management unit 113 of the data processing apparatus 100E in the same time zone.
- A device ID marked with "○" in the figure was present in the vicinity of the data processing apparatus 100E in that time zone, and the companion history management unit 113 determines that the device was a companion.
- the device ID "DSC-Y" owned by the user “James” was accompanied by the user "Mike” owning the data processing apparatus 100E "DSC-X" from 9 am to 16 pm It turns out that.
- A user linked with another device ID is also present in the vicinity of the user "Mike" and is regarded as a companion in the companion history data; however, since the device ID "DSC-V" is not registered in the social information shown in FIG. 34, it is regarded as the device ID of a device owned by a user unknown to the user "Mike".
- Steps S901 to S906 are the same as the processing of the data output determination unit 107 described above.
- The data output determination unit 107 of the fourth embodiment then determines whether a predetermined condition is satisfied, for example, whether the shooting time of a transmission data candidate matches a time zone in which the device ID registered in the companion history data output by the companion history management unit 113 is regarded as a companion (S907).
- When the predetermined condition is satisfied (Y in S907), the data output determination unit 107 adopts the data as final transmission data to the user associated with the condition (here, the device ID) (S908).
- When the companion history data does not match, that is, the predetermined condition is not satisfied (N in S907), the data is not adopted as final transmission data to the user associated with the condition (S909).
- The data output determination unit 107 adds information to the internal data list based on the determination result of step S908 or step S909 (S910).
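- A sketch of the match test of steps S907 to S909 is shown below. It assumes the companion history is kept per hourly time zone, as in FIG. 36; the device IDs and hour ranges are taken from the worked example but are otherwise illustrative.

```python
# Sketch of the companion-time-zone match (S907..S909): a photo is adopted as
# final transmission data for a user only if its shooting hour falls in a time
# zone in which that user's device was recorded as a companion.
from datetime import datetime

COMPANION_HOURS = {                      # device ID -> hours marked as companions
    "DSC-Y": set(range(9, 17)),          # James: 9:00 to 16:00
    "CM-P":  set(range(8, 12)),          # Paul:  8:00 to 11:00
}


def adopt_for_user(device_id, shooting_time):
    """Return True if the shooting time matches the user's companion time zones."""
    return shooting_time.hour in COMPANION_HOURS.get(device_id, set())


if __name__ == "__main__":
    photo_time = datetime(2002, 10, 1, 10, 30)
    print(adopt_for_user("DSC-Y", photo_time))   # True  -> adopted (S908)
    print(adopt_for_user("DSC-K", photo_time))   # False -> not adopted (S909)
```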
- The data processing apparatus 100E shown in FIG. 32 executes the processing queue according to the internal data list output by the data output determination unit 107 in step S108 of FIG. 6.
- Focusing on "James" and "Paul", the two owners of the devices for which the companion history management unit 113 left a history of accompaniment, the content data are transmitted according to the distribution of imaging time zones (the distribution of the time zones in which the content data were captured) shown as an example in FIG. 38.
- In FIG. 38, each time zone from 8:00 to 20:00 on October 1, 2002 is represented in units of one hour, and the distribution of the content photographed in each time zone is shown.
- the content data "C-1 to C-7” taken during the time zone from 9 to 16 o'clock accompanied by “James” and from 8 o'clock to 11 o'clock accompanied by "Paul”
- The device ID "DSC-V", which is not registered in the social information, is naturally ignored in the process of executing the processing queue.
- As described above, according to the configuration of the fourth embodiment, the devices communicate locally via the network, and the accompanying time zones are determined from the accompanying history data indicating that the devices were in each other's vicinity. Therefore, for example, it is possible to identify the device of an acquaintance who was nearby when the data was photographed during an actual trip or the like, and to control the transmission so that only the photographs of the time zones spent together are sent to that acquaintance's device. As a result, data corresponding to the actual action history can be transmitted to users having an intimate relationship, without imposing the operation burden of selecting the data photographed in the same time zone.
- "Kevin", the owner of the device ID "DSC-K", was not selected as a data output destination because his closeness is "0.06" and his priority level is the low evaluation "Z". However, it is also possible to raise his closeness because he was acting together with the data processing apparatus 100E owned by the social information owner "Mike", or to select him as a data output destination and output the content data of the time zones in which they acted together based on the companion history data.
- In the present embodiment, the companion history management unit 113 acquires companion history data periodically, for example every 10 minutes using a regular timer, but the method of managing the companion history data is not limited to this.
- The accompanying history data may be accumulated at any timing, such as a timing desired by the user, before or after the power is turned on, immediately after an operation event such as photographing occurs, or when an operation event notification is received from a peripheral device via the network 200.
- Although the accompanying history data managed by the accompanying history management unit 113 has been described as being rounded and recorded in units of one hour, the method of managing the accompanying history data is not limited to this; other methods may be used, such as recording, for each device ID, the times at which the short distance communication switches between online and offline.
- The device ID of the device owned by each user has been described as being registered in advance in the social information, but the method of managing the owned device IDs is not limited to this. Device information including the device ID may be acquired from a device with which communication has been performed using the address of the target user, and the device may be estimated to be an owned device from the frequency and cumulative number of communications and registered in the social information as an owned device ID. At that time, a numerical value estimating the likelihood of ownership may be registered at the same time.
- The determination of the data output is not limited to this; for example, all the data may be output to users whose closeness is high, or filtering may be performed by specifying a finer time zone.
- FIG. 39 is a block diagram showing a configuration of a data processing device 100F according to Embodiment 5 of the present invention.
- the data processing apparatus 100F has a history data storage unit 114, a social information updating unit 115, and a sensor 116 in addition to the components shown in FIG. 11 (configuration of the fifth embodiment).
- The data processing apparatus 100F of the fifth embodiment is, for example, a video recorder or home server into which an external storage medium storing image data can be inserted and which can store a plurality of read image data, or a digital still camera or digital video camera that can capture and store a plurality of still images and moving images.
- FIG. 40 is a diagram showing an example of accumulation of content data in the fifth embodiment of the present invention.
- FIG. 41 is a flowchart showing a flow of history saving processing in the fifth embodiment of the present invention.
- FIG. 42 is a diagram showing an example of movement of each device in the fifth embodiment of the present invention.
- FIG. 43 is a diagram showing an example of position information history data for each device according to the fifth embodiment of the present invention.
- FIG. 44 is a flowchart showing a flow of data output determination processing in the fifth embodiment of the present invention.
- FIGS. 45A and 45B are diagrams showing an example of the social information update threshold according to the fifth embodiment of the present invention.
- FIG. 46 is a diagram showing an example of the data output threshold according to the fifth embodiment of the present invention.
- FIG. 47 is a diagram showing an example of relative position information calculation result in Embodiment 5 of the present invention.
- The social information updating unit 115 illustrated in FIG. 39 refers to the device peripheral information stored in the history data storage unit 114 by the sensor 116 as history data, and updates the social information stored in the social information storage unit 106. Specifically, the social information updating unit 115 acquires, via the communication unit 111, the history data of the peripheral information of the data processing apparatus 100B or 100C, which is an external device, and updates the social information so as to reflect the acquired history data.
- the sensor 116 detects surrounding information indicating the position of the data processing device 100F.
- the data storage unit 102 stores history data of the detection result of the sensor 116 in the history data storage unit 114.
- the data output determination unit 107 uses the latest social information updated by the social information update unit 115 to determine whether to output content data.
- The application unit 104 of the data processing apparatus 100F transfers the content data stored in the data storage unit 102 to the external data processing apparatus 100B or 100C.
- The data processing described below determines the data to be transferred at the time of this transfer by analyzing the device peripheral information left by the devices owned by the users, in addition to using the updated closeness of each user.
- The data storage unit 102 stores 10 still photographs taken with the device ID "DSC-X" as the latest content data.
- To simplify the explanation, the relationship groups extracted by the data relationship extraction unit 105 of the data processing apparatus 100F are assumed to be grouped based on the shooting dates and times of "C-1" to "C-8".
- In addition, shooting location information obtained by GPS (Global Positioning System) is assumed to be included in each piece of data.
- The closeness is set with the user "Mike" as the social information owner, and the address of each user and the owned device IDs are assumed to be registered in advance.
- First, the social information updating unit 115 of the data processing apparatus 100F checks whether a predetermined timer cycle, for example 10 minutes, has elapsed (S1001).
- the social information updating unit 115 acquires device peripheral information such as position information and temperature that can be acquired by the sensor 116 (S1002).
- Next, the social information updating unit 115 determines whether there is a change in the acquired device peripheral information (S1003). When it determines that there is a change in the device peripheral information (Y in S1003), it causes the data storage unit 102 to register, for example, the position information of the device in the history data storage unit 114 as history data (S1004).
- When the social information updating unit 115 determines in step S1001 that the timer cycle has not elapsed, or determines in step S1003 that there is no change in the device peripheral information, the process ends (N in S1001 and N in S1003).
- FIG. 42 shows an example of how each device moves on a map of the area in the fifth embodiment.
- It conceptually illustrates a route in which the device IDs "DSC-X" and "DSC-Y" leave Kyoto Station and act together as far as Nijo Castle, after which the device ID "DSC-X" heads for Kinkakuji while the device ID "DSC-Y" takes a separate route from Nijo Castle, and the two meet again in Gion and return to Kyoto Station via Kiyomizu Temple.
- FIG. 43 shows an example in which the position information traced by the device ID "CM-P", in addition to that of the device IDs "DSC-X" and "DSC-Y" shown in FIG. 42, is plotted in chronological order as history data.
- In step S1003 of FIG. 41, it is assumed that a record is left when there is a large change in the position information, which serves as the device peripheral information, in each time zone.
- The data processing apparatus 100F acquires the history data accumulated in each device from the data processing apparatus 100B or 100C connected to the network 200 via the communication unit 111.
- Steps S1101 to S1102 are the same as the processing of the data output determination unit 107 described above.
- The data output determination unit 107 of the fifth embodiment determines whether to perform correction based on the history data (S1103), and when it determines that the correction is not to be performed (N in S1103), the process proceeds to step S1105.
- When the data output determination unit 107 determines that the correction is to be performed (Y in S1103), it instructs the social information updating unit 115 to attempt the correction.
- The social information updating unit 115 then analyzes the history data shown in FIG. 43 and corrects the closeness to each user according to the social information update thresholds shown in FIGS. 45A and 45B (S1104).
- FIG. 45A shows an example of thresholds for updating the closeness, with several variations, such as increasing the closeness by "+0.05" when the cumulative time that the compared and analyzed devices stay in the same area is 50 hours or more.
- FIG. 45B shows an example of a threshold for updating the closeness depending on the same area stay time and the value of the place (place value) indicated by the ID “2” in FIG. 45A.
- For example, when the location referred to by the location information such as GPS is an address registered as a school, the devices staying in the same area are likely to belong to alumni, and the closeness with the users tied to those devices is corrected according to the value of the location.
- Since a wide variety of information on places can now be obtained through network services via the Internet, a general evaluation of the place to which a device was brought, such as a theme park valued by families and couples or a high-end restaurant, makes it possible to regard the accompanying user as an important person, and the closeness can be updated in line with real-world actions.
- In other words, the social information updating unit 115 compares the history data stored by the data storage unit 102 with the acquired history data of the external device, and updates the closeness included in the social information using at least one of the relative distance between the position information of the data processing apparatus 100F and that of the external device, the area information, and the increase/decrease tendency of the accompanying frequency. For example, the social information updating unit 115 detects the area information of a place where the relative distance between the data processing apparatus 100F and the external device is small and corrects the closeness between the two devices according to the area information, and increases the closeness when the frequency with which the two devices accompany each other tends to increase.
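- A minimal sketch of this history-based update follows. Only the "50 hours or more adds +0.05" rule is quoted from the text; the place-value rule and its +0.05 correction, and the clamp to 0.99, are illustrative assumptions in the spirit of FIGS. 45A and 45B.

```python
# Sketch of the social information update based on history data (S1104): the
# cumulative time two devices spend in the same area raises the closeness, and
# the value of the place can raise it further (numbers partly illustrative).
MAX_CLOSENESS = 0.99


def stay_correction(same_area_hours):
    """FIG. 45A style threshold: long co-location in the same area."""
    return 0.05 if same_area_hours >= 50 else 0.0


def place_correction(same_area_hours, place_value):
    """FIG. 45B style threshold: co-location weighted by the value of the place."""
    if place_value == "school" and same_area_hours >= 50:
        return 0.05                      # likely alumni (illustrative value)
    return 0.0


def update_closeness(closeness, same_area_hours, place_value=None):
    delta = stay_correction(same_area_hours) + place_correction(same_area_hours, place_value)
    return round(min(MAX_CLOSENESS, closeness + delta), 2)


if __name__ == "__main__":
    print(update_closeness(0.40, same_area_hours=60, place_value="school"))  # -> 0.5
```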
- Steps S1105 to S1106 are the same as steps S705 to S706 shown in FIG. 26.
- The data output determination unit 107 of the fifth embodiment then determines whether a predetermined condition is satisfied, for example, whether the relative distance falls within the relative distance range defined as the data output threshold shown in FIG. 46 (S1107).
- When the predetermined condition is satisfied (Y in S1107), the data output determination unit 107 adopts the data of the corresponding time zone and shooting location as final transmission data (S1108).
- When the relative distance does not fall within the range defined as the data provision threshold, that is, the predetermined condition is not satisfied (N in S1107), the data is not adopted as final transmission data (S1109).
- the data output determination unit 107 adds information to the internal data list based on the determination result of step S1108 or step S1109 (S1110).
- In step S108, the data processing apparatus 100F illustrated in FIG. 39 executes the processing queue according to the internal data list output by the data output determination unit 107.
- Here, the users whose priority level is determined to be "A" or "B" based on the closeness shown in FIG. 34, namely "James" and "Paul", the owners of the two analyzed devices "DSC-Y" and "CM-P", are the data destination candidates. For "James", the content data "C-1" to "C-" photographed in the time zone from 9:00 to 10:45, in which the relative distance satisfies the condition of "within 3 km", are transmitted.
- As described above, according to the configuration of the fifth embodiment, the history data, such as position information, accumulated by the sensor of each device is acquired via the network, and the accompanying place or time zone is determined from the history data by calculating the relative distance between the devices.
- Then, the closeness included in the social information is updated using at least one of the relative distance between the position information of the data processing apparatus 100F and that of the external device, the area information, and the tendency of the accompanying frequency to change, and whether to provide the data is determined using social information reflecting the latest state.
- Moreover, as with the function attributes described for the device attribute table, the social information updating unit 115 may acquire, via the communication unit 111, status information indicating whether the external device can receive content data, and may update the social information to include that status information.
- The status information is, for example, information on the power ON/OFF state of the external device, information indicating whether there is sufficient free capacity to receive content data, or information indicating that content data cannot be received because other processing is being performed.
- the social information updating unit 115 of the data processing apparatus 100F has been described using an example in which the social information is updated by analyzing the history data output by the sensor 116.
- However, the contents of the communication performed by the application unit 104 with the application unit (not shown) of an external data processing apparatus may also be output as history data, and the social information updating unit 115 may update the social information by analyzing the history data of the communication contents with the external data processing apparatus.
- Specifically, the application unit 104 executes an application that provides a communication function with the external device via the communication unit 111, and the data storage unit 102 accumulates the processing history of the execution of the application in the history data storage unit 114 as history data.
- The social information updating unit 115 then updates the closeness included in the social information using at least one of the communication partner information included in the history data stored in the history data storage unit 114, the total number of communications, the access frequency, the increase/decrease tendency of the access frequency, and the text of the transmitted and received data.
- the data output determination unit 107 uses the latest social information updated by the social information update unit 115 to determine whether to output content data.
- As described above, the closeness included in the social information is updated using at least one of the communication partner information, the total number of communications, the access frequency, the increase/decrease tendency of the access frequency, and the text of the transmitted and received data, and whether to provide the data is determined using social information reflecting the latest state.
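- A small sketch of such a communication-history-based update is shown below; the correction of ±0.02 per increase or decrease in access frequency is an illustrative assumption, not a value from the embodiment.

```python
# Sketch of updating closeness from the application's communication history:
# the increase/decrease tendency of the access frequency with a partner nudges
# the closeness up or down (correction values are illustrative).
MAX_CLOSENESS = 0.99


def update_from_communication(closeness, accesses_this_month, accesses_last_month):
    """Return the closeness adjusted by the access frequency tendency."""
    if accesses_this_month > accesses_last_month:        # contact is increasing
        closeness += 0.02
    elif accesses_this_month < accesses_last_month:      # contact is decreasing
        closeness -= 0.02
    return round(min(MAX_CLOSENESS, max(0.0, closeness)), 2)


if __name__ == "__main__":
    print(update_from_communication(0.73, accesses_this_month=12, accesses_last_month=5))
```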
- FIG. 48 is a block diagram showing another configuration of the data processing apparatus 100F according to the fifth embodiment of the present invention.
- In the above description, the social information updating unit 115 of the data processing apparatus 100F updates the social information by analyzing information that can be acquired inside the data processing apparatus 100F and information acquired from the outside.
- However, the social information updating unit 115 may also update the social information using a configuration such as that shown in FIG. 48.
- the data processing apparatus 100F is connected to the social information server 500 via the network 200.
- the social information updating unit 115 acquires the latest social information from the social information server 500 which is an external device via the communication unit 111, and updates the social information.
- the social information updating unit 115 transmits a social information acquisition request to the social information management unit 502 via the communication unit 111 of the data processing apparatus 100F and the communication unit 501 of the social information server 500. Then, the social information updating unit 115 acquires the latest social information as a response to the social information acquisition request, and stores the acquired latest social information in the social information storage unit 106.
- the data output determination unit 107 uses the latest social information updated by the social information update unit 115 to determine whether to output content data.
- In addition, when the social information updating unit 115 updates the social information stored in the social information storage unit 106 based on information acquired inside the data processing apparatus 100F, all or part of the social information may be transmitted as a social information update request to the social information management unit 502 included in the external social information server 500.
- FIG. 49 is a block diagram showing a configuration of a data processing device 100G in the sixth embodiment of the present invention.
- The data processing apparatus 100G has a social information management unit 117 in addition to the components of FIG. 11, and the recognition dictionary stored in the recognition dictionary storage unit 109 holds meta information (configuration of the sixth embodiment).
- The data processing apparatus 100G is, for example, a video recorder or home server into which an external storage medium storing image data can be inserted and which can store a plurality of read image data, a digital still camera or digital video camera that can capture and store a plurality of still images and moving images, or a digital photo frame that can store and display still images and moving images.
- When the social information management unit 117 shown in FIG. 49 receives a data acquisition or update request from an external device via the communication unit 111, it acquires the data of the recognition dictionary used by the object analysis unit 110 from the external device according to the closeness, and updates the recognition dictionary.
- Specifically, based on a request from the external data processing apparatus 100B or 100C connected via the communication unit 111 and the network 200, the social information management unit 117 updates, edits, saves, and externally provides the social information stored in the social information storage unit 106 or the recognition dictionary (including its meta information) stored in the recognition dictionary storage unit 109.
- the object analysis unit 110 uses the recognition dictionary updated by the social information management unit 117 to extract an object included in the content data.
- When two or more pieces of meta information are attached to one object, the data output determination unit 107 determines that, among them, the meta information associated with the higher closeness included in the social information is to be output preferentially.
- the application unit 104 has a function as a data output unit that outputs content data and meta information determined to be output preferentially by the data output determination unit 107.
- FIG. 50 is a flowchart showing a flow of social information management processing according to the sixth embodiment of the present invention.
- FIG. 51 is a diagram showing an example of user information in the sixth embodiment of the present invention.
- FIG. 52 is a diagram showing an example of the screen output result in the sixth embodiment of the present invention.
- It is assumed that the social information stored in the social information storage unit 106 of the data processing apparatus 100G is the same as in FIG. 20, that the target image data is that of FIG. 16, and that the object analysis result by the object analysis unit 110 is the same as in FIG. 17.
- First, the social information management unit 117 of the data processing apparatus 100G receives a social information operation request transmitted from an external device (here, the data processing apparatus 100B owned by the user "Alice") via the communication unit 111 and the network 200 (S1201).
- Next, the social information management unit 117 acquires the user information (see FIG. 51) of the user who owns the transmission source device that sent the social information operation request, either directly from the message data of the social information operation request or separately from the data processing apparatus 100B (S1202).
- Next, the social information management unit 117 passes the acquired user information to the data output determination unit 107 and requests a determination of whether data can be provided to the data processing apparatus 100B (S1203).
- The social information management unit 117 then determines whether a defined condition, for example a priority level of "A", is satisfied (S1204).
- When the condition is satisfied (Y in S1204), the user information of the user is simultaneously recorded in the recognition dictionary.
- Thereby, when the application unit 104 displays a screen via the output unit 108, the display priority of the meta information can be controlled according to the closeness between the user who edited the meta information and the user who operates the application unit 104.
- That is, the data output determination unit 107 determines that the meta information associated with the higher closeness is to be output preferentially.
- On the other hand, when the defined condition is not satisfied, the social information management unit 117 rejects the social information operation request (S1206), and the process ends.
- For example, the application unit 104 displays the data "C-3" on the screen and, at the same time, displays meta information such as the names of, and comments on, the similar persons analyzed by the object analysis unit 110.
- Here, the comment "The tofu was delicious" is displayed as the meta information of the user "James". This may be content that the user "James" edited in advance as a comment on the data, or a diary entry or article for the shooting date and time may be acquired, via the communication unit 111, from an external server (not shown) on the network 200 using the blog URL described in the user information of "James" and displayed.
- the intimacy can be increased by acquiring and displaying not only the diary or article of the date and time when the image was shot but also the latest diaries and articles.
- the user's latest interests and activities can be visually checked on the screen of the digital photo frame.
- the content of the social information operation request is not limited to this; a learning operation on the recognition dictionary used for object analysis may be performed from the external data processing apparatus 100B, or the data processing apparatus 100B may refer to, acquire, and use a recognition dictionary held by the data processing apparatus 100G.
- the data of the recognition dictionary is acquired from the external device according to the closeness degree, the recognition dictionary is updated, and objects included in the content data are extracted using the latest recognition dictionary. That is, when the recognition dictionary used to analyze objects included in the data, and the meta information associated with it, are updated, edited, stored, or provided externally, the relationship between the user who owns the recognition dictionary and meta information and the user who requests the processing is determined using the closeness degree between the users represented by the social information. It therefore becomes possible, for example, to permit editing of the recognition dictionary only when a closeness degree exceeding a predetermined threshold is set for the user who wishes to edit it, so that editing by a malicious user can be avoided and only editing by users in a close relationship is permitted.
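- a minimal sketch of such a threshold check is shown below, assuming the closeness degree is kept as a number per user; the function name may_edit_dictionary and the threshold value 0.7 are illustrative, the description only requiring that the closeness exceed a predetermined threshold.

```python
from typing import Dict, Optional


def may_edit_dictionary(closeness_by_user: Dict[str, float], requesting_user: str,
                        threshold: float = 0.7) -> bool:
    """Permit editing of the recognition dictionary only when a closeness degree is
    set for the requesting user and it exceeds the threshold; the numeric scale and
    the value 0.7 are illustrative."""
    value: Optional[float] = closeness_by_user.get(requesting_user)
    return value is not None and value > threshold


# A user with no recorded closeness (for example, a malicious stranger) is always refused.
print(may_edit_dictionary({"James": 0.9, "Mike": 0.3}, "James"))    # True
print(may_edit_dictionary({"James": 0.9, "Mike": 0.3}, "Mallory"))  # False
```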
- since recognition dictionaries can be shared between users in a close relationship, the user operation load required to train the recognition dictionaries can be reduced compared with the case where each device learns its recognition dictionary individually.
- when a plurality of pieces of meta information are attached to one object included in the data through editing by a plurality of users, it is possible to preferentially display the meta information attached by the user with the higher closeness degree. This makes it possible to select, from the plurality of pieces of meta information, the meta information that is most reliable and of greatest interest.
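- the preferential selection can be sketched as follows, assuming each piece of meta information carries the closeness degree of the user who attached it; the field names author, text, and closeness are illustrative.

```python
from typing import Dict, List


def select_meta_info(candidates: List[Dict]) -> Dict:
    """When several pieces of meta information are attached to one object, keep the
    one attached by the user with the highest closeness degree."""
    return max(candidates, key=lambda m: m["closeness"])


meta = [
    {"author": "James", "text": "Tofu was delicious", "closeness": 0.9},
    {"author": "Mike", "text": "Nice restaurant", "closeness": 0.4},
]
print(select_meta_info(meta)["text"])  # "Tofu was delicious"
```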
- the explanation has been made using the owned device ID, which associates a user with a device, as the means for determining whether each user satisfies the prescribed condition. However, the means for identifying the user is not limited to this; it is also possible to use an e-mail address that can identify the user, a blog or SNS diary URL, keywords, login information for SNS sites or data processing devices, and binary data including images.
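- purely for illustration, such alternative identifiers might be held together in a single record like the sketch below; every field name is hypothetical and any subset of the identifiers may be present.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class UserIdentity:
    """Possible means of identifying a user, following the list in the description."""
    owned_device_ids: List[str] = field(default_factory=list)
    email_address: Optional[str] = None       # e-mail address that can identify the user
    blog_or_sns_url: Optional[str] = None     # blog or SNS diary URL
    keywords: List[str] = field(default_factory=list)
    login_info: Optional[str] = None          # login information for SNS sites or data processing devices
    image_data: Optional[bytes] = None        # binary data including an image of the user


alice = UserIdentity(owned_device_ids=["100B"], email_address="alice@example.com")
```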
- the data processing apparatus includes the processing units shown in FIG. 1 and the like; however, as shown in FIG. 53, it is sufficient if the data processing apparatus 100H includes at least the application unit 104 and the data output determination unit 107.
- FIG. 53 is a block diagram showing the minimum configuration of the data processing apparatus according to the embodiment of the present invention.
- the data output determination unit 107 determines that the content data is to be output when the closeness is equal to or higher than the predetermined threshold, and the application unit 104 outputs the content data according to the determination result of the data output determination unit 107.
- since no user operation load, such as selecting the desired target data from a large amount of target data, is imposed, data sharing with family members and acquaintances can be realized while suppressing the operation load on the user.
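- a minimal sketch of this minimal-configuration behaviour is given below, assuming that the closeness degree associated with each piece of content data can be looked up from the social information; the names and the threshold value are illustrative.

```python
from typing import Callable


def decide_and_output(social_info: dict, content_id: str, threshold: float,
                      output: Callable[[str], object]) -> bool:
    """Minimal-configuration behaviour (cf. FIG. 53): output the content data only
    when the closeness degree associated with it is at or above the threshold.
    'social_info' maps content data to a closeness degree, and 'output' stands in
    for the application unit handing the data to the output unit."""
    closeness = social_info.get(content_id, 0.0)
    if closeness >= threshold:
        output(content_id)
        return True
    return False


# The content associated with a closeness of 0.8 is output; anything below 0.5 is not.
decide_and_output({"IMG_0001": 0.8}, "IMG_0001", threshold=0.5, output=print)
```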
- the present invention can be realized not only as such a data processing apparatus but also as a data processing method having, as its steps, the processing performed by the characteristic processing units provided in the data processing apparatus.
- the present invention can also be realized as a program that causes a computer to execute the steps included in the above data processing method. Needless to say, such a program can be distributed via a recording medium such as a CD-ROM or a transmission medium such as the Internet.
- each functional block included in the data processing apparatus may be realized as an LSI, which is an integrated circuit. Each block may be made into an individual chip, or some or all of them may be integrated into a single chip.
- an LSI may be called an IC, a system LSI, a super LSI, or an ultra LSI depending on the degree of integration. Furthermore, the method of circuit integration is not limited to LSI; implementation using a dedicated circuit or a general-purpose processor is also possible. A field programmable gate array (FPGA) that can be programmed after the LSI is manufactured, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- a data processing apparatus according to the present invention is useful when applied to a video recorder, a home server, a digital still camera, a digital video camera, a personal computer, an enterprise computer (workstation), a digital television receiver equipped with an image data capturing function, a set-top box, a car navigation system, a projector, a mobile terminal, a music component system, a digital photo frame, a remote control terminal for device control, and the like.
- 100 Data processing device
- 101 Input unit
- 102 Data storage unit
- 103 Content data storage unit
- 104 Application unit
- 105 Data relationship extraction unit
- 106 Social information storage unit
- 107 Data output determination unit
- 108 Output unit
- 109 Recognition dictionary storage unit
- 110 Object analysis unit
- 111, 501 Communication unit
- 112 Data conversion unit
- 113 Companion history management unit
- 114 History data storage unit
- 115 Social information update unit
- 116 Sensor
- 117, 502 Social information management unit
- 200 Network
- 500 Social information server
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Library & Information Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
FIG. 1 is a block diagram showing the configuration of a data processing apparatus 100 according to Embodiment 1 of the present invention.
FIG. 11 is a block diagram showing the configuration of a data processing apparatus 100A according to Embodiment 2 of the present invention. In FIG. 11, the same reference numerals are used for the same components as in FIG. 1, and their description is omitted.
FIG. 25 is a block diagram showing the configuration of a data processing apparatus 100D according to Embodiment 3 of the present invention. In FIG. 25, the same reference numerals are used for the same components as in FIG. 1 and FIG. 11, and their description is omitted.
FIG. 32 is a block diagram showing the configuration of a data processing apparatus 100E according to Embodiment 4 of the present invention. In FIG. 32, the same reference numerals are used for the same components as in FIG. 1 and FIG. 11, and their description is omitted.
FIG. 39 is a block diagram showing the configuration of a data processing apparatus 100F according to Embodiment 5 of the present invention. In FIG. 39, the same reference numerals are used for the same components as in FIG. 1 and FIG. 11, and their description is omitted.
FIG. 49 is a block diagram showing the configuration of a data processing apparatus 100G according to Embodiment 6 of the present invention. In FIG. 49, the same reference numerals are used for the same components as in FIG. 1 and FIG. 11, and their description is omitted.
101 Input unit
102 Data storage unit
103 Content data storage unit
104 Application unit
105 Data relationship extraction unit
106 Social information storage unit
107 Data output determination unit
108 Output unit
109 Recognition dictionary storage unit
110 Object analysis unit
111, 501 Communication unit
112 Data conversion unit
113 Companion history management unit
114 History data storage unit
115 Social information update unit
116 Sensor
117, 502 Social information management unit
200 Network
500 Social information server
Claims (15)
- A data processing apparatus that outputs content data, comprising: a data output determination unit that determines whether or not to output the content data, using social information including a closeness degree indicating a degree of intimacy between predetermined users and information for associating the content data with the closeness degree; and a data output unit that outputs the content data when the data output determination unit determines that the content data is to be output, wherein the data output determination unit refers to the social information and determines that the content data associated with the closeness degree is to be output when the closeness degree is equal to or higher than a predetermined threshold.
- The data processing apparatus according to claim 1, further comprising a data relationship extraction unit that extracts information indicating a relationship between pieces of content data, wherein the data output determination unit further determines that content data having a predetermined relationship indicated by the information extracted by the data relationship extraction unit is to be output.
- The data processing apparatus according to claim 1 or 2, further comprising: a recognition dictionary storage unit storing a recognition dictionary for recognizing an object included in the content data; and an object analysis unit that extracts an object included in the content data using the recognition dictionary, wherein the data output determination unit further determines that the content data is to be output to an external device associated with a user who, among the users associated by the social information, is related to the object extracted by the object analysis unit and whose closeness degree is equal to or higher than a predetermined threshold.
- The data processing apparatus according to claim 3, further comprising a communication unit that communicates with the external device via a communication network, wherein the data output unit outputs the content data to the external device via the communication unit.
- The data processing apparatus according to any one of claims 1 to 4, further comprising a data conversion unit that converts the content data into an arbitrary format, wherein the data output determination unit further determines, according to the closeness degree, that the content data is to be converted by the data conversion unit and output, and the data output unit outputs the content data converted by the data conversion unit according to the determination result of the data output determination unit.
- The data processing apparatus according to claim 4, further comprising a companion history management unit that acquires, as companion history information, a history of short-range communication with the external device performed via the communication unit, wherein the data output determination unit further determines that the content data is to be output to an external device that has performed the short-range communication indicated by the companion history information and that is associated with a user who, among the users associated by the social information, has a closeness degree equal to or higher than a predetermined threshold.
- The data processing apparatus according to claim 4, further comprising a social information update unit that acquires, via the communication unit, state information indicating whether or not the external device can receive the content data, and updates the social information so as to include the state information, wherein the data output determination unit further determines whether or not to output the content data using the state information included in the latest social information updated by the social information update unit.
- The data processing apparatus according to claim 4, further comprising a social information update unit that acquires the latest social information from the external device via the communication unit and updates the social information, wherein the data output determination unit further determines whether or not to output the content data using the latest social information updated by the social information update unit.
- The data processing apparatus according to claim 4, further comprising: an application unit that executes an application providing a communication function with the external device via the communication unit; a data storage unit that accumulates, as history data, a processing history of the execution of the application; and a social information update unit that updates the closeness degree included in the social information using at least one of communication partner information, a cumulative communication count, an access frequency, an access frequency increase/decrease tendency, and body text of transmitted and received data, included in the history data, wherein the data output determination unit further determines whether or not to output the content data using the latest social information updated by the social information update unit.
- The data processing apparatus according to claim 4, further comprising: a sensor that detects peripheral information indicating the position of the data processing apparatus; a data storage unit that accumulates history data of detection results of the sensor; and a social information update unit that acquires history data of peripheral information of the external device via the communication unit and updates the social information of the external device to social information including the acquired history data, wherein the social information update unit compares the history data accumulated by the data storage unit with the updated history data of the external device, and updates the closeness degree included in the social information using at least one of a relative distance between the position information of the data processing apparatus and that of the external device, area information, and a companion frequency increase/decrease tendency, and the data output determination unit further determines whether or not to output the content data using the latest social information updated by the social information update unit.
- The data processing apparatus according to claim 4, further comprising a social information management unit that, when a data acquisition/update request from the external device is accepted via the communication unit, acquires data of the recognition dictionary used by the object analysis unit from the external device according to the closeness degree and updates the recognition dictionary, wherein the object analysis unit extracts an object included in the content data using the recognition dictionary updated by the social information management unit.
- The data processing apparatus according to claim 11, wherein, when two or more pieces of meta information for one object extractable by the object analysis unit using the recognition dictionary are associated with the social information, the data output determination unit further determines that, of the two or more pieces of meta information, the meta information associated with the higher closeness degree included in the social information is to be output preferentially, and the data output unit outputs the content data and the meta information determined to be output preferentially.
- A data processing method for outputting content data, comprising: a data output determination step of determining whether or not to output the content data, using social information including a closeness degree indicating a degree of intimacy between predetermined users and information for associating the content data with the closeness degree; and a data output step of outputting the content data when it is determined in the data output determination step that the content data is to be output, wherein, in the data output determination step, the social information is referred to and it is determined that the content data associated with the closeness degree is to be output when the closeness degree is equal to or higher than a predetermined threshold.
- A program that causes a computer to execute the steps included in the data processing method according to claim 13.
- An integrated circuit that outputs content data, comprising: a data output determination unit that determines whether or not to output the content data, using social information including a closeness degree indicating a degree of intimacy between predetermined users and information for associating the content data with the closeness degree; and a data output unit that outputs the content data when the data output determination unit determines that the content data is to be output, wherein the data output determination unit refers to the social information and determines that the content data associated with the closeness degree is to be output when the closeness degree is equal to or higher than a predetermined threshold.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11755827.0A EP2549390A4 (en) | 2010-03-18 | 2011-03-02 | DATA PROCESSING DEVICE AND DATA PROCESSING METHOD |
US13/320,383 US8650242B2 (en) | 2010-03-18 | 2011-03-02 | Data processing apparatus and data processing method |
JP2011535738A JP5570079B2 (ja) | 2010-03-18 | 2011-03-02 | データ処理装置およびデータ処理方法 |
CN201180002065.4A CN102428466B (zh) | 2010-03-18 | 2011-03-02 | 数据处理装置以及数据处理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-062420 | 2010-03-18 | ||
JP2010062420 | 2010-03-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011114634A1 true WO2011114634A1 (ja) | 2011-09-22 |
Family
ID=44648750
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/001211 WO2011114634A1 (ja) | 2010-03-18 | 2011-03-02 | データ処理装置およびデータ処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8650242B2 (ja) |
EP (1) | EP2549390A4 (ja) |
JP (1) | JP5570079B2 (ja) |
CN (1) | CN102428466B (ja) |
WO (1) | WO2011114634A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013120414A (ja) * | 2011-12-06 | 2013-06-17 | Canon Inc | 情報処理装置、情報処理装置の制御方法、プログラム |
WO2013136792A1 (ja) * | 2012-03-15 | 2013-09-19 | パナソニック株式会社 | コンテンツ処理装置、コンテンツ処理方法およびプログラム |
WO2013145518A1 (ja) * | 2012-03-28 | 2013-10-03 | ソニー株式会社 | 情報処理装置、情報処理システム、情報処理方法及びプログラム |
JP2014082582A (ja) * | 2012-10-15 | 2014-05-08 | Nippon Hoso Kyokai <Nhk> | 視聴装置、コンテンツ提供装置、視聴プログラム、及びコンテンツ提供プログラム |
WO2014097716A1 (ja) * | 2012-12-21 | 2014-06-26 | ソニー株式会社 | 情報処理装置、情報処理方法、端末、制御方法およびプログラム |
JP2016507799A (ja) * | 2012-12-03 | 2016-03-10 | 株式会社カカオ | 写真共有を推薦するサーバ及び方法、並びに、写真共有インターフェイス領域を表示するデバイス |
CN107257545A (zh) * | 2012-02-23 | 2017-10-17 | 三星电子株式会社 | 服务器及其信息提供方法 |
JP2018060552A (ja) * | 2012-11-14 | 2018-04-12 | フェイスブック,インク. | イメージ・パニングおよびズーミング効果 |
JP2019057011A (ja) * | 2017-09-20 | 2019-04-11 | ヤフー株式会社 | 判定装置、判定方法及び判定プログラム |
WO2019082606A1 (ja) * | 2017-10-24 | 2019-05-02 | パナソニックIpマネジメント株式会社 | コンテンツ管理機器、コンテンツ管理システム、および、制御方法 |
KR20190071642A (ko) * | 2019-06-07 | 2019-06-24 | 주식회사 비즈모델라인 | 친밀도를 이용한 관계형 포인트 운영 시스템 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012256105A (ja) * | 2011-06-07 | 2012-12-27 | Sony Corp | 表示装置、オブジェクト表示方法、及びプログラム |
JP2013003635A (ja) * | 2011-06-13 | 2013-01-07 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
US8452772B1 (en) * | 2011-08-01 | 2013-05-28 | Intuit Inc. | Methods, systems, and articles of manufacture for addressing popular topics in a socials sphere |
US9785628B2 (en) * | 2011-09-29 | 2017-10-10 | Microsoft Technology Licensing, Llc | System, method and computer-readable storage device for providing cloud-based shared vocabulary/typing history for efficient social communication |
CN103516926A (zh) * | 2012-06-20 | 2014-01-15 | 华晶科技股份有限公司 | 影像传输装置及其方法 |
US20140036087A1 (en) * | 2012-07-31 | 2014-02-06 | Sony Corporation | Enhancing a user experience utilizing camera location information and user device information |
KR20140068299A (ko) * | 2012-11-26 | 2014-06-09 | 한국전자통신연구원 | 소셜 네트워크 포렌식 장치 및 이 장치의 sns 데이터 분석 방법 |
JP6097632B2 (ja) * | 2013-05-10 | 2017-03-15 | キヤノン株式会社 | 撮像装置及びその制御方法、プログラム並びに記憶媒体 |
US11814088B2 (en) | 2013-09-03 | 2023-11-14 | Metrom Rail, Llc | Vehicle host interface module (vHIM) based braking solutions |
US9826541B1 (en) * | 2014-08-19 | 2017-11-21 | University Of South Florida | System and method for user-specific quality of service scheduling in wireless systems |
US11349589B2 (en) | 2017-08-04 | 2022-05-31 | Metrom Rail, Llc | Methods and systems for decentralized rail signaling and positive train control |
CN108600347A (zh) * | 2018-04-10 | 2018-09-28 | 王大江 | 一种分布式计算数据同步方法和装置 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3471605B2 (ja) | 1998-04-22 | 2003-12-02 | 日本電信電話株式会社 | メンバー親密度利用型情報提示方法および装置とメンバー親密度利用型情報提示プログラムを記録した記録媒体 |
JP2001060200A (ja) | 1999-08-23 | 2001-03-06 | Fuji Xerox Co Ltd | 検索装置 |
KR100716995B1 (ko) * | 2005-03-24 | 2007-05-10 | 삼성전자주식회사 | 개인 컨텐츠 공유를 위한 인증 및 개인 컨텐츠 전송 방법과그에 적합한 디스플레이 장치와 서버 |
JP2007034743A (ja) | 2005-07-27 | 2007-02-08 | Nippon Telegraph & Telephone East Corp | コンテンツ配信システムおよび方法、プログラム |
US9305087B2 (en) * | 2007-12-20 | 2016-04-05 | Google Technology Holdings | Method and apparatus for acquiring content-based capital via a sharing technology |
US9189137B2 (en) * | 2010-03-08 | 2015-11-17 | Magisto Ltd. | Method and system for browsing, searching and sharing of personal video by a non-parametric approach |
-
2011
- 2011-03-02 JP JP2011535738A patent/JP5570079B2/ja active Active
- 2011-03-02 CN CN201180002065.4A patent/CN102428466B/zh active Active
- 2011-03-02 WO PCT/JP2011/001211 patent/WO2011114634A1/ja active Application Filing
- 2011-03-02 EP EP11755827.0A patent/EP2549390A4/en not_active Ceased
- 2011-03-02 US US13/320,383 patent/US8650242B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001290789A (ja) * | 2000-04-10 | 2001-10-19 | Fujitsu Ltd | 通信手段選択支援装置及び方法 |
JP2007280125A (ja) * | 2006-04-07 | 2007-10-25 | Canon Inc | 情報処理装置、情報処理方法 |
JP2009124606A (ja) * | 2007-11-16 | 2009-06-04 | Sony Corp | 情報処理装置、情報処理方法、プログラム及び情報共有システム |
JP2009141952A (ja) * | 2007-11-16 | 2009-06-25 | Sony Corp | 情報処理装置、情報処理方法、コンテンツ視聴装置、コンテンツ表示方法、プログラム及び情報共有システム |
JP2009206774A (ja) * | 2008-02-27 | 2009-09-10 | Canon Inc | 画像伝送システム、画像伝送装置及び制御方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2549390A4 * |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013120414A (ja) * | 2011-12-06 | 2013-06-17 | Canon Inc | 情報処理装置、情報処理装置の制御方法、プログラム |
CN107257545A (zh) * | 2012-02-23 | 2017-10-17 | 三星电子株式会社 | 服务器及其信息提供方法 |
JPWO2013136792A1 (ja) * | 2012-03-15 | 2015-08-03 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | コンテンツ処理装置、コンテンツ処理方法およびプログラム |
WO2013136792A1 (ja) * | 2012-03-15 | 2013-09-19 | パナソニック株式会社 | コンテンツ処理装置、コンテンツ処理方法およびプログラム |
US9372874B2 (en) | 2012-03-15 | 2016-06-21 | Panasonic Intellectual Property Corporation Of America | Content processing apparatus, content processing method, and program |
US9942278B2 (en) | 2012-03-28 | 2018-04-10 | Sony Corporation | Controlling communication based on relationship between a plurality of devices |
WO2013145518A1 (ja) * | 2012-03-28 | 2013-10-03 | ソニー株式会社 | 情報処理装置、情報処理システム、情報処理方法及びプログラム |
JP2014082582A (ja) * | 2012-10-15 | 2014-05-08 | Nippon Hoso Kyokai <Nhk> | 視聴装置、コンテンツ提供装置、視聴プログラム、及びコンテンツ提供プログラム |
US10459621B2 (en) | 2012-11-14 | 2019-10-29 | Facebook, Inc. | Image panning and zooming effect |
JP2018060552A (ja) * | 2012-11-14 | 2018-04-12 | フェイスブック,インク. | イメージ・パニングおよびズーミング効果 |
JP2016507799A (ja) * | 2012-12-03 | 2016-03-10 | 株式会社カカオ | 写真共有を推薦するサーバ及び方法、並びに、写真共有インターフェイス領域を表示するデバイス |
JPWO2014097716A1 (ja) * | 2012-12-21 | 2017-01-12 | ソニー株式会社 | 情報処理装置、情報処理方法、端末、制御方法およびプログラム |
WO2014097716A1 (ja) * | 2012-12-21 | 2014-06-26 | ソニー株式会社 | 情報処理装置、情報処理方法、端末、制御方法およびプログラム |
US10848391B2 (en) | 2012-12-21 | 2020-11-24 | Sony Corporation | Information processing apparatus, information processing method, terminal, control method and program for urging an action executed by a different user based on a relationship point |
JP2019057011A (ja) * | 2017-09-20 | 2019-04-11 | ヤフー株式会社 | 判定装置、判定方法及び判定プログラム |
JP7037899B2 (ja) | 2017-09-20 | 2022-03-17 | ヤフー株式会社 | 判定装置、判定方法及び判定プログラム |
WO2019082606A1 (ja) * | 2017-10-24 | 2019-05-02 | パナソニックIpマネジメント株式会社 | コンテンツ管理機器、コンテンツ管理システム、および、制御方法 |
US11301512B2 (en) | 2017-10-24 | 2022-04-12 | Panasonic Intellectual Property Management Co., Ltd. | Content management device, content management system, and control method |
KR20190071642A (ko) * | 2019-06-07 | 2019-06-24 | 주식회사 비즈모델라인 | 친밀도를 이용한 관계형 포인트 운영 시스템 |
KR102129018B1 (ko) * | 2019-06-07 | 2020-07-03 | 주식회사 비즈모델라인 | 친밀도를 이용한 관계형 포인트 운영 시스템 |
Also Published As
Publication number | Publication date |
---|---|
US8650242B2 (en) | 2014-02-11 |
JPWO2011114634A1 (ja) | 2013-06-27 |
EP2549390A4 (en) | 2013-10-02 |
CN102428466A (zh) | 2012-04-25 |
JP5570079B2 (ja) | 2014-08-13 |
EP2549390A1 (en) | 2013-01-23 |
US20120066309A1 (en) | 2012-03-15 |
CN102428466B (zh) | 2015-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011114634A1 (ja) | データ処理装置およびデータ処理方法 | |
US11936720B2 (en) | Sharing digital media assets for presentation within an online social network | |
JP5823499B2 (ja) | コンテンツ処理装置、コンテンツ処理方法、コンテンツ処理プログラム、及び集積回路 | |
CN103023965B (zh) | 基于事件的媒体分组、回放和共享 | |
JP5068379B2 (ja) | 近接検出に基づいてメディアを拡張するための方法、システム、コンピュータプログラム、および装置 | |
US9342817B2 (en) | Auto-creating groups for sharing photos | |
JP6229655B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
TWI522821B (zh) | 相片管理系統 | |
US9521211B2 (en) | Content processing device, content processing method, computer-readable recording medium, and integrated circuit | |
US20130307997A1 (en) | Forming a multimedia product using video chat | |
CN102945276A (zh) | 生成和更新基于事件的回放体验 | |
JP2010118056A (ja) | コンテンツアルバム化装置及びコンテンツアルバム化方法 | |
CN103412951A (zh) | 基于人物照片的人脉关联分析管理系统与方法 | |
US20150189118A1 (en) | Photographing apparatus, photographing system, photographing method, and recording medium recording photographing control program | |
US20150242405A1 (en) | Methods, devices and systems for context-sensitive organization of media files | |
JP2009176032A (ja) | 情報処理装置および方法、並びにプログラム | |
CN103177051A (zh) | 相片管理系统 | |
JP2020194472A (ja) | サーバ、表示方法、作成方法、およびプログラム | |
US20230260549A1 (en) | Information processing apparatus, information processing method, and program | |
JP7266356B1 (ja) | プログラム、情報処理装置、情報処理システム及び情報処理方法 | |
US20230353795A1 (en) | Information processing apparatus, information processing method, and program | |
Sarvas | Media content metadata and mobile picture sharing | |
JP2022176567A (ja) | 再生情報生成装置、動画編集装置および動画編集プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180002065.4 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011535738 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11755827 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2011755827 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011755827 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13320383 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |