WO2010126042A1 - Content output system - Google Patents

Content output system

Info

Publication number
WO2010126042A1
WO2010126042A1 (PCT/JP2010/057464)
Authority
WO
WIPO (PCT)
Prior art keywords
content
unit
storage unit
search condition
terminal device
Prior art date
Application number
PCT/JP2010/057464
Other languages
English (en)
Japanese (ja)
Inventor
淳 新谷
寺田 智
重幸 山中
英知 大槻
Original Assignee
シャープ株式会社
Application filed by シャープ株式会社
Publication of WO2010126042A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/951 Indexing; Web crawling techniques
    • G06F16/30 Information retrieval of unstructured textual data
    • G06F16/38 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/383 Retrieval characterised by using metadata automatically derived from the content

Definitions

  • The present invention relates to a content output system that displays and reproduces content composed of images, sounds, and the like, and to a server device, a content output device, a content output method, a content output program, and a recording medium storing the content output program.
  • Patent Document 1 describes a method for enjoying the content owned by a user in combination with content on the Internet.
  • In this method, a photographic image owned by the user and a desired photographic image selected by the user from among a large number of photographic images on the Internet are arranged in combination, and these photographic images are printed out.
  • In Patent Document 1, however, the user must specify all of the content and its arrangement positions, and the input operation for this is complicated. In particular, when there is a large amount of content on the server, the desired content must be found and selected from this large amount, which is extremely troublesome for the user.
  • The present invention has been made in view of the above-described conventional problems, and its object is to provide a content output system, a server device, a content output device, a content output method, a content output program, and a recording medium storing the content output program that make it possible to efficiently search for and use useful content from a large amount of content stored in a server device or the like on a network.
  • The content output system of the present invention is a content output system that performs information communication between a terminal device and a server device through a network.
  • The terminal device includes: a first content storage unit that stores a plurality of contents; a classification unit that classifies the contents stored in the first content storage unit into one or more content groups based on a classification condition; a search condition generation unit that, for each content group, generates a search condition based on the accompanying information of the content classified into that group; a communication unit that transmits each generated search condition to the server device and receives the corresponding content in response; and an output unit that, for each content group, outputs both the content classified into the group and the content received from the server device.
  • In this configuration, the first content storage unit exists in a terminal device such as the user's own personal computer, while the second content storage unit exists in a server device on the network.
  • The contents are classified into one or more content groups based on the classification condition; for each content group, a search condition is generated based on the accompanying information of the content classified into that group; and content corresponding to the search condition can be retrieved from the contents stored in the server device.
  • Then, for each content group, the content obtained from the classification condition and the content obtained from the search condition can be output together. That is, if the classification condition is appropriately set on the terminal side, the contents stored in the first content storage unit are classified into one or more content groups based on that condition; for each group, a search condition is generated from the accompanying information of the content classified into it, content matching the search condition is retrieved on the server device side, and both the classified content and the retrieved content are output together.
  • Therefore, simply by setting the classification condition on the terminal side, mutually related contents are selected on both the terminal side and the server device side and output together. This makes it possible to efficiently retrieve useful content from the large amount of content stored in a server device or the like on the network and use it in the terminal device.
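The classify-search-output flow described above can be illustrated with a minimal sketch. Everything below is an assumption made for illustration only (the record layout, the rule of one content group per shooting place, and the equality-based server search), not the patent's actual implementation:

```python
from collections import defaultdict

# Local store (first content storage unit): photos with accompanying info.
local_photos = [
    {"name": "IMG_001", "date": "2009-05-01", "place": "kyoto"},
    {"name": "IMG_002", "date": "2009-05-01", "place": "kyoto"},
    {"name": "IMG_003", "date": "2009-08-15", "place": "tokyo"},
]

# Server store (second content storage unit): shared photos.
server_photos = [
    {"name": "SRV_101", "date": "2009-05-02", "place": "kyoto"},
    {"name": "SRV_102", "date": "2009-08-15", "place": "osaka"},
]

def classify(photos):
    """Classification unit: one content group per shooting place (assumed rule)."""
    groups = defaultdict(list)
    for p in photos:
        groups[p["place"]].append(p)
    return groups

def make_search_condition(group):
    """Search condition generation unit: derive a condition from accompanying info."""
    return {"place": group[0]["place"]}

def server_search(condition, store):
    """Server-side search unit: match accompanying info against the condition."""
    return [p for p in store if p["place"] == condition["place"]]

def output_all():
    results = {}
    for place, group in classify(local_photos).items():
        cond = make_search_condition(group)
        found = server_search(cond, server_photos)
        # Output unit: local group content and server content together.
        results[place] = [p["name"] for p in group] + [p["name"] for p in found]
    return results

print(output_all())
```

Here the "kyoto" group gains the related server photo SRV_101, while the "tokyo" group finds no server match and is output alone, mirroring the "no response" branch described later.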
  • The output unit may set a display layout for the content classified into the content group and the content received from the server device, and display and output these contents in that layout.
  • The output unit may output the content classified into the content group and the content received from the server device in a mutually identifiable manner.
  • The output unit may output the content classified into the content group and the content received from the server device together with the accompanying information of each of these contents.
  • This configuration makes it possible to check the accompanying information of each output content.
  • The terminal device may include an input operation unit for inputting the classification condition.
  • This configuration makes it possible to input and set an arbitrary classification condition.
  • The classification condition may be set in advance, changed by an input operation of the input operation unit, or newly set by an input operation of the input operation unit. Various methods for setting the classification condition can thus be applied.
  • The accompanying information may be position information or date/time information.
  • The classification unit may compare the position information or date/time information of each content stored in the first content storage unit with a threshold value, and classify the stored contents into one or more content groups accordingly.
  • Alternatively, the classification unit may arrange the contents in time series using the date/time information of each content stored in the first content storage unit, and then classify the contents into one or more content groups using the position information of each content.
  • The threshold value may be set in advance, changed by an input operation of the input operation unit, newly set by an input operation of the input operation unit, or changed based on the accompanying information of the content.
  • Various threshold-setting methods can thus be applied.
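The threshold-based classification just described can be sketched as follows: sort by shooting time, then start a new content group whenever the time gap or the distance between consecutive photos exceeds a threshold. The threshold values, record layout, and planar distance are assumptions for illustration:

```python
from datetime import datetime
from math import hypot

photos = [
    {"name": "IMG_A", "taken": "2009-05-01 10:00", "x": 0.0, "y": 0.0},
    {"name": "IMG_B", "taken": "2009-05-01 10:20", "x": 0.1, "y": 0.0},
    {"name": "IMG_C", "taken": "2009-05-03 09:00", "x": 5.0, "y": 5.0},
]

TIME_GAP_HOURS = 6.0   # assumed threshold
DIST_LIMIT = 1.0       # assumed threshold (arbitrary units)

def classify_by_thresholds(photos):
    # Arrange in time series using the date/time accompanying information.
    ordered = sorted(photos, key=lambda p: p["taken"])
    groups, current = [], [ordered[0]]
    for prev, cur in zip(ordered, ordered[1:]):
        t_prev = datetime.strptime(prev["taken"], "%Y-%m-%d %H:%M")
        t_cur = datetime.strptime(cur["taken"], "%Y-%m-%d %H:%M")
        gap_h = (t_cur - t_prev).total_seconds() / 3600
        dist = hypot(cur["x"] - prev["x"], cur["y"] - prev["y"])
        # Start a new content group when either threshold is exceeded.
        if gap_h > TIME_GAP_HOURS or dist > DIST_LIMIT:
            groups.append(current)
            current = [cur]
        else:
            current.append(cur)
    groups.append(current)
    return groups

groups = classify_by_thresholds(photos)
print([[p["name"] for p in g] for g in groups])
```

With these sample values, the first two photos (20 minutes and a short distance apart) fall into one group and the third starts a new group.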
  • The terminal device may transmit the accompanying information of each content stored in the first content storage unit to the server device; the server device may then derive a classification condition based on the accompanying information received from the terminal device and transmit this classification condition to the terminal device.
  • The content transmitted from the server device to the terminal device may include an address on the Internet. In response to an input operation on the terminal device, the terminal device sends this address to the server device or another server; the server that received the address collects information based on it and sends the information back to the terminal device, which displays it.
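The address-based information collection described above can be sketched as a simple exchange. The resource table, address, and function names are hypothetical stand-ins; the actual network transfer is elided:

```python
# Stand-in for the resources the server can reach via an address.
pages = {"http://example.invalid/artist": "Artist profile and related works"}

def server_collect(address):
    """Server device: collect information based on the received address."""
    return pages.get(address, "no information found")

def terminal_show(content):
    """Terminal device: send the address embedded in the content, display the reply."""
    info = server_collect(content["address"])  # network exchange elided
    return f"{content['name']}: {info}"

photo = {"name": "IMG_010", "address": "http://example.invalid/artist"}
print(terminal_show(photo))
```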
  • The content output system of the present invention may also be configured as a system in which both content storage units reside in the server device. In this configuration, the server device includes: a first content storage unit that stores a plurality of contents; a second content storage unit that stores a plurality of contents; a communication unit that transmits the contents stored in the first content storage unit to the terminal device, receives a search condition from the terminal device as a response, and transmits content matching the condition back to the terminal device; and a search unit that searches the contents stored in the second content storage unit for content matching the received search condition.
  • The terminal device includes: a communication unit that receives the contents of the first content storage unit from the server device, transmits the search condition generated for each content group to the server device, and receives the matching content as a response; a classification unit that classifies the received contents into one or more content groups based on a classification condition; a search condition generation unit that, for each content group, generates a search condition based on the accompanying information of the content classified into that group; and an output unit that, for each content group, outputs both the content classified into the group and the content received from the server device.
  • In this configuration, the first and second content storage units both exist in the server device on the network. The contents stored in the first content storage unit are classified, on a terminal device such as a personal computer, into one or more content groups based on the classification condition; for each group, a search condition is generated from the accompanying information of the content classified into it; and content corresponding to the search condition can be retrieved from the contents stored in the second content storage unit of the server device.
  • The content obtained from the classification condition and the content obtained from the search condition can then be output together.
  • That is, if the classification condition is appropriately set on the terminal side, the contents stored in the first content storage unit of the server device are classified into one or more content groups based on that condition; for each group, a search condition is generated from the accompanying information of the content classified into it, matching content is retrieved from the second content storage unit on the server device side, and the classified content and the retrieved content are output together. Therefore, by setting the classification condition on the terminal side, mutually related contents are selected between the first and second content storage units of the server device and output together, and the user can efficiently search for useful content from the large amount of content stored in a server device or the like on the network and use it on the terminal device.
  • The content output system of the present invention may also be configured so that the server device itself performs the classification and the search. In this configuration, the server device includes: a first content storage unit that stores a plurality of contents; a second content storage unit that stores a plurality of contents; a classification unit that classifies the contents stored in the first content storage unit into one or more content groups based on a classification condition; a search condition generation unit that, for each content group, generates a search condition based on the accompanying information of the content classified into that group; and a search unit that, for each content group, searches the contents stored in the second content storage unit for content corresponding to the generated search condition.
  • In this configuration, the first and second content storage units both exist in the server device, which classifies the contents of the first content storage unit into one or more content groups based on the classification condition, generates a search condition for each group from the accompanying information of its content, and retrieves content corresponding to each search condition from the second content storage unit. For each content group, the content classified into the group and the content obtained from the search condition are then transmitted from the server device to a terminal device such as a personal computer, which outputs them.
  • In this way, mutually related content is selected between the first and second content storage units of the server device and output on the terminal device, so the user can efficiently search for useful content from the large amount of content stored in the server device and use it in the terminal device.
  • The server device of the present invention performs information communication with a terminal device through a network, and includes: a first content storage unit that stores a plurality of contents; a second content storage unit that stores a plurality of contents; a classification unit that classifies the contents stored in the first content storage unit into one or more content groups based on a classification condition; a search condition generation unit that, for each content group, generates a search condition based on the accompanying information of the content classified into that group; a search unit that, for each content group, searches the contents stored in the second content storage unit for content corresponding to the generated search condition; and a communication unit that, for each content group, transmits both the content classified into the group and the content found by the search unit to the terminal device.
  • This server device can classify the contents stored in the first content storage unit into one or more content groups based on the classification condition, generate a search condition for each group from the accompanying information of its content, retrieve content corresponding to each search condition from the second content storage unit, and transmit both the classified content and the retrieved content to the terminal device through the network. That is, according to the server device of the present invention, mutually related content is selected between the first and second content storage units and transmitted to the terminal device, so the user can efficiently search for useful content from the large amount of content stored in a server device or the like on the network and use it on the terminal device.
  • The content output device of the present invention includes: a first content storage unit that stores a plurality of contents; a second content storage unit that stores a plurality of contents; a classification unit that classifies the contents stored in the first content storage unit into one or more content groups based on a classification condition; a search condition generation unit that, for each content group, generates a search condition based on the accompanying information of the content classified into that group; a search unit that, for each content group, searches the contents stored in the second content storage unit for content corresponding to the generated search condition; and an output unit that, for each content group, outputs both the content classified into the group and the content found by the search unit.
  • This output device can classify the contents stored in the first content storage unit into one or more content groups based on the classification condition, generate a search condition for each group from the accompanying information of its content, retrieve content corresponding to each search condition from the second content storage unit, and output both the classified content and the retrieved content to the output unit. That is, according to the content output device of the present invention, mutually related content is selected between the first and second content storage units and output together, so the user can efficiently search for and use useful content from the large amount of content stored in the output device.
  • The content output method of the present invention includes: a first content storage step of storing a plurality of contents; a second content storage step of storing a plurality of contents; a classification step of classifying the contents stored in the first content storage step into one or more content groups based on a classification condition; a search condition generation step of generating, for each content group, a search condition based on the accompanying information of the content classified into that group; a search step of searching, for each content group, the contents stored in the second content storage step for content corresponding to the generated search condition; and an output step of outputting, for each content group, both the content classified into the group and the content found in the search step.
  • By this method, the contents stored in the first content storage step can be classified into one or more content groups based on the classification condition; for each group, a search condition is generated from the accompanying information of its content, content corresponding to the search condition is retrieved from the contents stored in the second content storage step, and the classified content and the retrieved content can be output together. That is, according to the content output method of the present invention, mutually related contents are selected from the contents stored in the first content storage step and those stored in the second content storage step, and output together. Therefore, by causing a terminal device such as a computer to execute this content output method, the user of the terminal device can efficiently search for and use useful content from the large amount of content stored in a server device on the network.
  • Such a content output method can be realized as a content output program that causes a computer to execute each step, and this content output program can be recorded on a computer-readable recording medium and provided.
  • The computer can implement the present invention by reading the program from a recording medium, or by receiving the program through a communication network, and executing it.
  • The processing can also be distributed across a plurality of terminals, so the program can be applied not only to a single terminal such as a computer but also to a system.
  • As described above, the present invention can provide a content output system, a server device, a content output device, a content output method, a content output program, and a recording medium storing the content output program, capable of efficiently searching for and using useful content from a large amount of content stored in a server device or the like on a network.
  • FIG. 1 is a block diagram showing an embodiment of a content output system of the present invention.
  • FIG. 2 is a flowchart showing content search and output processing in the terminal device of FIG. 1.
  • FIG. 3 is a diagram illustrating a list of accompanying information of photographic images displayed on the screen of the terminal device of FIG. 1.
  • FIG. 4 is a diagram illustrating a content group including a photographic image whose shooting position falls within a wide area.
  • FIG. 5 is a flowchart showing processing for classifying photographic images into content groups in the terminal device of FIG. 1.
  • FIG. 6 is a diagram illustrating search conditions given from the terminal device of FIG. 1 to the server device.
  • FIG. 7 is a diagram exemplifying accompanying information of a photographic image retrieved by the server device of FIG. 1.
  • FIG. 8 is a diagram showing an example of a display form of content on the screen of the terminal device of FIG. 1, in which (a) shows a display example of a photographic image stored in the second content storage unit and (b) shows a display example of a photographic image stored in the first content storage unit.
  • FIG. 9 is a diagram illustrating another display form of the content on the screen of the terminal device of FIG. 1.
  • FIG. 10 is a diagram for explaining an operation when deleting the content on the screen in the terminal device of FIG. 1.
  • FIG. 11 is a diagram for explaining a work purchase screen as another content display form in the terminal device of FIG. 1, showing a display example of a photographic image.
  • FIG. 12 is a flowchart showing processing for displaying the work purchase screen shown in FIG. 11.
  • FIG. 13 is a diagram illustrating still another display form of content in the terminal device of FIG. 1.
  • FIG. 14 is a diagram illustrating an example in which a product image is displayed as content on the screen of the terminal device of FIG. 1, in which (a) shows a display example of the product image stored in the first content storage unit and (b) shows a display example of the product image stored in the second content storage unit.
  • FIG. 15 is a diagram showing another example in which a product image is displayed as content on the screen of the terminal device of FIG. 1.
  • FIG. 16 is a block diagram showing a modification of the terminal device of FIG.
  • FIG. 17 is a block diagram showing a modification of the system of FIG.
  • FIG. 18 is a block diagram showing another modification of the system of FIG.
  • FIG. 19 is a block diagram showing an embodiment of the content output apparatus of the present invention.
  • FIG. 1 is a block diagram showing an embodiment of a content output system of the present invention.
  • The system of this embodiment includes a terminal device 101 on the user side, a server device 201, and a network N that connects the terminal device 101 and the server device 201 to each other.
  • The terminal device 101 includes a personal computer and its peripheral devices, and comprises an input unit 102, a content management unit 103, a first content storage unit 104, a content classification unit 105, a search condition generation unit 106, a communication unit 107, a display generation unit 108, a display unit 109, and the like.
  • The input unit 102 is a keyboard, a mouse, or the like.
  • The functions of the content management unit 103, the content classification unit 105, the search condition generation unit 106, and the display generation unit 108 are realized by the CPU reading out and executing programs stored in the ROM.
  • The first content storage unit 104 is a storage device such as a hard disk device.
  • The communication unit 107 is a communication interface or the like, and performs data communication with the server device 201 through the network N.
  • The display unit 109 is a display device such as a liquid crystal display device.
  • The first content storage unit 104 is not limited to a hard disk device, and may be an external storage medium such as an SD card, a DVD medium, or a BD medium that can be read by the terminal device 101.
  • The server device 201 includes a computer and its peripheral devices, and comprises a communication unit 202, a search unit 203, a conversion table storage unit 204, a second content storage unit 205, and the like.
  • The communication unit 202 is a communication interface or the like, and performs data communication with the terminal device 101 through the network N.
  • The functions of the search unit 203 are realized by the CPU reading out and executing a program stored in the ROM.
  • The conversion table storage unit 204 and the second content storage unit 205 are storage devices such as hard disk devices.
  • A plurality of contents are input and stored in the first content storage unit 104 of the terminal device 101 through an interface (not shown) of the terminal device 101.
  • The content in the first content storage unit 104 is the personal property of the user of the terminal device 101.
  • In the second content storage unit 205 of the server device 201, a large number of contents are input and stored through the network N and an interface (not shown) of the server device 201.
  • The content in the second content storage unit 205 is shared material that can be used by an unspecified number of people.
  • These contents are still images such as photographic images, and have accompanying information indicating the shooting date and time, the shooting position, and the like.
  • After the power switch of the terminal device 101 is turned on and the terminal device 101 is activated, when content classification is instructed by an input operation of the input unit 102 (step S301), the content management unit 103 responds by searching the content in the first content storage unit 104, generating a list of the found content, and displaying the content list on the screen of the display unit 109 through the display generation unit 108 (step S302).
  • At this time, a list of all the content in the first content storage unit 104 may be generated, or a list of only part of the content, narrowed down by content creation period, age, folder hierarchy, specific tags, and the like, may be generated.
  • When the classification of contents is instructed by an input operation of the input unit 102 together with a creation period, age, folder hierarchy, specific tag, or the like, the content management unit 103 selects the content corresponding to the instruction and generates a list of the selected content; for example, content thumbnails and file names are displayed as a list.
  • The list of contents is displayed on the screen of the display unit 109 and simultaneously output to the content classification unit 105.
  • The content classification unit 105 refers to the accompanying information of each content in the list, and classifies all the listed content into one or more content groups based on the classification condition (step S303).
  • This classification condition may be selected, by an input operation of the input unit 102, from among a plurality of preset classification conditions, or may be set directly by an input operation of the input unit 102.
  • The search condition generation unit 106 receives the content classified into each content group from the content classification unit 105 via the content management unit 103, and, for each content group, refers to the accompanying information of the content classified into that group and generates a search condition from the accompanying information (step S304). At this time, a search condition may be generated each time the content classification unit 105 creates one content group, or the search conditions may be generated for all content groups after the content classification unit 105 has finished classifying all the content in the first content storage unit 104 and all the content groups have been created.
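One plausible form of such a search condition is the date range covered by a content group plus a bounding box around its shooting positions. This sketch is an assumed illustration of step S304; the condition format and field names are not taken from the patent:

```python
photos_in_group = [
    {"taken": "2009-05-01", "lat": 35.0, "lon": 135.7},
    {"taken": "2009-05-02", "lat": 35.1, "lon": 135.8},
]

def generate_search_condition(group):
    """Derive a search condition from the accompanying info of one content group:
    the covered date range and a bounding box around the shooting positions."""
    dates = sorted(p["taken"] for p in group)
    lats = [p["lat"] for p in group]
    lons = [p["lon"] for p in group]
    return {
        "date_from": dates[0], "date_to": dates[-1],
        "lat_min": min(lats), "lat_max": max(lats),
        "lon_min": min(lons), "lon_max": max(lons),
    }

cond = generate_search_condition(photos_in_group)
print(cond["date_from"], cond["date_to"])
```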
  • When the content management unit 103 receives a search condition generated by the search condition generation unit 106, it transmits the search condition from the communication unit 107 to the server device 201 through the network N, requesting from the server device 201 the content corresponding to the search condition (step S305).
  • In the server device 201, the search condition from the terminal device 101 is received by the communication unit 202 through the network N and input to the search unit 203.
  • The search unit 203 refers to the conversion table in the conversion table storage unit 204 and converts the search condition so that it matches the accompanying information of the content in the second content storage unit 205. It then refers to the accompanying information of each content in the second content storage unit 205, searches for content whose accompanying information corresponds to the converted search condition, and transmits the found content from the communication unit 202 to the terminal device 101 through the network N.
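The role of the conversion table can be sketched as a field-name mapping between the terminal-side condition and the server-side accompanying information. The table contents, field names, and equality matching are assumptions for illustration:

```python
# Conversion table: terminal-side condition fields -> server-side field names.
conversion_table = {"date": "shoot_date", "place": "location"}

server_store = [
    {"name": "SRV_201", "shoot_date": "2009-05-01", "location": "kyoto"},
    {"name": "SRV_202", "shoot_date": "2009-07-20", "location": "nara"},
]

def convert(condition):
    """Rewrite the search condition to match server-side field names."""
    return {conversion_table[k]: v for k, v in condition.items()}

def search(condition):
    """Search unit: match accompanying info against the converted condition."""
    converted = convert(condition)
    return [c["name"] for c in server_store
            if all(c.get(field) == value for field, value in converted.items())]

print(search({"date": "2009-05-01", "place": "kyoto"}))
```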
  • The terminal device 101 waits for a response from the server device 201 (step S306). When content from the second content storage unit 205 having accompanying information corresponding to the search condition is received by the communication unit 107 via the network N ("yes" in step S306), the received content is input to the content management unit 103 (step S307).
  • the content management unit 103 outputs the received content in the second content storage unit 205 and the content of the content group classified in step S303 to the display generation unit 108.
• when these contents are input, the display generation unit 108 generates a display layout for them (step S308) and displays and outputs them together on the screen of the display unit 109 in that display layout (step S309).
• if no response is received from the server device 201 and no content arrives from it ("no" in step S306), only the contents of the content group classified in step S303 are displayed and output on the screen of the display unit 109 (steps S308 and S309).
• next, it is determined whether or not the processing of steps S304 to S309 has been completed for all content groups into which the content classification unit 105 classified the contents (step S310); if not, the processing of steps S304 to S309 is repeated, and when it has been completed ("yes" in step S310), the process of FIG. 2 ends.
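The terminal-side flow of steps S304 to S310 can be outlined as follows. This is an illustrative sketch, not the patented implementation; the helper names (`generate_search_condition`, `request_matching_content`, `display`) and the use of the mean shooting position as the search condition are assumptions standing in for the units described in the text.

```python
# Illustrative sketch of the terminal-side loop (steps S304-S310).
# The helpers stand in for the search condition generation unit 106,
# communication unit 107, and display generation unit 108.

def generate_search_condition(group):
    # Step S304: derive a search condition from the accompanying
    # information of the contents in one content group (here, the
    # centre of their shooting positions; an assumption).
    positions = [c["gps"] for c in group if "gps" in c]
    if not positions:
        return None
    lat = sum(p[0] for p in positions) / len(positions)
    lon = sum(p[1] for p in positions) / len(positions)
    return {"center": (lat, lon)}

def process_groups(groups, request_matching_content, display):
    for group in groups:                                  # loop guarded by step S310
        condition = generate_search_condition(group)      # step S304
        received = []
        if condition is not None:
            received = request_matching_content(condition)  # steps S305-S307
        display(group + received)                         # steps S308-S309
```

A caller would pass a function that performs the round trip to the server device 201 as `request_matching_content`; when no search condition can be derived, only the local group is displayed, matching the "no" branch of step S306.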
• the process then proceeds to the processing from step S302S onward and the content classification is started.
• alternatively, the process may proceed immediately to step S302S and subsequent steps.
• the same may be done when a recording medium such as a flash memory is used.
• when a card such as an IC card, or a mobile phone having a card function, is held over the terminal device 101 while it is operating, the card is scanned to determine whether specific information is recorded on it; when it is determined that such information is present, the process may proceed to step S302S and subsequent steps. As a result, the user can proceed to the processing from step S302S onward simply by having the card scanned.
• for example, the card is scanned by a reader (such as a card reader or Felica reader), and it is determined whether an address such as a path on the terminal device 101 (C:\user), a path on the server device 201 (\\10.23.45.67\japan), or a URL on the Internet (http://pro...) is recorded on the card; when it is determined that such an address is recorded, the process proceeds to step S302S and subsequent steps.
• an address indicating the location of the first content storage unit 104 or the second content storage unit 205 may be set as this address, and the address delivered to an application that executes the processing of the flowchart of FIG. 2.
• the first content storage unit 104 and the second content storage unit 205 may then be accessed based on this address.
• alternatively, an application for executing the processing of the flowchart of FIG. 2 may be activated.
• alternatively, this fact may first be displayed on the screen of the display unit 109, and the application started in response to a subsequent instruction given by an input operation of the input unit 102.
• alternatively, the first content storage unit 104 or the second content storage unit 205 may be accessed based on this address, the content stored in these storage units confirmed, and then the application activated.
• alternatively, user information is set as the specific information recorded on the card, and a correspondence table between the user information and the addresses of the first content storage unit 104 and the second content storage unit 205 is stored in advance in the memory of the terminal device 101. When the user information is read from the card, the addresses of the first content storage unit 104 and the second content storage unit 205 corresponding to that user information are obtained by referring to the table in the memory, these addresses are delivered to the application that executes the processing of the flowchart of FIG. 2, and the first content storage unit 104 and the second content storage unit 205 may be accessed based on them. A process may also be provided so that the user can create the correspondence table between the user information and the addresses of the first content storage unit 104 and the second content storage unit 205. Furthermore, a password, a personal name, a membership number, a fingerprint, or the like may be set as the user information; in the case of a fingerprint, the fingerprint must be recognized and identified.
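The correspondence table between user information and storage addresses can be sketched as a simple lookup; the key, paths, and URL below are hypothetical examples, not values from the patent.

```python
# Hypothetical correspondence table between user information read from a
# card and the addresses of the first and second content storage units.
ADDRESS_TABLE = {
    "user-123": {
        "first_storage": r"C:\user\photos",                 # local path (example)
        "second_storage": "http://example.com/photos",      # server address (example)
    },
}

def resolve_storage_addresses(user_info):
    """Look up the storage addresses for the user information on the card."""
    entry = ADDRESS_TABLE.get(user_info)
    if entry is None:
        raise KeyError(f"no storage addresses registered for {user_info!r}")
    return entry["first_storage"], entry["second_storage"]
```

The resolved pair would then be handed to the application that runs the flowchart of FIG. 2, which accesses the two storage units based on these addresses.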
• content classification conditions may also be stored on a recording medium or card; when the card is held over the terminal, the classification conditions are read from the card and transferred to the content classification unit 105, and the content classification by the content classification unit 105 may then be started.
  • the content classification will be explained.
  • the first content storage unit 104 stores a plurality of photographic images that are personal belongings acquired by the user of the terminal device 101.
  • the second content storage unit 205 stores a large number of photographic images provided by, for example, a photographic service company, which are shared materials that can be used by an unspecified number of people.
  • the content management unit 103 searches for a photographic image in the first content storage unit 104. Then, a list of the retrieved photographic images is generated, and the photographic image list is displayed on the screen of the display unit 109 through the display generation unit 108.
  • FIG. 3 illustrates a list of photographic images.
  • other information may be included as accompanying information, and the types of information may be increased or decreased.
• although the accompanying information of the photographic image is described here in XML, it may instead be described in another description language, as binary data, or in a structure data format handled within the program.
  • the list of photographic images is displayed on the screen of the display unit 109 and simultaneously output to the content classification unit 105.
  • the content classification unit 105 refers to the accompanying information of each photographic image in this list and classifies each photographic image into several content groups based on the classification condition.
• for example, the content classification unit 105 focuses on the longitude information <gps-long> of the shooting position, which is accompanying information of the photographic image, sets the shooting position of one photographic image as a base point by referring to the longitude information <gps-long> of each photographic image, and classifies the photographic images under a classification condition in which all photographic images whose shooting positions fall within a certain area around that base point form one content group.
• to do this, the separation distance between shooting positions is calculated. For example, the separation distance between the shooting position of the base point and that of the photographic image with identifier <picture id="4"> might be 7.5 km; the separation distance of the shooting position from the base point is compared with a threshold value, and the photographic image is classified into the same content group as the base point when the separation distance is less than the threshold value.
  • the separation distance of the shooting position of the photographic image from the base point is obtained for each base point, the separation distance of the shooting position of the photographic image is compared with a threshold value, and this photographic image is Decide whether to classify into the same content group as the photographic image of the base point. Thereby, a plurality of content groups can be obtained.
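The base-point classification above can be sketched as follows. The function names are assumptions, and the distance formula (an equirectangular approximation over latitude and longitude) is an assumption too: the text does not prescribe how the separation distance is computed.

```python
import math

def separation_km(pos_a, pos_b):
    """Approximate distance between two (lat, lon) points in km.
    An equirectangular approximation; adequate for the small areas
    considered here (assumption -- the text fixes no formula)."""
    lat1, lon1 = map(math.radians, pos_a)
    lat2, lon2 = map(math.radians, pos_b)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6371.0 * math.hypot(x, y)

def classify_by_position(images, threshold_km):
    """Assign each image to the group whose base point (the group's
    first image) lies within threshold_km of its shooting position;
    otherwise the image becomes the base point of a new group."""
    groups = []
    for img in images:
        for group in groups:
            if separation_km(group[0]["gps"], img["gps"]) < threshold_km:
                group.append(img)
                break
        else:
            groups.append([img])
    return groups
```

Each image carries its shooting position as a `(lat, lon)` pair under the key `"gps"`; multiple content groups emerge naturally as images far from every existing base point open new groups.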
• not only the longitude information <gps-long> but also the latitude information <gps-lat> may be used, or both may be used, to determine the separation distance between the shooting positions of the photographic images.
  • the threshold value may be set and stored in advance in the memory of the terminal apparatus 101, or may be changed or set as appropriate by an input operation of the input unit 102 by the user. It is also possible to change the threshold according to the base point, shooting date and time, content type, and the like.
  • the entire shooting range may be obtained from the shooting positions of all the photographic images in the first content storage unit 104, and a threshold value corresponding to the entire shooting range may be calculated. For example, when the entire shooting range is wide, the threshold value is increased, and when the entire shooting range is narrow, the threshold value is set small. As a result, the number of photographic images included in each content group can be adjusted.
  • the threshold value may be inquired from the terminal device 101 to the server device 201.
• for this purpose, the server device 201 includes a data table in which areas and threshold values are associated with each other. The server device 201 determines the area containing the shooting position received from the terminal device 101, searches the data table for the threshold value corresponding to this area, and transmits that threshold value to the terminal device 101. More specifically, the data table may set a large threshold value corresponding to a large area such as that of Mt.
  • FIG. 4 is a diagram illustrating a content group including a photographic image at a shooting position that falls within a wide area.
• the content classification unit 105 refers to the shooting date/time information <time> of each photo image in the content list of FIG. 3 and sorts the photo images in order of shooting date/time. It then selects each photo image in turn from the top, calculates the difference between the shooting date/time of one photo image and that of the next, and determines whether this difference is less than a threshold value. If it is less than the threshold, the next photo image is included in the content group of the current photo image; if it is equal to or greater than the threshold, a new content group is set and the next photo image is placed in that new content group. Photo images taken at short intervals can thereby be classified into one content group.
• when the next photographic image is selected, the time difference from the preceding photographic image that does not yet belong to any content group is calculated, and if this time difference is less than the threshold, the photographic images are grouped into the new content group.
• this threshold value can be set by various methods, like the threshold compared with the separation distance of the shooting positions described above. For example, the maximum time difference over all the photographic images in the first content storage unit 104 may be calculated, with the threshold set large when this maximum is large and small when it is small. The number of photographic images included in each content group can thereby be adjusted.
  • content groups can be created in units of one day, one week, one month, one year, and even seasonal, morning and night.
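The time-based classification can be sketched as a sort followed by a split at large gaps; function and key names are assumptions, with the threshold given in seconds.

```python
from datetime import datetime

def classify_by_time(images, threshold_seconds):
    """Sort images by shooting date/time and start a new content group
    whenever the gap to the previous image reaches the threshold."""
    ordered = sorted(images, key=lambda i: i["time"])
    groups = []
    for img in ordered:
        if groups:
            gap = (img["time"] - groups[-1][-1]["time"]).total_seconds()
            if gap < threshold_seconds:
                groups[-1].append(img)   # short interval: same content group
                continue
        groups.append([img])             # large gap: open a new content group
    return groups
```

Choosing the threshold as one day, one week, or one month yields the coarser groupings mentioned above; a season- or morning/night-based grouping would instead bucket by calendar fields rather than by gap.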
• the date and time information may be transmitted from the terminal device 101 to the server device 201, a list of contents whose dates and times are close to that information created on the server device 201 side, and the list returned to the terminal device 101.
  • photographic images may be classified into content groups by using both the shooting date and the shooting position instead of selectively using the shooting date and the shooting position.
  • the photo images are rearranged in the order of their shooting date / time.
• the content classification unit 105 sequentially selects each photographic image from the top, obtains the separation distance between the shooting position of one photographic image and that of the next, and determines whether this separation distance is less than the threshold value; if it is less than the threshold, the next photographic image is classified into the same content group as the current one, and if it is equal to or greater than the threshold, a new content group is set and the next photographic image is classified into this new content group.
  • the photographic images can be classified into content groups using the shooting position and the shooting date / time.
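The combined method (order by shooting date/time, then split on shooting-position distance) can be sketched as follows; the names and the equirectangular distance approximation are assumptions, as before.

```python
import math
from datetime import datetime

def _separation_km(a, b):
    # Equirectangular approximation (assumption; the text fixes no formula).
    la1, lo1, la2, lo2 = map(math.radians, (*a, *b))
    x = (lo2 - lo1) * math.cos((la1 + la2) / 2)
    return 6371.0 * math.hypot(x, la2 - la1)

def classify_by_time_then_position(images, distance_km):
    """Arrange images by shooting date/time, then start a new content
    group whenever the shooting-position distance to the previous image
    reaches the threshold."""
    ordered = sorted(images, key=lambda i: i["time"])
    groups = []
    for img in ordered:
        if groups and _separation_km(groups[-1][-1]["gps"], img["gps"]) < distance_km:
            groups[-1].append(img)
        else:
            groups.append([img])
    return groups
```

Because the images are first placed in chronological order, a return to the same location later in the day still opens a new group if the intervening images were taken far away, which matches the behaviour described above.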
• when the content classification unit 105 receives the list, it refers to the accompanying information of each photographic image, arranges the photographic images listed in the list in order of shooting date and time, and then determines, for each photographic image, whether a shooting position is present (step S601). If there is a photographic image accompanied by a shooting position, the shooting position of that photographic image is set as a base point, and shooting date/time information is acquired from its accompanying information (step S602).
• next, the content classification unit 105 acquires shooting date/time information from the accompanying information of the photographic image one earlier in shooting order than the base-point photographic image (step S603), obtains the time difference between the shooting dates/times of these photographic images, and determines whether this time difference is less than a threshold value (step S604). If it is less than the threshold ("yes" in step S604), the process returns to step S603, and shooting date/time information is acquired from the accompanying information of the photographic image one further back in shooting order (step S603).
• the time difference between the shooting date/time of that photographic image and the one whose shooting date/time information was acquired in the immediately preceding step S603 is obtained, and it is determined whether this time difference is less than the threshold value (step S604). Thereafter, similarly, while the difference remains less than the threshold ("yes" in step S604), the process returns to step S603, shooting date/time information is acquired from the accompanying information of the photographic image one earlier in shooting order (step S603), the time difference between the shooting dates/times of the two consecutively arranged photographic images is obtained, and it is determined whether it is less than the threshold value (step S604).
• when the time difference is equal to or greater than the threshold value ("no" in step S604), the process returns to the photographic image whose shooting date/time is one after the photographic image whose shooting date/time information was acquired in the immediately preceding step S603, and that photographic image is set as the first photographic image of the content group (step S605).
• in this way, tracing back in time, the photographic images are sequentially included in one content group.
• next, the content classification unit 105 returns to the base-point photographic image (step S606), acquires shooting date/time information from the accompanying information of the photographic image one later in shooting order (step S607), obtains the time difference between the shooting dates/times of these photographic images, and determines whether this time difference is less than the threshold value (step S608). If it is less than the threshold ("yes" in step S608), the process returns to step S607, and shooting date/time information is acquired from the accompanying information of the photographic image with the next shooting date/time (step S607).
• the time difference between the shooting date/time of that photographic image and the one whose shooting date/time information was acquired in the immediately preceding step S607 is obtained, and it is determined whether this time difference is less than the threshold value (step S608). Thereafter, similarly, while the difference remains less than the threshold ("yes" in step S608), the process returns to step S607, shooting date/time information is acquired from the accompanying information of the photographic image with the next shooting date/time (step S607), the time difference between the shooting dates/times of the two adjacent photographic images on the list is obtained, and it is determined whether it is less than the threshold value (step S608).
• when the time difference is equal to or greater than the threshold value ("no" in step S608), the process returns to the photographic image one earlier in shooting order than the one whose shooting date/time information was acquired in the immediately preceding step S607, and that photographic image is set as the last photographic image of the content group (step S609).
• in this way, advancing forward in time, the photographic images are sequentially included in one content group.
  • the first photographic image and the last photographic image are obtained, and the contents from the first photographic image to the last photographic image are set as one content group (step S610).
• in the processing of steps S601 to S605 described above, the photographic images listed in the list are arranged in order of shooting date and time so that the shooting date/time information of the earlier photographic images can be acquired in order, going back from the shooting date/time of the base-point photographic image; likewise, in steps S606 to S609, the shooting date/time information of the later photographic images is acquired in order, starting from the shooting date/time closest to that of the base-point photographic image. If the shooting date/time information of the preceding and following photographic images can be acquired in these orders directly, the process of arranging the listed photographic images in order of shooting date and time may be omitted.
  • a classification condition may be used in which photographic images taken within a certain time centered on the photographing date and time of the base photographic image are used as one content group. For example, if the shooting date and time of the base photographic image is AM 9:00 and the fixed time is 2 hours, all the photographic images whose shooting date and time are in the range of AM 8:00 to AM 10:00 are combined into one content group. Put together.
• when comments or marks are attached to photographic images as accompanying information, the photographic images can be classified into content groups using these comments or marks.
  • the content classification unit 105 refers to the accompanying information of each photographic image in the list of photographic images, determines the presence or absence of comment information ⁇ comment> for each photographic image, and extracts only photographic images having the comment information ⁇ comment>. Into one content group.
• when another photographic image having comment information <comment> is found, each photographic image in order up to and including that photographic image is collected into one content group, and the photographic images from the next one up to the following photographic image having comment information <comment> are collected into another content group.
• alternatively, the photographic images in order up to the one immediately before a photographic image having comment information <comment> may be collected into one content group, and the photographic images from that commented image up to the one immediately before the next photographic image having comment information <comment> may be collected into another content group.
  • each photographic image can be grouped into a content group using a mark as an index.
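The comment-as-boundary grouping can be sketched as follows; this implements the first alternative above (an image carrying a comment closes the current group), and the key name `"comment"` is an assumption.

```python
def classify_by_comment(images):
    """Collect images into a group up to and including each image that
    carries comment information; the next image then starts a new group."""
    groups, current = [], []
    for img in images:
        current.append(img)
        if img.get("comment"):           # boundary: image with a comment
            groups.append(current)
            current = []
    if current:                          # trailing images without a comment
        groups.append(current)
    return groups
```

Grouping by a mark instead of a comment only changes the key tested at the boundary; the second alternative (the commented image opens rather than closes a group) would append the boundary check before, not after, adding the image.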
  • the user may manually classify.
• as a manual classification method, for example, thumbnails of the photographic images are displayed on the screen and the user selects the photographic images belonging to each content group, or a slide show of the photographic images is displayed and the user sequentially inputs and specifies the photographic images belonging to each content group.
  • the classification condition may be set by default, or may be input or changed by an input operation of the input unit 102.
• alternatively, a plurality of types of classification conditions may be set in advance and displayed on the screen of the display unit 109, and one of the classification conditions on the screen selected by an input operation of the input unit 102.
• in this case, the process may proceed to step S303 only after a classification condition has been input or selected; that is, proceeding to step S303 may be prohibited unless a classification condition is input or selected.
• the search condition generation unit 106 uses the accompanying information of the photographic images included in a content group to generate a search condition for retrieving photographic images related to that content group from among the photographic images in the second content storage unit 205 of the server device 201. For example, the shooting positions are acquired from the accompanying information of the photographic images of the content group in FIG. 4, and the center position of these shooting positions is obtained as the search condition. This search condition is transmitted from the terminal device 101 to the server device 201.
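Deriving the centre position as a search condition can be sketched as follows; the text does not fix how the "centre position" is computed, so the arithmetic mean of the shooting positions used here is an assumption.

```python
def center_of_positions(images):
    """Search-condition sketch: the centre of the shooting positions of
    the images in one content group, taken as the arithmetic mean of
    their (lat, lon) pairs (an assumption; the text fixes no method)."""
    positions = [img["gps"] for img in images]
    lat = sum(p[0] for p in positions) / len(positions)
    lon = sum(p[1] for p in positions) / len(positions)
    return (lat, lon)
```

The resulting `(lat, lon)` pair is what the terminal device 101 would transmit to the server device 201 as the search condition.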
  • the search condition from the terminal device 101 is received by the communication unit 202 and input to the search unit 203.
  • the search unit 203 refers to the correspondence table in the conversion table storage unit 204 and searches for the identifier of the photographic image corresponding to the center position as the search condition.
• the conversion table storage unit 204 stores in advance a correspondence table in which areas containing many positions are associated with identifiers of photographic images; by referring to this correspondence table, the identifier of the photographic image corresponding to the area containing a specified position can be retrieved.
• when the search unit 203 has retrieved the identifier corresponding to the center position serving as the search condition, it refers to the accompanying information of each photographic image in the second content storage unit 205 and searches for photographic images that include the retrieved identifier in their accompanying information.
  • the retrieved photographic image is transmitted from the server device 201 to the terminal device 101.
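The server-side lookup (correspondence table, then accompanying-information search) can be sketched as follows; the table contents, bounding-box area representation, and identifiers are hypothetical examples.

```python
# Hypothetical server-side correspondence table: each entry maps an area
# (a lat/lon bounding box) to the identifier of a stored photographic image.
CONVERSION_TABLE = [
    {"area": ((35.0, 139.0), (36.0, 140.0)), "identifier": "tokyo-001"},
    {"area": ((34.0, 135.0), (35.0, 136.0)), "identifier": "osaka-001"},
]

def identifier_for_position(position, table=CONVERSION_TABLE):
    """Return the identifier whose area contains the given centre position."""
    lat, lon = position
    for row in table:
        (lat_min, lon_min), (lat_max, lon_max) = row["area"]
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            return row["identifier"]
    return None

def search_by_identifier(storage, identifier):
    """Return stored images whose accompanying information carries the identifier."""
    return [img for img in storage if img.get("identifier") == identifier]
```

The first step converts the terminal's search condition into a form matching the accompanying information; the second performs the actual retrieval from the second content storage unit 205.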
• when a plurality of photographic images corresponding to the search condition exist in the second content storage unit 205, all of these photographic images may be transmitted from the server device 201 to the terminal device 101, or an upper limit on the number of photographic images may be set on the server device 201 side and no more than that number transmitted to the terminal device 101.
  • the place name corresponding to the search condition may be searched on the server apparatus 201 side, and the place name may be added to the accompanying information of the photograph image, and then the photograph image may be transmitted to the terminal apparatus 101.
• as the search condition, a place name containing the center position may be set instead of the center position of the shooting positions of the photographic images of the content group.
  • a data table in which an area including a large number of positions and a place name are associated is provided on the terminal device 101 side, and a place name corresponding to the area including the center position is searched from the data table by the search condition generation unit 106.
  • the location name is transmitted from the terminal device 101 to the server device 201 as a search condition.
  • the search unit 203 searches the place name of the search condition from the accompanying information of each photographic image in the second content storage unit 205, and obtains a photographic image including the place name in the accompanying information.
  • the photographic image is transmitted to the terminal device 101.
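The place-name variant can be sketched in the same way; the terminal-side data table and the place names below are hypothetical examples.

```python
# Hypothetical terminal-side data table associating areas with place names.
PLACE_TABLE = [
    {"area": ((35.0, 139.0), (36.0, 140.0)), "name": "Tokyo"},
    {"area": ((34.0, 135.0), (35.0, 136.0)), "name": "Osaka"},
]

def place_name_for(position, table=PLACE_TABLE):
    """Terminal side: find the place name of the area containing the
    centre position; this name is then sent as the search condition."""
    lat, lon = position
    for row in table:
        (lat_min, lon_min), (lat_max, lon_max) = row["area"]
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            return row["name"]
    return None

def search_by_place_name(storage, name):
    """Server side: return images whose accompanying information
    contains the place name."""
    return [img for img in storage if img.get("place") == name]
```

Compared with the centre-position variant, the conversion from position to a searchable key happens on the terminal device 101 rather than in the server's conversion table storage unit 204.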
  • the search condition generation unit 106 extracts all the shooting positions from the accompanying information of each photographic image of the content group, and generates a list of the shooting positions of each photographic image as shown in FIG. This list is transmitted to the server apparatus 201 as a search condition.
  • the shooting position of each photographic image in the list is compared with the shooting position of the accompanying information of each photographic image in the second content storage unit 205, and the photographic image at the shooting position that matches the list side is The photographic image stored in the second content storage unit 205 is searched for and acquired, and the acquired photographic image is transmitted to the terminal device 101.
• alternatively, photographic images whose shooting positions fall within an area centered on each shooting position in the list may be retrieved from the second content storage unit 205; photographic images whose shooting positions match the list exactly may then be distinguished from those merely within the area when they are transmitted to the terminal device 101, and the two kinds of photographic images may be displayed separately on the terminal device 101 side.
• for example, a match tag <match> indicating whether the shooting positions match exactly is added as accompanying information, and such a distinguished display is performed based on the match tag <match>.
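The exact-versus-area matching with a match tag can be sketched as follows; the area test (a simple distance check) and its 5 km radius are assumptions, as is the dictionary representation of the <match> tag.

```python
import math

def _sep_km(a, b):
    # Equirectangular approximation (assumption; the text fixes no formula).
    la1, lo1, la2, lo2 = map(math.radians, (*a, *b))
    x = (lo2 - lo1) * math.cos((la1 + la2) / 2)
    return 6371.0 * math.hypot(x, la2 - la1)

def match_positions(search_list, storage, area_km=5.0):
    """Compare each shooting position in the search-condition list with
    the accompanying information of the stored images: exact coincidences
    are tagged match='exact', images merely inside the surrounding area
    match='area' (standing in for the <match> tag)."""
    results = []
    for img in storage:
        if any(img["gps"] == pos for pos in search_list):
            results.append({**img, "match": "exact"})
        elif any(_sep_km(img["gps"], pos) < area_km for pos in search_list):
            results.append({**img, "match": "area"})
    return results
```

On the terminal device 101 side, the `match` value would drive the distinguished display of the two kinds of photographic images.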
• on the terminal device 101 side, a photographic image retrieved from those stored in the second content storage unit 205 is received by the communication unit 107, and the received photographic image is input to the content management unit 103.
• the content management unit 103 outputs the input photographic image from the second content storage unit 205, together with the photographic images of the content group previously classified by the content classification unit 105, that is, the photographic images in the first content storage unit 104, to the display generation unit 108.
  • the display generation unit 108 sets the display order and display layout of these photographic images, and displays these photographic images on the screen of the display unit 109.
• for example, the photographic image P1 in the second content storage unit 205 is displayed on the screen of the display unit 109, and then, as shown in FIG. 8B, the photographic images P2 and P3 in the first content storage unit 104 are sequentially displayed.
• at this time, the place name 11 of the search condition may be displayed together with the photographic image P1 in the second content storage unit 205, or a mark 12 indicating that the image is from the second content storage unit 205 of the server device 201, or the separation distance 13 of the shooting position of each photographic image, may be displayed. This makes it easy to distinguish the photographic images taken by the user from those provided by the photographic service company.
  • the photographic image P11 in the second content storage unit 205 and the photographic image P12 in the first content storage unit 104 may be laid out and displayed on the screen of the display unit 109.
• since the photographic images acquired from the first content storage unit 104 and the second content storage unit 205 are selected or searched for automatically, they are not necessarily preferable for the user and may not suit the user's intention. For this reason, an unintended photographic image can be selected by an input operation of the input unit 102 and deleted from the screen of the display unit 109.
• for example, when the photographic image P21 is displayed on the screen of the display unit 109, the user selects the photographic image P21 on the screen by an input operation of the input unit 102 and instructs its deletion. In response, the content management unit 103 deletes the selected photographic image from the photographic images acquired from the first content storage unit 104 and the second content storage unit 205.
  • information such as a URL for accessing the work purchase screen of the photographic image provider may be included.
• in this case, a button B1 or the like for starting a browser is displayed on the screen of the display unit 109; when the button B1 on the screen is operated by an input operation of the input unit 102, the content management unit 103 activates the browser or the like, calls the work purchase screen corresponding to the URL via the Internet, and displays a work purchase screen as shown in FIG. 11B on the screen of the display unit 109. This makes it possible to widely introduce and sell the photographic images taken by the photographic image provider.
• more specifically, when the button B1 on the screen is operated by an input operation of the input unit 102 (step S701 in FIG. 12A), the content management unit 103 responds by activating the browser or the like, creating a request message including information such as the URL for accessing the work purchase screen, transmitting this request message to the server device 201 through the network N (step S702 in FIG. 12A), and waiting for a response from the server device 201 (step S703 in FIG. 12A).
• upon receiving the request message ("yes" in step S721 of FIG. 12B), the server device 201 analyzes it and extracts the information, such as the URL, included in the message (step S722 in FIG. 12B); using this information, it collects the content necessary for the work purchase screen, creates a response message including that content (step S723 in FIG. 12B), and returns the response message to the terminal device 101 through the network N (step S724 in FIG. 12B).
• when the terminal device 101 receives the response message (step S704 in FIG. 12A), it analyzes the message and extracts the content necessary for the work purchase screen included in it (step S705 in FIG. 12A), creates the work purchase screen using this content and the like (step S706 in FIG. 12A), and displays the work purchase screen on the screen of the display unit 109 (step S707 in FIG. 12A).
  • the request message may be simply transmitted according to a protocol such as HTTP without activating the browser. Also in this case, it is possible to receive the work purchase screen or the content necessary for the work purchase screen as a response to this request message.
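Sending the request message directly over HTTP, without starting a browser, can be sketched as follows; the URL, the JSON shape of the response, and the key `"purchase_screen"` are assumptions for illustration only.

```python
import json
import urllib.request

def build_request(url):
    """Create the request message (step S702) for the work purchase screen."""
    return urllib.request.Request(url, headers={"Accept": "application/json"})

def parse_response(body):
    """Extract the content needed for the work purchase screen (step S705).
    The JSON response shape is an assumption for illustration."""
    message = json.loads(body)
    return message.get("purchase_screen", {})

def fetch_purchase_screen(url, timeout=10):
    """Steps S702-S705 without a browser: send the request over plain
    HTTP and parse the returned response message."""
    with urllib.request.urlopen(build_request(url), timeout=timeout) as resp:
        return parse_response(resp.read().decode("utf-8"))
```

The parsed content would then be used to build and display the work purchase screen (steps S706 and S707); any server on the Internet could answer such a request, in line with the note below.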
• the server device that receives and responds to the request message from the terminal device 101 is not limited to the server device 201 and may be any server device on the Internet.
• as described above, in the present embodiment, when content in the first content storage unit 104 of the terminal device 101 is output, other content related to it can be retrieved from the content stored in the second content storage unit 205 of the server device 201 and output together on the terminal device 101 side. For example, when photographic images of a trip taken by the user are stored in the first content storage unit 104 and photographic images provided by a photo service company are stored in the second content storage unit 205, photographic images related to those in the first content storage unit 104 are selected from the contents stored in the second content storage unit 205 without the user having to select them, and both the photographic images taken by the user and those of the photo service company are displayed and output, making it possible to present a high-quality photographic display or slide show.
  • Since information such as the place name and the URL for accessing the work purchase screen of the photographic image provider is set as accompanying information, the place name of the shooting position of the photographic image can be displayed, and the work purchase screen can be called up quickly to promote the purchase of works.
  • When audio information such as BGM is stored in the second content storage unit 205 in addition to photographic images, such contents can also be searched for and transmitted from the server device 201 to the terminal device 101, so that the terminal device 101 can reproduce the BGM or the like as audio when displaying a photographic image or a slide show.
  • Information such as a URL for accessing the work purchase screen of the provider of the BGM or the like is attached to the audio information as accompanying information, and as shown in FIG., a button B2 or the like for starting the browser is displayed on the screen of the display unit 109.
  • When this button B2 is operated, the content management unit 103 may activate a browser or the like and call up the work purchase screen corresponding to the URL through the Internet.
  • Artists such as semi-professionals and independent artists have lower public recognition of their music than professional artists, and can therefore offer a wide range of works and advertising in cooperation with such a photographic image providing service.
  • the system of this embodiment can be applied to various other information services.
  • For example, the system of this embodiment can be used for EC (electronic commerce) services.
  • The user's purchased product images and their accompanying information are stored in the first content storage unit 104, and the content classification unit 105 classifies the product images into one or a plurality of content groups based on the accompanying information.
  • The search condition generation unit 106 generates a search condition from the accompanying information of the product images of each content group and transmits this search condition to the server device 201.
  • In the server device 201, product images corresponding to the search condition are searched for among the product images stored in the second content storage unit 205, and the retrieved product images are transmitted to the terminal device 101.
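The condition-then-search round trip could be sketched as follows. The `category` field and the shape of the condition dictionary are illustrative assumptions; the patent does not fix the form of the search condition.

```python
# Client side: derive a search condition from a content group's
# accompanying information (the "category" field is hypothetical).
def make_condition(group):
    return {"categories": sorted({item["category"] for item in group})}

# Server side: filter the second content storage unit with that condition.
def search_products(products, condition):
    return [p for p in products if p["category"] in condition["categories"]]

purchased_group = [{"name": "lens", "category": "camera"},
                   {"name": "tripod", "category": "camera"}]
catalog = [{"name": "flash", "category": "camera"},
           {"name": "kettle", "category": "kitchen"}]

cond = make_condition(purchased_group)
recommended = search_products(catalog, cond)
```

Here `recommended` holds only the catalogue items sharing a category with the purchased group, which is the kind of result the server would send back for display as recommended products.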
  • In the terminal device 101, as shown in FIG., the product images in the first content storage unit 104 are first displayed on the screen of the display unit 109 as purchased products, and then, as shown in FIG. 14(b), product images recommended from among the product images stored in the second content storage unit 205 are displayed as recommended products. Alternatively, as shown in FIG. 15, the purchased product images and the recommended product images are laid out together and displayed on the screen of the display unit 109.
  • FIG. 16 is a block diagram showing a modification of the terminal device 101 of FIG.
  • A first content storage unit 111 that stores the user's personal content is provided in a server storage or the like on the network N.
  • the terminal device 101 is provided with a first content acquisition unit 112 for accessing the first content storage unit 111 on the network N.
  • a second content storage unit 113 that stores content that can be used by an unspecified number of people is also provided in a server storage or the like on the network N.
  • the first content acquisition unit 112 of the terminal device 101 accesses the first content storage unit 111 on the network N via the communication unit 107 and reads and acquires the content from the first content storage unit 111. Thereafter, the same processing as in the system of FIG. 1 is performed to classify the content in the first content storage unit 111 into one or a plurality of content groups, and a search condition is generated for each content group.
  • Based on the search conditions, the content in the second content storage unit 113 is searched through the network N and taken into the terminal device 101, and the content in the first content storage unit 111 and the content in the second content storage unit 113 are output together.
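The classification step that precedes this search can be sketched as grouping contents by a key taken from their accompanying information. Grouping by shooting date is only one plausible choice; the key and field names below are assumptions for illustration.

```python
from collections import defaultdict

# Sketch of the content classification unit's job: group contents by an
# accompanying-info key (here the date), then derive one search
# condition per content group.

def classify_by_date(contents):
    groups = defaultdict(list)
    for c in contents:
        groups[c["date"]].append(c)
    return list(groups.values())

def condition_for(group):
    # One condition per group, reusing the group's shared date.
    return {"date": group[0]["date"]}

contents = [{"id": "p1", "date": "2009-04-26"},
            {"id": "p2", "date": "2009-04-26"},
            {"id": "p3", "date": "2009-04-27"}]
groups = classify_by_date(contents)
conditions = [condition_for(g) for g in groups]
```

Each entry of `conditions` would then be sent over the network N to query the second content storage unit for that group's related content.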
  • FIG. 17 shows a modification of the content output system of FIG.
  • In this modification, the server device 201 is provided with the first content storage unit 104, and the terminal device 101 is provided with the first content acquisition unit 112 for accessing the first content storage unit 104 through the network N.
  • the first content acquisition unit 112 of the terminal device 101 accesses the first content storage unit 104 of the server device 201 via the communication unit 107 and reads and acquires the content from the first content storage unit 104. Thereafter, the same processing as in the system of FIG. 1 is performed to classify the content in the first content storage unit 104 into one or a plurality of content groups, and a search condition is generated for each content group.
  • Based on the search conditions, the content in the second content storage unit 205 of the server device 201 is searched and taken into the terminal device 101, and the content in the first content storage unit 104 and the content in the second content storage unit 205 are output together.
  • FIG. 18 shows another modification of the content output system of FIG.
  • the server device 201 is provided with a first content storage unit 104, a content management unit 103, a content classification unit 105, and a search condition generation unit 106.
  • In the terminal device 101, the control unit 115 transmits a content classification instruction from the communication unit 107 to the server device 201 through the network N.
  • In the server device 201, the content classification instruction is received by the communication unit 202 and input to the content management unit 103. Thereafter, the same processing as in the system of FIG. 1 is performed: the contents in the first content storage unit 111 are classified into content groups, search conditions are generated, and the second content storage unit 113 is searched for content based on the search conditions. Then, for each content group, the content in the first content storage unit 111 and the content in the second content storage unit 113 are read out and returned to the terminal device 101 through the network N.
  • the terminal device 101 receives a plurality of contents for each content group and displays them on the screen of the display unit 109.
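This server-side flow can be sketched end to end as follows. The field names (`date`, `own`, `related`) and the per-date grouping are illustrative assumptions, not the patent's data model.

```python
# End-to-end sketch of the server-side flow: classify the user's
# contents into groups, find related public contents per group, and
# return paired groups to the terminal.

def serve_classification(user_contents, public_contents):
    groups = {}
    for c in user_contents:
        groups.setdefault(c["date"], []).append(c)
    response = []
    for day, group in groups.items():
        # Related public contents share the group's date (assumed rule).
        related = [p for p in public_contents if p["date"] == day]
        response.append({"own": group, "related": related})
    return response

user_contents = [{"id": "u1", "date": "2009-04-26"},
                 {"id": "u2", "date": "2009-04-27"}]
public_contents = [{"id": "s1", "date": "2009-04-26"}]
response = serve_classification(user_contents, public_contents)
```

Each element of `response` corresponds to one content group; the terminal would display the `own` and `related` contents of each group together.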
  • FIG. 19 is a block diagram showing an embodiment of the content output apparatus of the present invention.
  • the content output device 121 of this embodiment is configured by adding a second content storage unit 205 and a conversion table storage unit 204 to the terminal device 101 of FIG.
  • the second content storage unit 205 stores a large number of contents. These contents are collected through the network N from other terminal devices and server devices.
  • the conversion table storage unit 204 performs the same function as the conversion table storage unit 204 in the server apparatus 201 of FIG.
  • Since such a content output device 121 includes the second content storage unit 205 and the conversion table storage unit 204, there is no need to access an external server device, unlike the terminal device 101 of FIG. 1. When outputting content in the first content storage unit 104, it can search for other content related to this content from among the contents stored in the second content storage unit 205 and output them together.
  • Not only photographic images and audio information but also contents such as graphics, other still images, and moving images can be handled in the present invention in the same manner as photographic images and audio information.
  • In the embodiments above, the content is a still image such as a photographic image, the position information is the shooting position of the photographic image, and the date/time information is the shooting date/time; however, the present invention is not limited to this.
  • the content may be a moving image, music, audio information, etc. in addition to a still image such as a photographic image.
  • In that case, the output of the content is the display of the still image or moving image, or the reproduction of the music or audio.
  • When the content is a still image or a moving image, the position information of the content may indicate, for example, the position where the still image such as a photographic image or the moving image was captured, and the date/time information may indicate the date and time of capture. In such a case, contents whose capture positions or capture dates/times are related to each other are output.
  • When the content is music or audio information, the position information of the content may indicate, for example, the recording position of the music or audio information, and the date/time information may indicate the recording date/time or delivery date/time. In such a case, contents whose recording positions, recording dates/times, or delivery dates/times are related to each other are output.
  • The accompanying information described above may be any information attached to the content and is not limited to date information and position information.
  • the accompanying information may be information indicating both date information and position information, or may be information indicating only one of date information and position information.
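A small sketch of such accompanying information, in which the date and the position are each optional, reflecting the statement that either one alone may be present (the class, field names, and matching rule are illustrative assumptions):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical accompanying-information record: date and position are
# both optional, so an item may carry either one, both, or neither.

@dataclass
class AccompanyingInfo:
    date: Optional[str] = None
    position: Optional[Tuple[float, float]] = None

    def matches(self, other: "AccompanyingInfo") -> bool:
        # Two items relate if every field present on BOTH sides agrees;
        # a field missing on either side imposes no constraint.
        if self.date and other.date and self.date != other.date:
            return False
        if self.position and other.position and self.position != other.position:
            return False
        return True

a = AccompanyingInfo(date="2009-04-27")
b = AccompanyingInfo(date="2009-04-27", position=(35.0, 135.0))
c = AccompanyingInfo(date="2009-04-28")
```

Under this rule, `a` matches `b` (the dates agree and `a` has no position to compare) but not `c`.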
  • It is not necessary for the first and second content storage units to each be a single unit; a plurality of each can be provided, and a plurality of types of memory devices may be used in combination.
  • For example, a plurality of first content storage units can be provided in one or both of the terminal device and the server device, or distributed on the network, and a plurality of second content storage units can likewise be provided.
  • a TV image signal may be output from the terminal device 101 to the TV, and the display content similar to the screen of the display unit 109 may be displayed on the TV screen.
  • the function of the terminal device 101 can be incorporated in the TV set. In this case, content can be viewed and EC service can be received in the same manner as viewing TV programs.
  • an image signal for another type of display device may be output, and display content similar to the screen of the display unit 109 may be displayed on the screen of the other type of display device.
  • Other types of display devices include portable terminals.
  • The present invention is not limited to a content output device or a content output system, and also includes a content output method, a content output program for causing a computer to execute each step of the content output method, and a recording medium storing the content output program.
  • the computer may be any device that can execute the program.
  • The computer can implement the present invention by reading the program from a recording medium or receiving it through a communication network, and then executing it.
  • A plurality of processes can also be distributed to a plurality of terminals, so the program can be applied not only to a single terminal such as a computer but also to a system.
  • the present invention can be applied to a personal computer or the like that displays or reproduces content composed of images, sounds, and the like.


Abstract

The invention relates to a content output system for efficiently retrieving useful contents from among a large quantity of contents stored in a server device or the like on a network and making use of the retrieved contents. Specifically, in this content output system, information communication is performed between a terminal device (101) and a server device (201) via the network. When content stored in a first content storage unit (104) of the terminal device (101) is output, other contents related to that content are retrieved from among the contents stored in a second content storage unit (205) of the server device (201), and the retrieved contents are output together on the terminal device (101) side. This eliminates the user operations of browsing the large quantity of contents stored in the second content storage unit (205) of the server device (201) and selecting contents.
PCT/JP2010/057464 2009-04-27 2010-04-27 Content output system WO2010126042A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009107232A JP2010257266A (ja) Content output system, server device, content output device, content output method, content output program, and recording medium storing content output program
JP2009-107232 2009-04-27

Publications (1)

Publication Number Publication Date
WO2010126042A1 true WO2010126042A1 (fr) 2010-11-04

Family

ID=43032186

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/057464 WO2010126042A1 (fr) 2009-04-27 2010-04-27 Système de sortie de contenu

Country Status (2)

Country Link
JP (1) JP2010257266A (fr)
WO (1) WO2010126042A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2011202609B2 (en) * 2011-05-24 2013-05-16 Canon Kabushiki Kaisha Image clustering method
JP6168882B2 (ja) * 2013-07-04 2017-07-26 Canon Inc. Display control device, control method therefor, and control program
JP2016031439A (ja) * 2014-07-28 2016-03-07 Sony Corp Information processing device, information processing method, computer program, and image display system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004222056A * 2003-01-16 2004-08-05 Fuji Photo Film Co Ltd Image storage method, apparatus, and program
JP2007034403A * 2005-07-22 2007-02-08 Nikon Corp Image display device and image display program
JP2008102790A * 2006-10-19 2008-05-01 Kddi Corp Search system
JP2009516951A * 2005-11-21 2009-04-23 Koninklijke Philips Electronics N.V. System and method for using content features and metadata of digital images to find related audio accompaniment
JP2009086727A * 2007-09-27 2009-04-23 Fujifilm Corp Image display device and program


Also Published As

Publication number Publication date
JP2010257266A (ja) 2010-11-11

Similar Documents

Publication Publication Date Title
US8196212B2 (en) Personal information management device
US8538968B2 (en) Saving device for image sharing, image sharing system, and image sharing method
US8294787B2 (en) Display device having album display function
JP3824137B2 (ja) Data reproduction method, data reproduction device, program, and recording medium therefor
CN103124968B (zh) 用于后仰式娱乐的内容转换
US20080028294A1 (en) Method and system for managing and maintaining multimedia content
WO2009081936A1 (fr) Advertisement management system, advertisement management server, advertisement management method, program, and browsing client
US20040174443A1 (en) System and method for storing of records in a database
JP2007052788A (ja) Method and system for linking digital photographs to electronic documents
JP2007517311A (ja) Website for publishing and selling images
US8719329B2 (en) Imaging device, imaging system, image management server, image communication system, imaging method, and image management method
US20100228751A1 (en) Method and system for retrieving ucc image based on region of interest
JP2007047959A (ja) Information editing/display device, information editing/display method, server, information processing system, and information editing/display program
JP2009217828A (ja) Image retrieval device
WO2010126042A1 (fr) Content output system
JP2005196615A (ja) Information processing system and information processing method
KR101831663B1 (ko) Method for displaying ecotourism content on a smart terminal
US20150039643A1 (en) System for storing and searching image files, and cloud server
TWM564225U (zh) Image information sharing system
JP2021005390A (ja) Content management device and control method
JP2007104326A (ja) Content creation device and content creation method
JP2002132825A (ja) Image retrieval system, image retrieval method, image retrieval program, computer-readable storage medium recording the image retrieval program, and image retrieval device
JP2005196613A (ja) Information processing device, information processing method, information processing system, recording medium, and program
JP2010079421A (ja) Aggregated content generation device, aggregated content generation program, aggregated content generation method, and system
JP4561358B2 (ja) Electronic album creation device and electronic album creation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10769740

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10769740

Country of ref document: EP

Kind code of ref document: A1