US8316395B2 - Information processing apparatus and method, and program - Google Patents

Information processing apparatus and method, and program

Info

Publication number
US8316395B2
Authority
US
United States
Prior art keywords
content metadata
content
text data
category
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US11/969,468
Other versions
US20080168499A1 (en)
Inventor
Tatsuo Kuroiwa
Masachika Sasaki
Satoshi Fujimura
Hisashi Hosaka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FUJIMURA, SATOSHI, HOSAKA, HISASHI, KUROIWA, TATSUO, SASAKI, MASACHIKA
Publication of US20080168499A1 publication Critical patent/US20080168499A1/en
Application granted granted Critical
Publication of US8316395B2 publication Critical patent/US8316395B2/en
Legal status: Active (expiration adjusted)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N 21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N 21/4622 Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/4722 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4782 Web browsing, e.g. WebTV
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/482 End-user interface for program selection
    • H04N 21/4828 End-user interface for program selection for searching program descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N 21/84 Generation or processing of descriptive data, e.g. content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/85 Assembly of content; Generation of multimedia applications
    • H04N 21/858 Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
    • H04N 21/8586 Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot by using a URL
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440263 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81 Monomedia components thereof
    • H04N 21/8146 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N 21/8153 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image

Definitions

  • the present invention contains subject matter related to Japanese Patent Application JP 2007-000345 filed in the Japanese Patent Office on Jan. 5, 2007, the entire contents of which are incorporated herein by reference.
  • The present invention relates to an information processing apparatus and method, and a program. More particularly, the present invention relates to an information processing apparatus and method capable of changing the related information that is displayed on the basis of the category of the metadata of content, and to a program.
  • An EPG is distributed at predetermined time intervals in a state in which it is contained in a broadcast wave, is received by a television receiver or the like, and is displayed as an electronic program guide separately from display of a program.
  • The object desired to be searched for differs depending on the category of the EPG item. That is, in the case of, for example, persons' names contained in the category of performers among the items represented by the EPG, information on individual performers may be satisfactorily obtained by performing a search using the performer's name as a keyword, as in the known case.
  • However, when it is desired to purchase CDs (Compact Discs), DVDs (Digital Versatile Discs), and the like by using a BGM title or an artist name, it is not possible to directly purchase a CD or DVD having that BGM title or artist name on the basis of the information obtained by searching with the BGM title or the artist name as a keyword.
  • the present invention has been made in view of such circumstances.
  • an information processing apparatus including: content metadata obtaining means for obtaining content metadata; determination means for determining the category of the content metadata; editing means for editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined by the determination means, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproduction means for reproducing the text data of the content metadata in synchronization with the reproduction of the content; and display means for displaying the information on the link destination on the basis of the content metadata edited by the editing means when the predetermined operation is performed on the text data of the content metadata that is reproduced by the reproduction means.
  • the editing means may access a search engine on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that shows a search result of the text data of the content metadata is displayed.
  • the editing means may access a sales site of a commodity related to the text data in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that the sales site is displayed.
  • the editing means may display a screen for inputting text data to be searched for by a search engine in a state in which the text data has been input in advance in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, may access the search engine after the input of the text data is completed, and may edit the content metadata so that a homepage that shows a search result of the text data is displayed.
  • the editing means may access a server of the predetermined address in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that exists at the predetermined address is displayed.
  • the editing means may display a selection screen for selecting either accessing a server indicated by the address and displaying a homepage of the address or accessing a search engine and displaying a homepage that shows a search result of the text data of the content metadata, and may edit the content metadata so that a homepage of the selected content is displayed.
  • an information processing method including the steps of: obtaining content metadata; determining the category of the content metadata; editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined in the determination, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproducing the text data of the content metadata in synchronization with the reproduction of the content; and displaying the information on the link destination on the basis of the content metadata edited in the editing when the predetermined operation is performed on the text data of the content metadata that is reproduced in the reproduction.
  • a program for enabling a computer to execute a method including the steps of: obtaining content metadata; determining the category of the content metadata; editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined in the determination, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproducing the text data of the content metadata in synchronization with the reproduction of the content; and displaying the information on the link destination on the basis of the content metadata edited in the editing when the predetermined operation is performed on the text data of the content metadata that is reproduced in the reproduction.
  • the computer program according to an embodiment of the present invention is stored on the program storage medium according to an embodiment of the present invention.
  • In the information processing apparatus and method, and the computer program according to an embodiment of the present invention, content metadata is obtained, the category of the content metadata is determined, and the content metadata is edited so that, when the text data of the content metadata is reproduced on the basis of the determined category, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; the text data of the content metadata is reproduced in synchronization with the reproduction of the content, and when the predetermined operation is performed on the reproduced text data of the content metadata, the information on the link destination is displayed on the basis of the edited content metadata.
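  • As an illustration only, the overall behavior described above can be sketched in a few lines of Python. The class and function names, the dictionary layout, and the search-engine URL below are hypothetical and are not taken from the patent; this is a minimal sketch of the idea, not the patent's implementation.

```python
# Minimal sketch of the described flow (illustrative names, not from the patent).

SEARCH_ENGINE = "https://search.example.com/search?q="  # placeholder search engine URL

def determine_category(item):
    # In the described apparatus, the category (program title, channel, performer,
    # product name, commercial sponsor, BGM title, artist, ...) accompanies each item.
    return item["category"]

def edit_metadata(item):
    """Attach a link destination to the text data, depending on the category."""
    category = determine_category(item)
    if category in ("program title", "channel", "performer", "BGM title"):
        # Link to a search-engine result page for the text data.
        item["link"] = SEARCH_ENGINE + item["text"]
    elif category in ("product name", "commercial sponsor"):
        # The metadata itself carries a URL; link to it directly.
        item["link"] = item["url"]
    return item

def on_click(item):
    """Called when the reproduced text data is clicked: display the link destination."""
    print("opening", item.get("link", "(no link)"))

# Usage: edit each metadata item once, then resolve clicks while content plays.
metadata = [{"category": "performer", "text": "AAA"}]
for m in map(edit_metadata, metadata):
    on_click(m)
```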
  • the information processing apparatus may be an independent apparatus and may also be a block for performing information processing.
  • FIG. 1 is a block diagram showing an example of the configuration of an embodiment of an image display apparatus to which the present invention is applied;
  • FIG. 2 is a flowchart illustrating a content recording process
  • FIG. 3 is a flowchart illustrating a content metadata storage process
  • FIG. 4 is a flowchart illustrating a content reproduction process
  • FIG. 5 illustrates an example of a window displayed by the content reproduction process
  • FIG. 6 is a flowchart illustrating a content metadata editing process
  • FIG. 7 is a flowchart illustrating a link display process
  • FIG. 8 illustrates an example of a display of a related information display part
  • FIG. 9 illustrates an example of a display of a related information display part
  • FIG. 10 illustrates an example of a selection screen
  • FIG. 11 illustrates an example of a keyword input screen
  • FIG. 12 illustrates an example of the configuration of a personal computer.
  • An information processing apparatus according to an embodiment of the present invention includes: content metadata obtaining means (e.g., an EPG obtaining unit 21 shown in FIG. 1) for obtaining content metadata; determination means (e.g., a category identification unit 51 shown in FIG. 1) for determining the category of the content metadata; editing means (e.g., an extended editing unit 52 shown in FIG. 1) for editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined by the determination means, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproduction means (e.g., a content reproduction unit 29 shown in FIG. 1) for reproducing the text data of the content metadata in synchronization with the reproduction of the content; and display means (e.g., a display unit 30 shown in FIG. 1) for displaying the information on the link destination on the basis of the content metadata edited by the editing means when the predetermined operation is performed on the text data of the content metadata that is reproduced by the reproduction means.
  • the editing means may access a search engine in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that shows a search result of the text data of the content metadata is displayed.
  • the editing means may access a sales site of merchandise related to the text data in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that the sales site is displayed.
  • the editing means may display a screen for inputting text data to be searched for by a search engine in a state in which the text data has been input in advance in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, may access the search engine after the input of the text data is completed, and may edit the content metadata so that a homepage that shows a search result of the text data is displayed.
  • the editing means may access a server of the predetermined address in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that exists at the predetermined address is displayed.
  • the editing means may display a selection screen for selecting either accessing a server indicated by the address and displaying a homepage of the address or accessing a search engine and displaying a homepage that shows a search result of the text data of the content metadata, and may edit the content metadata so that a homepage of the selected content is displayed.
  • An information processing method includes the steps of: obtaining (e.g., step S22 shown in FIG. 3) content metadata; determining (e.g., step S63, S65, S67, S69, S71, S73, S75, or S77 shown in FIG. 3) the category of the content metadata; editing (e.g., step S64, S66, S68, S70, S72, S74, S76, S78, or S80 shown in FIG.
  • FIG. 1 shows an example of the configuration of an embodiment of an image display apparatus to which the present invention is applied.
  • An image display apparatus 1 is, for example, a television receiver.
  • the image display apparatus 1 is operated using a remote controller 2 , and receives and displays content distributed in the form of a broadcast wave from a broadcast station by using an antenna 4 and also, records the content. Furthermore, the image display apparatus 1 obtains and displays content distributed by a program distribution server 5 via a network 6 typified by the Internet and also, records the content, or reproduces and displays the content.
  • the image display apparatus 1 obtains, as content metadata, not only EPG data contained in the broadcast wave, but also EPG data distributed from an EPG data distribution server 3 via the network 6 . Furthermore, the image display apparatus 1 edits and displays content metadata, such as the obtained EPG, in an extended manner and also, performs a corresponding process when the displayed content metadata is selected from the remote controller 2 or the like, so that, for example, access is made to a search engine server 7 via the network 6 , and the related information of the selected content metadata is obtained and displayed.
  • An EPG obtaining unit 21 obtains an EPG which is contained in a broadcast wave from a broadcast station (not shown) and which is received via the antenna 4, and allows a content metadata storage unit 22 to store the EPG as content metadata. Furthermore, the EPG obtaining unit 21 controls a communication unit 24 formed by a modem and the like in order to access the EPG data distribution server 3 via the network 6, obtains EPG data, and allows the content metadata storage unit 22 to store the EPG data as content metadata.
  • the content metadata is text data in which information on content is described.
  • When the content is a TV (television) program, the content metadata is the title of the program, video recording date and time, broadcast time, a channel of the program (the channel is a broadcast station in the case of a broadcast wave, and is the distribution source company in the case of network distribution), a genre, a subgenre, performers, or the like.
  • When the content is a CM (commercial message), the content metadata includes a product name, a title, a commercial sponsor, a BGM title, an artist, performers, or the like.
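  • Purely as an illustration, the two kinds of content metadata just described might be represented as follows. The field names and dictionary layout are hypothetical; the sample values echo the examples that appear later in this description.

```python
# Hypothetical representations of the two kinds of content metadata described above.
tv_metadata = {
    "title": "AAA",                              # program title
    "recording_datetime": "2006-04-17 21:00",    # video recording date and time
    "broadcast_time": "1h09m",
    "channel": "X ch BBB television",
    "genre": "Variety Show",
    "subgenre": "Others",
    "performers": ["CCC", "DDD", "EEE"],
}

cm_metadata = {
    "product_name": "Tantan label (draft)",      # may carry a URL to a product page
    "title": "Tantan label (draft) picnic version",
    "commercial_sponsor": "Karin beer",          # may carry a URL to the sponsor's HP
    "bgm_title": "Opening march",
    "artist": "XXX",
    "performers": ["ABC", "DEF", "GHI"],
}
```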
  • A content video recorder 25 is controlled using the remote controller 2, sets a tuner 23 to a predetermined channel, receives content distributed as a broadcast wave from a broadcast station (not shown) via the antenna 4, and allows a content data storage unit 26 formed by, for example, an HDD (Hard Disk Drive) or the like to store the content as content data 111. Furthermore, the content video recorder 25 controls the communication unit 24, and allows the content data storage unit 26 to store content distributed by the program distribution server 5 via the network 6 as the content data 111.
  • the content video recorder 25 controls a scene change detector 101 in order to detect the timing at which a scene changes in units of frames, and supplies a detection result to a film roll data creation unit 102 .
  • the film roll data creation unit 102 records the information on the frame numbers in the content data corresponding to the timing of the scene change in film roll data 112 , and stores the film roll data 112 in the content data storage unit 26 in such a manner as to correspond to the content data 111 .
  • An image video recorder 103 allows the content data storage unit 26 to store, as content data, the information obtained as content by the content video recorder 25 .
  • In the content data storage unit 26, content data 111-1 to 111-n is stored, and film roll data 112-1 to 112-n is stored in a corresponding manner. Details of the film roll data 112 will be described later.
  • When the content data 111-1 to 111-n and the film roll data 112-1 to 112-n do not need to be distinguished from each other, these will be referred to simply as content data 111 and film roll data 112, respectively, and other data will be referred to in a similar manner.
  • When content is reproduced, a content metadata extended editing unit 27 reads the corresponding content metadata (EPG data) from the content metadata storage unit 22, extends and edits the content metadata, and stores the content metadata in an extended content metadata storage unit 28. More specifically, the content metadata extended editing unit 27 extends the content metadata, which is formed of text data, and edits it into data to which usable functions are attached, by using a language such as XML (Extensible Markup Language). Then, the edited data is used and displayed as a related information display part 214 (to be described later), as shown in FIGS. 5, 8, and 9.
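  • The patent does not give the exact markup produced by this extension step; the following is only a rough sketch of the idea of wrapping plain text metadata in XML that also records the link destination to open when the text is selected. The element and attribute names are hypothetical.

```python
from xml.sax.saxutils import escape

def extend_to_xml(category, text, link):
    """Wrap plain text metadata in a simple XML element that records the link
    destination to be opened when the displayed text is selected.
    (Hypothetical markup; the description only says a language such as XML is used.)"""
    return '<item category="%s" link="%s">%s</item>' % (
        escape(category, {'"': "&quot;"}),
        escape(link, {'"': "&quot;"}),
        escape(text),
    )

print(extend_to_xml("performer", "ABC", "https://search.example.com/search?q=ABC"))
# -> <item category="performer" link="https://search.example.com/search?q=ABC">ABC</item>
```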
  • a category identification unit 51 of the content metadata extended editing unit 27 identifies a category for each of the described content of the original content metadata read from the content metadata storage unit 22 , and supplies the identification result to an extended editing unit 52 . Furthermore, when a category is to be identified, a same information determination unit 51 a determines whether or not, for example, a plurality of pieces of the same information exist, that is, determines whether or not the same URL (Uniform Resource Locator) is contained although text data differs in the categories of product names and commercial sponsors in CM (commercial message) content, and supplies the determination result to a commercial sponsor editing unit 82 .
  • a TV (television program) content editing unit 61 of the extended editing unit 52 extends and edits the content metadata of the TV content among the content metadata.
  • a program title editing unit 71 of the TV content editing unit 61 allows the search engine server 7 to perform a search by using the text data as a keyword, and extends and edits the content metadata so that an HP (homepage) that shows the search result is displayed.
  • a channel editing unit 72 of the TV content editing unit 61 allows the search engine server 7 to search for the content metadata by using the text data as a keyword, and extends and edits the content metadata so that an HP that shows the search result is displayed.
  • a performer editing unit 73 of the TV content editing unit 61 allows the search engine server 7 to search for the content metadata by using the text data as a keyword, and extends and edits the content metadata so that an HP that shows the search result is displayed.
  • the CM content editing unit 62 of the extended editing unit 52 extends and edits the content metadata of the CM content among the content metadata.
  • a product name editing unit 81 of the CM content editing unit 62 extends and edits the content metadata so that an HP of the address specified by the URL, at which the description of product names and the like are provided, is displayed.
  • content metadata belonging to the category of product names is assumed to be a URL.
  • the content metadata is not limited to a URL, and may be information other than that.
  • a commercial sponsor editing unit 82 of the CM content editing unit 62 extends and edits the content metadata so that the HP of the address specified by the URL at which the description of the commercial sponsor, and the like are provided, is displayed.
  • the content metadata belonging to the category of commercial sponsors is assumed to be a URL.
  • the content metadata is not limited to a URL and may be information other than that.
  • the commercial sponsor editing unit 82 allows the search engine server 7 to search for the content metadata by using the text data as a keyword, and extends and edits the content metadata so that an HP that shows the search result is displayed, thereby providing information that differs from the information displayed when the text data of the category of product names is selected. This is done only to avoid providing the same information even though different items have been selected.
  • the search engine server 7 is made to search for the content metadata by using the text data as a keyword, and the content metadata is extended and edited so that an HP that shows the search result is displayed.
  • a BGM title editing unit 83 of the CM content editing unit 62 allows the search engine server 7 to perform a search by using the text data as a keyword, and extends and edits content metadata so that an HP that shows the search result is displayed.
  • an artist editing unit 84 of the CM content editing unit 62 displays a screen from which a selection can be made as to displaying an HP of a merchandise sales site (sales site) at which CDs, DVDs, and the like of the artist are sold or allowing the search engine server 7 to perform a search by using the text data as a keyword and displaying an HP that shows the search result, and further extends and edits content metadata so that the selected HP is displayed.
  • the selection display editing unit 63 generates information for displaying a selection screen used here as content metadata to be extended.
  • a performer editing unit 85 of the CM content editing unit 62 allows the search engine server 7 to perform a search by using the text data as a keyword, and extends and edits content metadata so that an HP that shows the search result is displayed.
  • a content reproduction unit 29 controls an image reproduction unit 131 in order to read and reproduce the corresponding content data 111 from the content data storage unit 26 , and allows a display unit 30 formed of a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display), or the like to display the content data.
  • the content reproduction unit 29 controls a film roll reproduction unit 132 in order to read the film roll data 112 corresponding to the content data 111 , and displays, on the display unit 30 , a thumbnail image of the image immediately after the scene is changed, which corresponds to a time code recorded in the film roll data 112 in the manner of a film roll in synchronization with the timing at which the content data 111 is sequentially reproduced by the image reproduction unit 131 .
  • An extended content metadata reproduction unit 133 reads extended content metadata from the extended content metadata storage unit 28 , and displays the extended content metadata on the display unit 30 in synchronization with the reproduced content data 111 .
  • a link instruction detector 32 On the basis of a signal such that a light-emission signal supplied from the remote controller 2 is received by a light-receiving unit 33 , a link instruction detector 32 detects whether or not one of the pieces of text displayed as content metadata has been clicked on, and supplies information on the extended content metadata corresponding to the detection result to a link display controller 34 .
  • the link display controller 34 controls a browser controller 141 in order to operate the browser 31 .
  • the browser 31 is controlled by the browser controller 141 , controls the communication unit 24 as necessary, accesses the search engine server 7 via the network 6 in order to perform a search process by using a keyword, and allows the display unit 30 to display an HP that shows the search result.
  • the browser 31 is controlled by the browser controller 141 , controls the communication unit 24 as necessary, accesses a server (not shown) specified by a URL, a predetermined merchandise sales site, or the like via the network 6 , and allows the display unit 30 to display the HP.
  • a selection display controller 142 allows the display unit 30 to display a selection screen for making a selection of the processing results.
  • In step S1, the content video recorder 25 determines whether or not video recording has been instructed on the basis of a signal obtained from the light-receiving unit 33, and repeats the processing until video recording is instructed.
  • In step S1, when, for example, the operation unit 2b is operated by the user to instruct content video recording, the light-emitting unit 2a emits a corresponding signal as a light-emission signal.
  • The light-receiving unit 33 receives the light-emission signal of the light-emitting unit 2a of the remote controller 2, and supplies a signal instructing the start of the video recording corresponding to the light-received signal to the content video recorder 25.
  • The content video recorder 25 then determines that the video recording of the content has been instructed, and the process proceeds to step S2.
  • In step S2, the content video recorder 25 controls the tuner 23 in order to set a channel and obtains the content of the broadcast wave received by the antenna 4 at a predetermined channel.
  • the content is not limited to content distributed in the form of a broadcast wave.
  • the content video recorder 25 also controls the communication unit 24 in order to access the predetermined program distribution server 5 via the network 6 and obtains content that is so-called network-distributed by the program distribution server 5 .
  • In step S3, the content video recorder 25 controls the image video recorder 103 so that image data (containing audio data) of the content is recorded as the content data 111 in the content data storage unit 26.
  • In step S4, the content video recorder 25 controls the scene change detector 101 so as to detect whether or not there is a scene change in units of frames in the obtained image data, and determines whether or not a scene has been changed. More specifically, for example, when the sum of the difference values of the pixel values between the pixels at the same positions in adjacent frames changes greatly, the scene change detector 101 detects a change of the scene.
  • In step S5, the film roll data creation unit 102 converts the image of the frame immediately after a scene change has been detected into a thumbnail image, registers the thumbnail image, together with the time code at the time of reproduction, in the film roll data 112, and stores the updated film roll data 112 in the content data storage unit 26. That is, in the film roll data 112, the thumbnail image of the image at the timing at which the scene change has been detected is sequentially stored together with the time code.
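  • The detection criterion just described (a large change in the summed pixel differences between adjacent frames) and the registration of thumbnails with time codes can be sketched as follows. This is only an illustrative reading of the description: the threshold value, the frame rate, and the function names are assumptions.

```python
import numpy as np

def detect_scene_changes(frames, threshold=1_000_000):
    """Yield (frame_index, frame) for frames that start a new scene.
    A scene change is assumed when the sum of absolute pixel differences
    between adjacent frames exceeds a threshold (value chosen arbitrarily)."""
    previous = None
    for index, frame in enumerate(frames):
        if previous is not None:
            difference = np.abs(frame.astype(np.int32) - previous.astype(np.int32)).sum()
            if difference > threshold:
                yield index, frame
        previous = frame

def build_film_roll(frames, fps=30):
    """Record a thumbnail (here, simply the frame) and a time code for each
    scene change, mirroring the film roll data 112 described above."""
    return [{"time_code": index / fps, "thumbnail": frame}
            for index, frame in detect_scene_changes(frames)]

# Usage with random dummy frames:
frames = [np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8) for _ in range(10)]
film_roll = build_film_roll(frames)
print(len(film_roll), "scene changes detected")
```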
  • In step S6, the content video recorder 25 determines whether or not the completion of the video recording has been instructed or whether or not the distribution of the content has been completed.
  • When neither has occurred, the process returns to step S2. That is, the processing of steps S2 to S6 is repeated from the time at which the video recording of the content is instructed until the completion of the video recording is instructed or the distribution of the content is completed. Then, when the completion of the video recording is instructed or the distribution of the content is completed in step S6, the processing is completed.
  • As a result, in the content data storage unit 26, the content data 111 for which video recording has been instructed and the corresponding film roll data 112 are stored.
  • In step S21, the EPG obtaining unit 21 determines whether or not, for example, a predetermined period of time of approximately 24 hours has passed, and repeats the same process until the predetermined period of time has passed.
  • In step S22, the EPG obtaining unit 21 obtains EPG data distributed in a state in which the EPG data is contained in a broadcast wave from a broadcast station (not shown), which is received via the antenna 4, and stores the EPG data as content metadata in the content metadata storage unit 22.
  • In step S23, the EPG obtaining unit 21 controls the communication unit 24 in order to obtain EPG data from the EPG data distribution server 3 via the network 6, and stores the EPG data as content metadata in the content metadata storage unit 22.
  • As a result, the EPG data distributed in the form of a broadcast wave and the EPG data distributed via the network 6 are stored as content metadata in the content metadata storage unit 22.
  • Note that only one of the EPG data distributed in the form of a broadcast wave and the EPG data distributed via the network 6 needs to be obtained. Therefore, in the processing of the flowchart in FIG. 3, only one of the processes of steps S22 and S23 may be performed.
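  • A compact way to picture this storage process (timer-driven, with the broadcast-wave path and the network path each optional, as noted above) is sketched below. The function names, data shapes, and the 24-hour period handling are placeholders, not APIs from the patent.

```python
import time

def obtain_epg_from_broadcast():
    # Placeholder for step S22: EPG data carried in the broadcast wave.
    return [{"source": "broadcast", "category": "program title", "text": "AAA"}]

def obtain_epg_from_network():
    # Placeholder for step S23: EPG data from the EPG data distribution server 3.
    return [{"source": "network", "category": "performer", "text": "CCC"}]

def content_metadata_storage_process(store, period_seconds=24 * 60 * 60, once=True):
    """Every `period_seconds`, obtain EPG data and store it as content metadata.
    Either source alone would be sufficient, per the note above."""
    while True:
        store.extend(obtain_epg_from_broadcast())   # step S22 (optional)
        store.extend(obtain_epg_from_network())     # step S23 (optional)
        if once:
            break
        time.sleep(period_seconds)

store = []
content_metadata_storage_process(store)
print(len(store), "metadata items stored")
```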
  • In step S41, the content reproduction unit 29 determines whether or not the reproduction of content has been instructed on the basis of a signal obtained by the light-receiving unit 33, and repeats the processing until the reproduction is instructed.
  • In step S41, for example, when the operation unit 2b is operated by the user to instruct the reproduction of the content, the light-emitting unit 2a emits a corresponding signal as a light-emission signal.
  • The light-receiving unit 33 receives the light-emission signal of the light-emitting unit 2a of the remote controller 2, and supplies a signal instructing the start of the reproduction of the content corresponding to the light-received signal to the content reproduction unit 29.
  • The content reproduction unit 29 then determines that the reproduction of the content has been instructed, and the process proceeds to step S42.
  • In step S42, the content reproduction unit 29 controls the image reproduction unit 131 in order to read the content data 111 corresponding to the content for which reproduction has been instructed, and allows the display unit 30 to display the content data 111 in, for example, the image display part 211 of the window shown in FIG. 5.
  • a window 201 is displayed, and within the window 201 , an image display part 211 , a film roll display part 212 , a playlist display part 213 , and a related information display part 214 are provided.
  • An image reproduction unit 131 causes an image of content recorded using the content data 111 to be reproduced in the image display part 211 .
  • The content reproduction unit 29 displays the content name of the content data 111 that is currently being reproduced in the playlist display part 213.
  • “ZZZ” is displayed as the content name that is being reproduced
  • “00:59:57” is displayed as a reproduction period of time.
  • In step S43, the film roll reproduction unit 132 reads the film roll data 112 corresponding to the content data 111 of the content that is being reproduced, and displays a film roll screen on the film roll display part 212.
  • In step S44, on the basis of the film roll data 112, the film roll reproduction unit 132 determines whether or not the current reproduction time is a timing at which the thumbnail image of the image after a scene change should be displayed.
  • When it is such a timing, in step S45, the thumbnail image is displayed on the film roll screen 221 of the film roll display part 212.
  • Otherwise, the process of step S45 is skipped.
  • On the film roll screen 221, thumbnail images 231-1 to 231-3 at the time of a scene change are displayed while being moved from the right to the left in the figure in synchronization with the change of the time code of the image being reproduced.
  • “0:18:33” and “0:18:36” are displayed as time codes of the reproduction time on the film roll screen 221. The position below each displayed time is the reproduction position of the video-recorded data corresponding to that time code, and the current reproduction position is the position indicated by the vertical straight line in the center of the film roll screen 221.
  • the thumbnail image of the scene change image is displayed at the right end in the figure at a timing immediately before the image is displayed on the image display part 211 , and as the image reproduced on the image display part 211 progresses, the image moves to the left.
  • the same image as the image processed as a thumbnail image is displayed in the image display part 211 .
  • Thereafter, the thumbnail image moves toward the left end and, before long, disappears from the left end of the film roll screen 221.
  • Buttons 223 shown in the lower left area in the figure enable the designation of playback start, halt, stop, fast rewinding, forward jump, or backward jump so that a corresponding operation is performed. Furthermore, when a thumbnail image 231 on the film roll screen 221 is designated, the reproduction position jumps to the reproduction position of the corresponding time code, and playback is started.
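  • The right-to-left scrolling described above amounts to mapping each thumbnail's time code to a horizontal position relative to the current reproduction time. The following sketch is only one possible reading; the screen width, scale, and function name are illustrative assumptions.

```python
def thumbnail_positions(film_roll, current_time, screen_width=640,
                        seconds_per_pixel=0.01):
    """Return (x, thumbnail) pairs for thumbnails visible on the film roll screen.
    The current reproduction position sits at the vertical center line; thumbnails
    with later time codes appear to the right and drift left as playback advances.
    (Scale and screen width are arbitrary illustration values.)"""
    center = screen_width / 2
    visible = []
    for entry in film_roll:
        x = center + (entry["time_code"] - current_time) / seconds_per_pixel
        if 0 <= x <= screen_width:
            visible.append((x, entry["thumbnail"]))
    return visible

# Example: a thumbnail 1 second ahead of the current position appears 100 px to the
# right of the center line when seconds_per_pixel is 0.01.
roll = [{"time_code": 1113.0, "thumbnail": "scene-A"}]
print(thumbnail_positions(roll, current_time=1112.0))
```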
  • In step S46, the content metadata extended editing unit 27 performs a content metadata editing process, extends and edits the content metadata, and allows the extended content metadata storage unit 28 to store the content metadata.
  • In step S61, the content metadata extended editing unit 27 reads the content metadata corresponding to the content that is currently being reproduced from the content metadata storage unit 22.
  • In step S62, the category identification unit 51 determines whether or not the content that is currently being reproduced is TV (television) content, that is, whether or not the content is so-called content constituting a program.
  • In step S63, the category identification unit 51 determines whether or not content metadata belonging to the category of program titles exists on the basis of the read content metadata.
  • In step S64, the category identification unit 51 reads the content metadata belonging to the category of program titles and supplies the content metadata to the TV content editing unit 61.
  • the TV content editing unit 61 controls the program title editing unit 71 .
  • the program title editing unit 71 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of program titles as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and stores the content metadata in the extended content metadata storage unit 28 .
  • When it is determined in step S63 that content metadata belonging to the category of program titles does not exist, the process of step S64 is skipped.
  • In step S65, the category identification unit 51 determines whether or not content metadata belonging to the category of channels exists on the basis of the read content metadata.
  • In step S66, the category identification unit 51 reads the content metadata belonging to the category of channels and supplies the content metadata to the TV content editing unit 61.
  • the TV content editing unit 61 controls the channel editing unit 72 .
  • the channel editing unit 72 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of channels as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
  • the TV content editing unit 61 may control the channel editing unit 72 in order to access the URL, and may edit metadata so that an HP of the broadcast station of the channel is displayed.
  • When it is determined in step S65 that the content metadata belonging to the category of channels does not exist, the process of step S66 is skipped.
  • In step S67, the category identification unit 51 determines whether or not content metadata belonging to the category of performers exists on the basis of the read content metadata.
  • In step S68, the category identification unit 51 reads the content metadata belonging to the category of performers, and supplies the content metadata to the TV content editing unit 61.
  • the TV content editing unit 61 controls the performer editing unit 73 .
  • The performer editing unit 73 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of performers as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
  • When it is determined in step S67 that the content metadata belonging to the category of performers does not exist, the process of step S68 is skipped.
  • In step S69, the category identification unit 51 determines whether or not content metadata belonging to the category of product names exists on the basis of the read content metadata.
  • In step S70, the category identification unit 51 reads the content metadata belonging to the category of product names and supplies the content metadata to the CM content editing unit 62.
  • the CM content editing unit 62 controls the product name editing unit 81 .
  • the product name editing unit 81 accesses a corresponding server on the basis of the URL supplied as the text data of the content metadata belonging to the category of product names, edits the description of the content metadata so that an HP that introduces products is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
  • When it is determined in step S69 that the content metadata belonging to the category of product names does not exist, the process of step S70 is skipped.
  • In step S71, the category identification unit 51 determines whether or not content metadata belonging to the category of commercial sponsors exists on the basis of the read content metadata.
  • In step S72, the category identification unit 51 reads the content metadata belonging to the category of commercial sponsors, and supplies the content metadata to the CM content editing unit 62.
  • the CM content editing unit 62 controls the commercial sponsor editing unit 82 .
  • the CM content editing unit 62 accesses the corresponding server on the basis of the URL supplied as the text data of the content metadata belonging to the category of commercial sponsors, edits the description of the content metadata so that an HP that introduces companies that are commercial sponsors is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
  • When it is determined in step S71 that the content metadata belonging to the category of commercial sponsors does not exist, the process of step S72 is skipped.
  • In step S73, the category identification unit 51 determines whether or not content metadata belonging to the category of BGM titles exists on the basis of the read content metadata.
  • In step S74, the category identification unit 51 reads the content metadata belonging to the category of BGM titles and supplies the content metadata to the CM content editing unit 62.
  • the CM content editing unit 62 controls the BGM title editing unit 83 .
  • the BGM title editing unit 83 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of BGM titles as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
  • When it is determined in step S73 that the content metadata belonging to the category of BGM titles does not exist, the process of step S74 is skipped.
  • In step S75, the category identification unit 51 determines whether or not content metadata belonging to the category of artists exists on the basis of the read content metadata.
  • In step S76, the category identification unit 51 reads the content metadata belonging to the category of artists and supplies the content metadata to the CM content editing unit 62.
  • the CM content editing unit 62 controls the artist editing unit 84 .
  • The artist editing unit 84 edits the description of the content metadata so that a selection screen is displayed for choosing between accessing a merchandise sales site where CDs, DVDs, and the like of the artist are sold and displaying the corresponding HP, or allowing the search engine server 7 to perform a search by using the content metadata belonging to the category of artists as a keyword and displaying an HP that shows the search result, and so that the HP selected on the selection screen is displayed.
  • the selection display editing unit 63 creates the above-described selection screen, and the artist editing unit 84 edits the description of the content metadata so that the created selection screen is displayed and allows the extended content metadata storage unit 28 to store the content metadata.
  • When it is determined in step S75 that the content metadata belonging to the category of artists does not exist, the process of step S76 is skipped.
  • In step S77, the category identification unit 51 determines whether or not content metadata belonging to the category of performers exists on the basis of the read content metadata.
  • In step S78, the category identification unit 51 reads the content metadata belonging to the category of performers and supplies the content metadata to the CM content editing unit 62.
  • the CM content editing unit 62 controls the performer editing unit 85 .
  • the performer editing unit 85 displays a screen from which a keyword can be input in a state in which text data of the content metadata belonging to the category of performers has been input in advance, allows, when the keyword input is determined, the search engine server 7 to perform a search by using the determined keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
  • When it is determined in step S77 that the content metadata belonging to the category of performers does not exist, the process of step S78 is skipped.
  • In step S79, the category identification unit 51 controls the same information determination unit 51a so as to determine whether or not content metadata with which the same URL would be displayed when selected exists.
  • When such content metadata exists, the category identification unit 51 supplies the fact that the URL of the product name is the same as the URL of the commercial sponsor to the CM content editing unit 62.
  • the CM content editing unit 62 controls the commercial sponsor editing unit 82 .
  • The CM content editing unit 62 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of commercial sponsors as a keyword, edits again the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata. That is, the same URL would otherwise be displayed for both the product name and the commercial sponsor. Therefore, regarding the information on the commercial sponsor, the URL is not displayed directly; instead, the HP that shows the result of a search in which the text data was used as a keyword is displayed, so that the same information is not displayed when the commercial sponsor is selected.
  • When it is determined in step S79 that content metadata with which the same URL is displayed does not exist, the process of step S80 is skipped.
  • The extended editing unit 52 stores, as it is, other content metadata that is not extended in the extended content metadata storage unit 28. That is, when the content is a TV (television) program, such content metadata is the video recording date and time, broadcast time, a genre, a subgenre, or the like. When the content is a CM (commercial message), content metadata such as the title is not particularly extended here, and is therefore stored as it is in the extended content metadata storage unit 28.
  • In the manner described above, since the content metadata of the content that is being reproduced is converted from simple text data into extended content metadata, when the content metadata is selected, it becomes possible not only to display the content metadata simply as text data, but also to display corresponding extended related information for each category.
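  • The branching of steps S62 through S80 for CM content can be summarized in a single dispatch, sketched below. The representation of the extended metadata, the URLs, and the function names are hypothetical illustrations; the sketch also shows the re-editing performed when the product name and the commercial sponsor would otherwise lead to the same URL.

```python
SEARCH_ENGINE = "https://search.example.com/search?q="   # placeholder search engine
SALES_SITE = "https://shop.example.com/search?artist="   # placeholder sales site

def search_link(text):
    return {"action": "open", "url": SEARCH_ENGINE + text}

def extend_cm_metadata(items):
    """Edit CM content metadata per category; `items` is a list of dicts with
    "category", "text" and, for product names and commercial sponsors, "url".
    (Hypothetical representation of the extended content metadata.)"""
    extended = []
    for item in items:
        category, text = item["category"], item["text"]
        if category in ("product name", "commercial sponsor"):
            link = {"action": "open", "url": item["url"]}        # steps S70 / S72
        elif category == "BGM title":
            link = search_link(text)                             # step S74
        elif category == "artist":
            link = {"action": "select",                          # step S76
                    "choices": [{"label": "sales site", "url": SALES_SITE + text},
                                {"label": "search", **search_link(text)}]}
        elif category == "performer":
            link = {"action": "keyword_input", "prefill": text,  # step S78
                    **search_link(text)}
        else:
            link = None                                          # not extended
        extended.append({**item, "link": link})

    # Steps S79/S80: if the product name and the commercial sponsor would show
    # the same URL, re-edit the commercial sponsor to show a search result instead.
    urls = {e["category"]: e["link"]["url"] for e in extended
            if e["category"] in ("product name", "commercial sponsor")}
    if len(urls) == 2 and len(set(urls.values())) == 1:
        for e in extended:
            if e["category"] == "commercial sponsor":
                e["link"] = search_link(e["text"])
    return extended

cm = [{"category": "product name", "text": "Tantan label", "url": "https://example.com/cm"},
      {"category": "commercial sponsor", "text": "Karin beer", "url": "https://example.com/cm"}]
print(extend_cm_metadata(cm)[1]["link"])   # re-edited to a search link for "Karin beer"
```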
  • In step S47, the extended content metadata reproduction unit 133 accesses the extended content metadata storage unit 28 and determines whether or not content metadata to be displayed exists. For example, as a result of the processing of step S46, when the extended content metadata is stored in the extended content metadata storage unit 28, it is assumed that the content metadata to be displayed exists.
  • In step S48, the extended content metadata reproduction unit 133 reads the extended content metadata stored in the extended content metadata storage unit 28 and displays the extended content metadata on the display unit 30, as shown in the related information display part 214 in FIG. 5.
  • content metadata of CM content is displayed.
  • “Tantan label (draft)” is displayed as content metadata that fits into the category of product names, indicating that the product name introduced by the CM content that is currently being reproduced is “Tantan label (draft)”.
  • the content metadata that fits into the category of titles is displayed as “Tantan label (draft) picnic version”, indicating that the title of the CM content is “Tantan label (draft) picnic version”.
  • Here, only the text data of the content metadata is displayed.
  • When the pointer is moved to the position at which “Tantan label”, which is a product name according to the above-described extended content metadata, “Karin beer”, which is a commercial sponsor, “Opening march”, which is a BGM title, “XXX”, which is an artist, or “ABC”, “DEF”, or “GHI” of the performers is displayed, and that position is selected by the pointer being clicked, a link display process (to be described later) causes the above-described extended display to be performed.
  • When it is determined in step S47 that the content metadata does not exist in the extended content metadata storage unit 28, the process of step S48 is skipped.
  • In step S49, the content reproduction unit 29 determines whether or not the stopping of the reproduction of the content has been instructed. When it is determined that the stopping of the reproduction of the content has not been instructed, the process returns to step S42. That is, as long as the reproduction is continued, the processing of steps S42 to S49 is repeated.
  • Then, when it is determined in step S49 that the stopping of the reproduction has been instructed, the processing is completed.
  • In step S91, the link instruction detector 32 queries the content reproduction unit 29 in order to determine whether or not one of the pieces of content data 111 is currently being read and the content is being reproduced. The process is repeated until a state in which content is being reproduced is reached. For example, when content is being reproduced in step S91 by the process described above with reference to FIG. 4, the process proceeds to step S92. (The per-category branching of this link display process is sketched in code following this list.)
  • In step S92, the link instruction detector 32 queries the content reproduction unit 29 in order to determine whether or not the content that is currently being reproduced is TV content. For example, when the content is TV content, the process proceeds to step S93.
  • In step S93, the link instruction detector 32 determines whether or not the information on the program title has been selected on the basis of a signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 8 and the area where “AAA” is displayed in the topmost area is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the program title has been selected, and the process proceeds to step S94.
  • In step S94, the link instruction detector 32 reads the extended content metadata corresponding to the program title of the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34.
  • The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6, allows the search engine server 7 to perform a search by using the text data “AAA” belonging to the category of program titles of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
  • “AAA” is displayed as the program title, indicating that the program title is “AAA”.
  • The channel is displayed as “XCH BBB television”, indicating that the content that is being reproduced is content broadcast by the BBB television of channel X.
  • Video recording date and time information containing the video recording time “2006/4/17 (Mon.) 21:00-22:09 (1 hour 9 minutes)” is displayed, indicating that the content is content of 1 hour 9 minutes from 21:00 to 22:09 and that the date and time at which the content was recorded is Monday, Apr. 17, 2006.
  • “Variety Show/Others” is displayed, indicating that the genre is “Variety Show/Others”.
  • Below the genre as performers, “CCC”, “DDD”, and “EEE” are displayed, indicating that the performers are “CCC”, “DDD”, and “EEE”.
  • When the program title is not selected in step S93, the process of step S94 is skipped.
  • In step S95, the link instruction detector 32 determines whether or not the information on the channel has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 8 and the area where “BBB television” is displayed in the second area from the top is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the channel has been selected, and the process proceeds to step S96.
  • In step S96, the link instruction detector 32 reads the extended content metadata corresponding to the channel of the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34.
  • The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6, allows the search engine server 7 to perform a search by using the text data “BBB television” belonging to the category of channels of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
  • When the channel is not selected in step S95, the process of step S96 is skipped.
  • In step S97, the link instruction detector 32 determines whether or not the information on the performers has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 8 and the area where “DDD” among “CCC”, “DDD”, and “EEE” is displayed in the lowest area is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the performer has been selected, and the process proceeds to step S98.
  • In step S98, the link instruction detector 32 reads the extended content metadata corresponding to the performer in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34.
  • The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6, allows the search engine server 7 to perform a search by using the text data “DDD” belonging to the category of performers of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
  • When the performer has not been selected in step S97, the process of step S98 is skipped.
  • On the other hand, when it is determined in step S92 that the content that is currently being reproduced is CM content, the process proceeds to step S99.
  • In step S99, the link instruction detector 32 determines whether or not the information on the product name has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 9 and the area where “Lifeguard” is displayed in the topmost area is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the product name has been selected, and the process proceeds to step S100.
  • In step S100, the link instruction detector 32 reads the extended content metadata corresponding to the product name in the selected area and supplies the extended content metadata to the link display controller 34.
  • The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access a server (not shown) corresponding to the URL via the network 6, and displays the HP specified by the URL on another window (not shown) differing from the window 201 on the display unit 30.
  • When the product name has not been selected in step S99, the process of step S100 is skipped.
  • In step S101, the link instruction detector 32 determines whether or not the information on the commercial sponsor has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 9 and the area where “Lifeguard” is displayed in the third area is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the commercial sponsor has been selected, and the process proceeds to step S102.
  • In step S102, the link instruction detector 32 reads the extended content metadata corresponding to the commercial sponsor in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34.
  • The browser controller 141 of the link display controller 34 either starts up the browser 31, controls the communication unit 24 in order to access a server (not shown) corresponding to the URL via the network 6, and displays the HP specified by the URL on another window (not shown) differing from the window 201 on the display unit 30, or starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “Lifeguard” belonging to the category of commercial sponsors of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
  • That is, in step S102, a process that is set by one of the processes of steps S72 and S80 is performed, depending on whether or not the URL specified in the category of product names in the content metadata matches the URL specified for the commercial sponsor.
  • When the commercial sponsor has not been selected in step S101, the process of step S102 is skipped.
  • In step S103, the link instruction detector 32 determines whether or not the information on the BGM title has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 9 and the area where “AAAA” is displayed in the fourth area is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the BGM title has been selected, and the process proceeds to step S104.
  • In step S104, the link instruction detector 32 reads the extended content metadata corresponding to the BGM title in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34.
  • The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “AAAA” belonging to the category of BGM titles of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
  • When the BGM title has not been selected in step S103, the process of step S104 is skipped.
  • In step S105, the link instruction detector 32 determines whether or not the information on the artist has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 9 and the area where “BBBB” is displayed in the fifth area is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the artist has been selected, and the process proceeds to step S106.
  • In step S106, the link instruction detector 32 reads the extended content metadata corresponding to the artist in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34.
  • The selection display controller 142 of the link display controller 34 displays, for example, a selection screen shown in FIG. 10.
  • A selection screen 261 is displayed.
  • “Regarding selected artist” is displayed.
  • “HP of merchandise sales is displayed” is displayed.
  • “HP for search result in which artist name was used as keyword is displayed” is displayed.
  • Check buttons 262-1 and 262-2 used when making a selection are provided to the left of the respective choices.
  • In step S107, on the basis of the signal from the light-receiving unit 33, the link instruction detector 32 determines whether or not “HP of merchandise sales is displayed”, out of “HP of merchandise sales is displayed” and “HP for search result in which artist name was used as keyword is displayed”, has been selected. For example, when the check box 262-1 is checked and a set button 263 is operated, “HP of merchandise sales is displayed” is assumed to have been selected.
  • In step S108, the link instruction detector 32 supplies the selection result to the link display controller 34.
  • The browser controller 141 of the link display controller 34 starts up the browser 31, and controls the communication unit 24 so that the HP for the predetermined merchandise sales site is displayed on another window (not shown) differing from the window 201 on the display unit 30 via the network 6.
  • In step S109, the link instruction detector 32 supplies the selection result to the link display controller 34.
  • The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “BBBB” belonging to the category of artist names of the supplied extended content metadata as a keyword, and displays the HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
  • When the artist has not been selected in step S105, the processing of steps S106 to S109 is skipped.
  • In step S110, the link instruction detector 32 determines whether or not the information on the performer has been selected on the basis of the signal from the light-receiving unit 33.
  • For example, when the area where the performer's name “EEEE” is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the performer has been selected, and the process proceeds to step S111.
  • In step S111, the link instruction detector 32 reads the extended content metadata corresponding to the performer in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34.
  • The link display controller 34 then displays an input screen 281 for a search keyword, as shown in FIG. 11, in which “EEEE”, which is the name of the selected performer, has been input in advance.
  • When the input is confirmed, the browser controller 141 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “EEEE” input via the input screen 281 as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
  • As described above, the information to be displayed can be changed for each category of content metadata. Furthermore, for example, by directly displaying the HP of a URL that has been set in advance, and by displaying a merchandise sales site of CDs and DVDs with regard to an artist or the like, it is possible to quickly provide information having a comparatively high possibility of being demanded. In a similar manner, with regard to other categories, such as BGM titles and product names, a site that is most appropriate for the category, such as a merchandise sales site, may be displayed.
  • Furthermore, a keyword input screen is presented so that the information that the user really desires to search for can be input, making it possible to provide precise information for the request. Also, since the screen is displayed in a state in which a keyword has been input in advance, only the minimum necessary input is required when related information is to be searched for. Therefore, it is possible to save time and effort and to quickly provide the necessary information.
  • In order to perform a keyword search in a site most appropriate for each category, access may be made to a different search engine server for each category, and a search may be performed by using the text data input via the input screen 281 as a keyword.
  • The above-described series of processing can be performed by hardware and also by software.
  • When the series of processing is performed by software, a program constituting the software is installed from a recording medium into a computer that is incorporated in specialized hardware, or into, for example, a general-purpose computer capable of performing various processes by installing various programs.
  • FIG. 12 shows an example of the configuration of a general-purpose personal computer.
  • The personal computer incorporates a CPU (Central Processing Unit) 1001.
  • An input/output interface 1005 is connected to the CPU 1001 via a bus 1004.
  • A ROM (Read Only Memory) 1002 and a RAM (Random Access Memory) 1003 are connected to the bus 1004.
  • An input unit 1006 including input devices composed of a keyboard, a mouse, and the like, via which the user inputs an operation command; an output unit 1007 for outputting a processing operation screen and an image of a processing result to a display unit; a storage unit 1008 including a hard disk drive for storing programs and various kinds of data, and the like; and a communication unit 1009 including a LAN (Local Area Network) adaptor and the like, which performs a communication process via a network typified by the Internet, are connected to the input/output interface 1005 .
  • A drive 1010 for reading and writing data from and to a removable medium 1011 such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini Disc)), or a semiconductor memory, is connected to the input/output interface 1005.
  • The CPU 1001 performs various kinds of processing in accordance with a program stored in the ROM 1002 or a program which is read from the removable medium 1011, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, which is installed into the storage unit 1008, and which is loaded from the storage unit 1008 into the RAM 1003.
  • In the RAM 1003, data necessary for the CPU 1001 to perform various kinds of processing, and the like, are also stored as appropriate.
  • Steps describing a program recorded on a recording medium include not only processes that are performed in a time-series manner according to the written order, but also processes that are performed in parallel or individually although they may not be performed in a time-series manner.
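As noted in the items above (steps S91 to S111), the link display process branches on the category of the selected extended content metadata. Purely as an illustration, the following Python sketch summarizes that branching, including the fallback to a search-result HP when the commercial sponsor carries the same URL as the product name; all function names, URLs, and category labels are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch of the per-category link display dispatch (steps S91 to S111).
from urllib.parse import quote_plus

SEARCH_ENGINE = "http://search.example.com/search?q="  # stands in for the search engine server 7

def search_result_page(keyword):
    """Return the address of an HP that shows a search result for the keyword."""
    return SEARCH_ENGINE + quote_plus(keyword)

def on_metadata_selected(category, text, url=None, same_url_as_product=False,
                         choose_merchandise=None, edited_keyword=None):
    """Decide which HP to display when a piece of extended content metadata is selected."""
    if category in ("program title", "channel", "performer (TV)", "BGM title"):
        return search_result_page(text)               # steps S94, S96, S98, and S104
    if category == "product name":
        return url                                    # step S100: HP specified by the URL
    if category == "commercial sponsor":
        # Step S102: if the sponsor URL duplicates the product-name URL,
        # fall back to a keyword search so the same HP is not shown twice.
        return search_result_page(text) if same_url_as_product else url
    if category == "artist":
        # Steps S106 to S109: selection screen between a merchandise sales
        # site and a search-result HP (FIG. 10).
        return ("http://shop.example.com/artist/" + quote_plus(text)
                if choose_merchandise else search_result_page(text))
    if category == "performer (CM)":
        # Step S111: keyword input screen pre-filled with the performer's name (FIG. 11).
        return search_result_page(edited_keyword or text)
    return None

if __name__ == "__main__":
    print(on_metadata_selected("commercial sponsor", "Lifeguard",
                               url="http://example.com/lifeguard",
                               same_url_as_product=True))
```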

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

An information processing apparatus includes a content metadata obtaining unit configured to obtain content metadata; a determination unit configured to determine the category of the content metadata; an editing unit configured to edit the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined by the determination unit, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; a reproduction unit configured to reproduce the text data of the content metadata in synchronization with the reproduction of the content; and a display unit configured to display the information on the link destination on the basis of the content metadata edited by the editing unit when the predetermined operation is performed on the text data of the content metadata reproduced by the reproduction unit.

Description

CROSS REFERENCES TO RELATED APPLICATIONS
The present invention contains subject matter related to Japanese Patent Application JP 2007-000345 filed in the Japanese Patent Office on Jan. 5, 2007, the entire contents of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an information processing apparatus and method, and a program. More particularly, the present invention relates to an information processing apparatus and method capable of displaying related information in such a manner as to change the related information on the basis of the category of metadata of content, and to a program.
2. Description of the Related Art
Technologies for receiving and displaying content metadata, such as an EPG (Electronic Program Guide), have become widely popular.
An EPG is distributed at predetermined time intervals in a state in which it is contained in a broadcast wave, is received by a television receiver or the like, and is displayed as an electronic program guide separately from display of a program.
In recent years, a technology has been proposed in which text information contained in an EPG is displayed, and also a search process is performed by a search engine via the Internet by using the text information as a keyword and a search result is displayed (see Japanese Unexamined Patent Application Publication No. 2004-23345).
SUMMARY OF THE INVENTION
However, in the case of a search using text information contained in an EPG as a keyword, although related information can be displayed, an object desired to be searched for differs depending on the category of EPG. That is, in the case of, for example, persons' names contained in the category of performers among items represented by the EPG, in order to find information on individual performers, such information may be satisfactorily obtained by performing a search using the performer's name as a keyword as in the known case.
However, when it is desired to purchase CDs (Compact Discs), DVDs (Digital Versatile Discs), and the like, using a BGM title or an artist name, it is not possible to directly purchase such a CD or DVD having the BGM title or the artist name on the basis of the information obtained by searching by using the BGM title and the artist name as keywords.
If a user is to purchase such a product, the user needs to access a sales site of CDs and DVDs and needs to search the sales site by using the BGM title and the artist name in a similar manner again, which is a troublesome process.
The present invention has been made in view of such circumstances. In particular, it is desirable to obtain desired information by changing an item to be searched for in correspondence with the category of content metadata, such as an EPG, and by only clicking on the content metadata.
According to an embodiment of the present invention, there is provided an information processing apparatus including: content metadata obtaining means for obtaining content metadata; determination means for determining the category of the content metadata; editing means for editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined by the determination means, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproduction means for reproducing the text data of the content metadata in synchronization with the reproduction of the content; and display means for displaying the information on the link destination on the basis of the content metadata edited by the editing means when the predetermined operation is performed on the text data of the content metadata that is reproduced by the reproduction means.
In response to the predetermined operation being performed, the editing means may access a search engine on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that shows a search result of the text data of the content metadata is displayed.
The editing means may access a sales site of a commodity related to the text data in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that the sales site is displayed.
The editing means may display a screen for inputting text data to be searched for by a search engine in a state in which the text data has been input in advance in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, may access the search engine after the input of the text data is completed, and may edit the content metadata so that a homepage that shows a search result of the text data is displayed.
When the text data is a predetermined address or contains a predetermined address, the editing means may access a server of the predetermined address in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that exists at the predetermined address is displayed.
When the text data is a predetermined address or contains the predetermined address, in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, the editing means may display a selection screen for selecting either accessing a server indicated by the address and displaying a homepage of the address or accessing a search engine and displaying a homepage that shows a search result of the text data of the content metadata, and may edit the content metadata so that a homepage of the selected content is displayed.
According to another embodiment of the present invention, there is provided an information processing method including the steps of: obtaining content metadata; determining the category of the content metadata; editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined in the determination, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproducing the text data of the content metadata in synchronization with the reproduction of the content; and displaying the information on the link destination on the basis of the content metadata edited in the editing when the predetermined operation is performed on the text data of the content metadata that is reproduced in the reproduction.
According to another embodiment of the present invention, there is provided a program for enabling a computer to execute a method including the steps of: obtaining content metadata; determining the category of the content metadata; editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined in the determination, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproducing the text data of the content metadata in synchronization with the reproduction of the content; and displaying the information on the link destination on the basis of the content metadata edited in the editing when the predetermined operation is performed on the text data of the content metadata that is reproduced in the reproduction.
The computer program according to an embodiment of the present invention is stored on the program storage medium according to an embodiment of the present invention.
In the information processing apparatus and method, and the computer program according to embodiments of the present invention, when obtaining content metadata, determining the category of content metadata, and reproducing the text data of the content metadata on the basis of the determined category of the content metadata, in the case that the content metadata is edited so that information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data, the text data of the content metadata is reproduced in synchronization with the reproduction of the content, and when the predetermined operation is performed on the reproduced text data of the content metadata, the information on the link destination is displayed on the basis of the edited content.
The information processing apparatus according to an embodiment of the present invention may be an independent apparatus and may also be a block for performing information processing.
According to the embodiment of the present invention, it is possible to display related information in such a manner as to change the related information on the basis of the category of the metadata of content.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing an example of the configuration of an embodiment of an image display apparatus to which the present invention is applied;
FIG. 2 is a flowchart illustrating a content recording process;
FIG. 3 is a flowchart illustrating a content metadata storage process;
FIG. 4 is a flowchart illustrating a content reproduction process;
FIG. 5 illustrates an example of a window displayed by the content reproduction process;
FIG. 6 is a flowchart illustrating a content metadata editing process;
FIG. 7 is a flowchart illustrating a link display process;
FIG. 8 illustrates an example of a display of a related information display part;
FIG. 9 illustrates an example of a display of a related information display part;
FIG. 10 illustrates an example of a selection screen;
FIG. 11 illustrates an example of a keyword input screen; and
FIG. 12 illustrates an example of the configuration of a personal computer.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Before describing an embodiment of the present invention, the correspondence between the features of the claims and the specific elements disclosed in an embodiment of the present invention is discussed below. This description is intended to assure that an embodiment supporting the claimed invention is described in this specification. Thus, even if an element in the following embodiment is not described as relating to a certain feature of the present invention, that does not necessarily mean that the element does not relate to that feature of the claims. Conversely, even if an element is described herein as relating to a certain feature of the claims, that does not necessarily mean that the element does not relate to other features of the claims.
Furthermore, this description should not be construed as restricting that all the embodiments of the invention disclosed in the embodiment are described in the claims. That is, the description does not deny the existence of embodiments of the present invention that are described in the embodiment but not claimed in the invention of this application, i.e., the existence of embodiments of the present invention that in future may be claimed by a divisional application, or that may be additionally claimed through amendments.
An information processing apparatus according to an embodiment of the present invention includes: content metadata obtaining means (e.g., an EPG obtaining unit 21 shown in FIG. 1) for obtaining content metadata; determination means (e.g., a category identification unit 51 shown in FIG. 1) for determining the category of the content metadata; editing means (e.g., an extended editing unit 52 shown in FIG. 1) for editing the content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined by the determination means, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproduction means (e.g., a content reproduction unit 29 shown in FIG. 1) for reproducing the text data of the content metadata in synchronization with the reproduction of the content; and display means (e.g., a display unit 30 shown in FIG. 1) for displaying the information on the link destination on the basis of the content metadata edited by the editing means when the predetermined operation is performed on the text data of the content metadata that is reproduced by the reproduction means.
The editing means (e.g., an extended editing unit 52 shown in FIG. 1) may access a search engine in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that shows a search result of the text data of the content metadata is displayed.
The editing means (e.g., an extended editing unit 52 shown in FIG. 1) may access a sales site of a merchandise related to the text data in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that the sales site is displayed.
The editing means (e.g., an extended editing unit 52 shown in FIG. 1) may display a screen for inputting text data to be searched for by a search engine in a state in which the text data has been input in advance in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, may access the search engine after the input of the text data is completed, and may edit the content metadata so that a homepage that shows a search result of the text data is displayed.
When the text data is a predetermined address or contains a predetermined address, the editing means (e.g., an extended editing unit 52 shown in FIG. 1) may access a server of the predetermined address in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, and may edit the content metadata so that a homepage that exists at the predetermined address is displayed.
When the text data is a predetermined address or contains the predetermined address, in response to the predetermined operation being performed on the basis of the category of content metadata determined by the determination means, the editing means (e.g., an extended editing unit 52 shown in FIG. 1) may display a selection screen for selecting either accessing a server indicated by the address and displaying a homepage of the address or accessing a search engine and displaying a homepage that shows a search result of the text data of the content metadata, and may edit the content metadata so that a homepage of the selected content is displayed.
An information processing method according to another embodiment of the present invention includes the steps of: obtaining (e.g., step S22 shown in FIG. 3) content metadata; determining (e.g., step S63, S65, S67, S69, S71, S73, S75, or S77 shown in FIG. 3) the category of the content metadata; editing (e.g., step S64, S66, S68, S70, S72, S74, S76, S78, or S80 shown in FIG. 3) content metadata so that, when text data of the content metadata is to be reproduced on the basis of the category of the content metadata determined in the determination, information on a predetermined link destination is displayed in response to a predetermined operation being performed on the reproduced text data; reproducing (e.g., step S42 shown in FIG. 4) the text data of the content metadata in synchronization with the reproduction of the content; and displaying (e.g., step S94, S96, S98, S100, S102, S104, S106, S108, S109, or S111 shown in FIG. 7) the information on the link destination on the basis of the content metadata edited in the editing when the predetermined operation is performed on the text data of the content metadata that is reproduced in the reproduction.
FIG. 1 shows an example of the configuration of an embodiment of an image display apparatus to which the present invention is applied.
An image display apparatus 1 is, for example, a television receiver. The image display apparatus 1 is operated using a remote controller 2, and receives and displays content distributed in the form of a broadcast wave from a broadcast station by using an antenna 4 and also, records the content. Furthermore, the image display apparatus 1 obtains and displays content distributed by a program distribution server 5 via a network 6 typified by the Internet and also, records the content, or reproduces and displays the content.
In addition, the image display apparatus 1 obtains, as content metadata, not only EPG data contained in the broadcast wave, but also EPG data distributed from an EPG data distribution server 3 via the network 6. Furthermore, the image display apparatus 1 edits and displays content metadata, such as the obtained EPG, in an extended manner and also, performs a corresponding process when the displayed content metadata is selected from the remote controller 2 or the like, so that, for example, access is made to a search engine server 7 via the network 6, and the related information of the selected content metadata is obtained and displayed.
An EPG obtaining unit 21 obtains an EPG which is contained in a broadcast wave from a broadcast station (not shown) and which is distributed from the antenna 4, and allows a content metadata storage unit 22 to store the EPG as content metadata. Furthermore, the EPG obtaining unit 21 controls a communication unit 24 formed by a modem and the like in order to access the EPG data distribution server 3 via the network 6, obtains EPG data, and allows the content metadata storage unit 22 to store the EPG data as content metadata. The content metadata is text data in which information on content is described. For example, when content is a TV (television) program, the content metadata is the title of a program, video recording date and time, broadcast time, a channel (the channel is a broadcast station in the case of a broadcast wave, and is the distribution source company in the case of network distribution) of the program, a genre, a subgenre, performers, or the like. When content is a CM (commercial message), the content metadata includes a product name, a title, a commercial sponsor, a BGM title, an artist, performers, or the like.
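The patent does not prescribe a concrete data structure for the stored content metadata; the following Python sketch merely illustrates one way the TV-program and CM fields listed above could be held, with field names chosen for illustration only.

```python
# Hypothetical sketch of records held by the content metadata storage unit 22;
# field names are illustrative and not defined by the patent.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TVProgramMetadata:
    title: str
    channel: str          # broadcast station or, for network distribution, the distributor
    recording_datetime: str
    broadcast_time: str
    genre: str
    subgenre: str
    performers: List[str] = field(default_factory=list)

@dataclass
class CMMetadata:
    product_name: str     # may carry a URL describing the product
    title: str
    commercial_sponsor: str
    bgm_title: str
    artist: str
    performers: List[str] = field(default_factory=list)

# Example record corresponding to EPG text data for a TV program.
example = TVProgramMetadata(
    title="AAA", channel="XCH BBB television",
    recording_datetime="2006/4/17 21:00", broadcast_time="1h09m",
    genre="Variety Show", subgenre="Others", performers=["CCC", "DDD", "EEE"])
```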
A content video recorder 25 is controlled using the remote controller 2, sets a tuner 23 to a predetermined channel, receives content distributed from a broadcast station (not shown) and distributed as a broadcast wave via the antenna 4, and allows a content data storage unit 26 formed by, for example, an HDD (Hard Disk Drive) or the like to store the content as content data 111. Furthermore, the content video recorder 25 controls the communication unit 24, and allows the content data storage unit 26 to store content distributed by the program distribution server 5 via the network 6 as the content data 111.
When the content video recorder 25 records content, the content video recorder 25 controls a scene change detector 101 in order to detect the timing at which a scene changes in units of frames, and supplies a detection result to a film roll data creation unit 102. The film roll data creation unit 102 records the information on the frame numbers in the content data corresponding to the timing of the scene change in film roll data 112, and stores the film roll data 112 in the content data storage unit 26 in such a manner as to correspond to the content data 111. An image video recorder 103 allows the content data storage unit 26 to store, as content data, the information obtained as content by the content video recorder 25.
Therefore, in the content data storage unit 26, content data 111-1 to 111-n is stored, and film roll data 112-1 to 112-n is stored in a corresponding manner. Details of the film roll data 112 will be described later. When the content data 111-1 to 111-n and the film roll data 112-1 to 112-n do not need to be distinguished from each other, these will be referred to simply as content data 111 and film roll data 112, respectively, and the other data will be referred to in a similar manner.
When the reproduction of one of the pieces of the content data 111 in the content data storage unit 26 is started, a content metadata extended editing unit 27 reads the corresponding content metadata (EPG data) from the content metadata storage unit 22, extends and edits the content metadata, and stores the content metadata in an extended content metadata storage unit 28. More specifically, the content metadata extended editing unit 27 extends the content metadata, which is formed of text data, and edits it into data to which usable functions are attached, by using a language such as XML (Extensible Markup Language), in addition to representing it as text data. Then, the edited data is used and displayed as a related information display part 214 (to be described later), as shown in FIGS. 5, 8, and 9.
A category identification unit 51 of the content metadata extended editing unit 27 identifies a category for each of the described content of the original content metadata read from the content metadata storage unit 22, and supplies the identification result to an extended editing unit 52. Furthermore, when a category is to be identified, a same information determination unit 51 a determines whether or not, for example, a plurality of pieces of the same information exist, that is, determines whether or not the same URL (Uniform Resource Locator) is contained although text data differs in the categories of product names and commercial sponsors in CM (commercial message) content, and supplies the determination result to a commercial sponsor editing unit 82.
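The description above states only that the content metadata is extended by using a language such as XML; it does not define a schema. The fragment below is therefore a hypothetical sketch, in Python, of attaching a per-category link behavior to each piece of text metadata; the element and attribute names are assumptions.

```python
# Minimal sketch of extending plain text metadata into XML with a link action
# attached per category (hypothetical schema).
import xml.etree.ElementTree as ET

def extend_metadata(category, text, action, target):
    """Wrap one piece of text metadata in an element carrying its link behavior."""
    item = ET.Element("metadata", {"category": category})
    ET.SubElement(item, "text").text = text
    # 'action' describes what happens when the displayed text is selected:
    # e.g. "search" (keyword search), "open-url", "selection-screen", "keyword-input".
    ET.SubElement(item, "link", {"action": action, "target": target})
    return item

root = ET.Element("extended-content-metadata")
root.append(extend_metadata("program title", "AAA", "search", "search-engine"))
root.append(extend_metadata("product name", "Tantan label (draft)",
                            "open-url", "http://example.com/product"))
print(ET.tostring(root, encoding="unicode"))
```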
A TV (television program) content editing unit 61 of the extended editing unit 52 extends and edits the content metadata of the TV content among the content metadata.
When the text data of the content metadata belonging to the category of program titles among the content metadata of the TV content is displayed and further selected, a program title editing unit 71 of the TV content editing unit 61 allows the search engine server 7 to perform a search by using the text data as a keyword, and extends and edits the content metadata so that an HP (homepage) that shows the search result is displayed.
When the text data of the content metadata belonging to the category of channel names among the content metadata of the TV content is displayed and further selected, a channel editing unit 72 of the TV content editing unit 61 allows the search engine server 7 to search for the content metadata by using the text data as a keyword, and extends and edits the content metadata so that an HP that shows the search result is displayed.
When the text data of the content metadata belonging to the category of performer names among the content metadata of the TV content is displayed and further selected, a performer editing unit 73 of the TV content editing unit 61 allows the search engine server 7 to search for the content metadata by using the text data as a keyword, and extends and edits the content metadata so that an HP that shows the search result is displayed.
The CM content editing unit 62 of the extended editing unit 52 extends and edits the content metadata of the CM content among the content metadata.
When the text data of the content metadata belonging to the category of product names among the content metadata of the CM content is displayed and selected, a product name editing unit 81 of the CM content editing unit 62 extends and edits the content metadata so that an HP of the address specified by the URL, at which the description of product names and the like are provided, is displayed. In this example, content metadata belonging to the category of product names is assumed to be a URL. However, the content metadata is not limited to a URL, and may be information other than that.
When the text data of the content metadata belonging to the category of commercial sponsors among the content metadata of CM content is displayed and selected, a commercial sponsor editing unit 82 of the CM content editing unit 62 extends and edits the content metadata so that the HP of the address specified by the URL at which the description of the commercial sponsor, and the like are provided, is displayed. Here, the content metadata belonging to the category of commercial sponsors is assumed to be a URL. However, the content metadata is not limited to a URL and may be information other than that.
However, when the same information determination unit 51 a notifies the commercial sponsor editing unit 82 that, regarding the information corresponding to the categories of product names and commercial sponsors, not only the text data but also the contained URL is the same, in the case that the text data of the content metadata belonging to the category of commercial sponsors among the content metadata of the CM content is displayed and selected, the commercial sponsor editing unit 82 allows the search engine server 7 to perform a search by using the text data as a keyword, and extends and edits the content metadata so that an HP that shows the search result is displayed and information differing from that displayed when the text data of the category of product names is selected is provided. This is only for the purpose of avoiding the case in which the same information is provided in spite of the fact that different information has been selected. Alternatively, this search-based editing may be performed by the product name editing unit 81 in place of the commercial sponsor editing unit 82: when the text data of the content metadata belonging to the category of product names is displayed and selected, the search engine server 7 is made to perform a search by using the text data as a keyword, and the content metadata is extended and edited so that an HP that shows the search result is displayed.
When the text data of the content metadata belonging to the category of BGM titles among the content metadata of the CM content is displayed and selected, a BGM title editing unit 83 of the CM content editing unit 62 allows the search engine server 7 to perform a search by using the text data as a keyword, and extends and edits content metadata so that an HP that shows the search result is displayed.
When the text data of the content metadata belonging to the category of artist names among the content metadata of the CM content is displayed and selected, an artist editing unit 84 of the CM content editing unit 62 displays a screen from which a selection can be made as to displaying an HP of a merchandise sales site (sales site) at which CDs, DVDs, and the like of the artist are sold or allowing the search engine server 7 to perform a search by using the text data as a keyword and displaying an HP that shows the search result, and further extends and edits content metadata so that the selected HP is displayed. At this time, the selection display editing unit 63 generates information for displaying a selection screen used here as content metadata to be extended.
When the text data of the content metadata belonging to the category of performers among the content metadata of the CM content is displayed and selected, a performer editing unit 85 of the CM content editing unit 62 allows the search engine server 7 to perform a search by using the text data as a keyword, and extends and edits content metadata so that an HP that shows the search result is displayed.
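As an illustration of the category-dependent editing performed by the editing units described above, the following Python sketch attaches one of the link behaviors to each category of CM content metadata, including the fallback applied when the same information determination unit 51 a reports that the product name and the commercial sponsor carry the same URL; all identifiers and values are hypothetical.

```python
# Hypothetical sketch of the extended editing performed for CM content metadata.
def edit_cm_metadata(cm_metadata, product_url, sponsor_url):
    """Return, per category, the action attached to the CM content metadata."""
    same_url = (product_url == sponsor_url)   # same information determination unit 51a
    actions = {
        "product name":       ("open-url", product_url),
        # If the sponsor URL duplicates the product URL, edit the sponsor entry
        # so that a search-result HP is shown instead of the same page again.
        "commercial sponsor": ("search", cm_metadata["commercial sponsor"]) if same_url
                              else ("open-url", sponsor_url),
        "BGM title":          ("search", cm_metadata["bgm title"]),
        "artist":             ("selection-screen", cm_metadata["artist"]),
        "performers":         ("keyword-input", cm_metadata["performers"]),
        # A title is not particularly extended and is kept as plain text.
        "title":              ("text", cm_metadata["title"]),
    }
    return actions

cm = {"product name": "Lifeguard", "commercial sponsor": "Lifeguard",
      "bgm title": "AAAA", "artist": "BBBB", "performers": "EEEE",
      "title": "Lifeguard picnic version"}
print(edit_cm_metadata(cm, "http://example.com/a", "http://example.com/a"))
```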
When predetermined video-recorded data is selected by the operation unit 2 b of the remote controller 2 being operated, and a light-emission signal indicating that reproduction has been instructed from the light-emitting unit 2 a is received by a light-receiving unit 33, a content reproduction unit 29 controls an image reproduction unit 131 in order to read and reproduce the corresponding content data 111 from the content data storage unit 26, and allows a display unit 30 formed of a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display), or the like to display the content data. Furthermore, the content reproduction unit 29 controls a film roll reproduction unit 132 in order to read the film roll data 112 corresponding to the content data 111, and displays, on the display unit 30, a thumbnail image of the image immediately after the scene is changed, which corresponds to a time code recorded in the film roll data 112 in the manner of a film roll in synchronization with the timing at which the content data 111 is sequentially reproduced by the image reproduction unit 131.
An extended content metadata reproduction unit 133 reads extended content metadata from the extended content metadata storage unit 28, and displays the extended content metadata on the display unit 30 in synchronization with the reproduced content data 111.
On the basis of a signal obtained when the light-receiving unit 33 receives a light-emission signal supplied from the remote controller 2, a link instruction detector 32 detects whether or not one of the pieces of text displayed as content metadata has been clicked on, and supplies information on the extended content metadata corresponding to the detection result to a link display controller 34.
On the basis of the extended content metadata supplied from the link instruction detector 32, the link display controller 34 controls a browser controller 141 in order to operate the browser 31. The browser 31 is controlled by the browser controller 141, controls the communication unit 24 as necessary, accesses the search engine server 7 via the network 6 in order to perform a search process by using a keyword, and allows the display unit 30 to display an HP that shows the search result. Furthermore, the browser 31 is controlled by the browser controller 141, controls the communication unit 24 as necessary, accesses a server (not shown) specified by a URL, a predetermined merchandise sales site, or the like via the network 6, and allows the display unit 30 to display the HP. When there are a plurality of processing results as a result of using the extended content metadata and one of them is to be selected, a selection display controller 142 allows the display unit 30 to display a selection screen for making a selection of the processing results.
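The patent states only that the browser 31 accesses the search engine server 7 so that a search is performed with the text data as a keyword; the query format is not specified. Under that assumption, the following sketch shows how a browser controller could hand such a request to a browser; the search URL and parameter name are placeholders.

```python
# Sketch of issuing a keyword search from a browser controller (query format assumed).
import webbrowser
from urllib.parse import urlencode

SEARCH_ENGINE_URL = "http://search.example.com/search"   # stands in for the search engine server 7

def open_search_result(keyword, new_window=True):
    """Open an HP showing the search result for 'keyword' in another window."""
    url = SEARCH_ENGINE_URL + "?" + urlencode({"q": keyword})
    webbrowser.open(url, new=1 if new_window else 0)   # new=1 requests a new browser window
    return url

# e.g. open_search_result("AAA") for the program title selected in FIG. 8
```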
Next, a description will be given, with reference to the flowchart in FIG. 2, of a content recording process.
In step S1, the content video recorder 25 determines whether or not video recording has been instructed from a signal obtained from the light-receiving unit 33, and repeats the processing until video recording is instructed. In step S1, when, for example, the operation unit 2 b is operated to instruct content video recording by the user, the light-emitting unit 2 a emits a corresponding signal as a light-emission signal. At this time, the light-receiving unit 33 receives the light-emission signal of the light-emitting unit 2 a of the remote controller 2, and supplies a signal instructing the start of the video recording corresponding to the light-received signal to the content video recorder 25. In response, the content video recorder 25 determines that the video recording of the content has been instructed, and the process proceeds to step S2.
In step S2, the content video recorder 25 controls the tuner 23 in order to set a channel and obtains the content of the broadcast wave received by the antenna 4 at a predetermined channel. The content is not limited to content distributed in the form of a broadcast wave. For example, the content video recorder 25 also controls the communication unit 24 in order to access the predetermined program distribution server 5 via the network 6 and obtains content that is so-called network-distributed by the program distribution server 5.
In step S3, the content video recorder 25 controls the image video recorder 103 so that image data (containing audio data) of content is recorded as the content data 111 in the content data storage unit 26.
In step S4, the content video recorder 25 controls the scene change detector 101 so as to detect whether or not there is a scene change in units of frames in the obtained image data, and determines whether or not a scene has been changed. More specifically, for example, when the sum of the differences between the pixel values of the pixels at the same positions in adjacent frames changes greatly, the scene change detector 101 detects a scene change.
When a scene change is detected in step S4, in step S5, the film roll data creation unit 102 converts the image of the frame immediately after a scene change has been detected into a thumbnail image, registers the thumbnail image, together with the time code at the time of reproduction, in the film roll data 112, updates the thumbnail image, and stores the thumbnail image in the content data storage unit 26. That is, in the film roll data 112, the thumbnail image of the image at the timing at which the scene change has been detected, together with the time code, is sequentially stored.
In step S6, the content video recorder 25 determines whether or not the completion of the video recording has been instructed or whether or not the distribution of the content has been completed. When the completion has not been instructed and the distribution of the content is continued, the process returns to step S2. That is, the processing of steps S2 to S6 is repeated from the timing at which the video recording of the content is instructed until the completion of the video recording is instructed or the distribution of the content is completed. Then, when the completion of the video recording is instructed or the distribution of the content is completed in step S6, the processing is completed.
As a result of the above processing, in the content data storage unit 26, the content data 111 for which video recording has been instructed and the film roll data 112 are stored.
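The scene change detection described above is based on the summed pixel-value difference between adjacent frames, but the patent gives neither a threshold nor a frame format. The following sketch of the recording loop (steps S2 to S6) therefore uses assumed ones; the threshold, frame representation, and helper names are illustrative only.

```python
# Minimal sketch of the recording loop: frames are stored as content data, a
# scene change is detected from the summed pixel difference between adjacent
# frames, and a thumbnail plus time code is added to the film roll data.
def record_content(frames, threshold=1_000_000):
    """frames: iterable of (time_code, frame), where frame is a flat list of pixel values."""
    content_data, film_roll_data = [], []
    previous = None
    for time_code, frame in frames:
        content_data.append((time_code, frame))                      # step S3
        if previous is not None:
            # Step S4: sum of absolute pixel differences at the same positions.
            difference = sum(abs(a - b) for a, b in zip(frame, previous))
            if difference > threshold:
                # Step S5: register a thumbnail together with the time code.
                film_roll_data.append({"time_code": time_code,
                                       "thumbnail": make_thumbnail(frame)})
        previous = frame
    return content_data, film_roll_data

def make_thumbnail(frame):
    # Placeholder: a real implementation would downscale the frame image.
    return frame[:16]
```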
Next, a description will be given, with reference to the flowchart in FIG. 3, of a content metadata storage process.
In step S21, the EPG obtaining unit 21 determines whether or not, for example, a predetermined period of time of approximately 24 hours has passed, and repeats the same process until the predetermined period of time has passed.
When it is determined in step S21 that, for example, a predetermined period of time has passed, in step S22, the EPG obtaining unit 21 obtains EPG data distributed in a state in which the EPG data is contained in a broadcast wave from a broadcast station (not shown), which is received via the antenna 4, and stores the EPG data as content metadata in the content metadata storage unit 22.
In step S23, the EPG obtaining unit 21 controls the communication unit 24 in order to obtain EPG data from the EPG data distribution server 3 via the network 6, and stores the EPG data as content metadata in the content metadata storage unit 22.
As a result of the above processing, the EPG data distributed in the form of a broadcast wave and the EPG data distributed via the network 6 are stored as content metadata in the content metadata storage unit 22. Basically, only one of the EPG data distributed in the form of a broadcast wave and the EPG data distributed via the network 6 needs to be obtained. Therefore, in the processing of the flowchart in FIG. 3, only one of the processes of steps S22 and S23 may be performed.
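As an illustration of the content metadata storage process (steps S21 to S23), the following sketch obtains EPG data periodically and appends it to a store; the fetch functions and the interval handling are placeholders for the broadcast-wave and network sources described above.

```python
# Sketch of periodic EPG acquisition (fetch functions are placeholders).
import time

INTERVAL_SECONDS = 24 * 60 * 60   # the "approximately 24 hours" of step S21

def fetch_epg_from_broadcast():   # stands in for the EPG carried by the broadcast wave
    return [{"title": "AAA", "channel": "BBB television"}]

def fetch_epg_from_server():      # stands in for the EPG data distribution server 3
    return [{"title": "AAA", "channel": "BBB television"}]

def metadata_storage_loop(store, iterations=1, interval=0):
    """Append newly obtained EPG records to 'store' once per interval."""
    for _ in range(iterations):
        time.sleep(interval)                        # step S21: wait for the predetermined period
        store.extend(fetch_epg_from_broadcast())    # step S22
        store.extend(fetch_epg_from_server())       # step S23 (either source alone suffices)
    return store

content_metadata_storage = []
metadata_storage_loop(content_metadata_storage)
```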
Next, a description will be given, with reference to the flowchart in FIG. 4, of a content reproduction process.
In step S41, the content reproduction unit 29 determines whether or not the reproduction of content has been instructed on the basis of a signal obtained by the light-receiving unit 33, and repeats the processing until the reproduction is instructed. In step S41, for example, when the operation unit 2 b is operated to instruct the reproduction of the content by the user, the light-emitting unit 2 a emits a corresponding signal as a light-emission signal. At this time, the light-receiving unit 33 receives the light-emission signal of the light-emitting unit 2 a of the remote controller 2, and supplies a signal instructing the start of the reproduction of the content corresponding to the light-received signal to the content reproduction unit 29. In response, the content reproduction unit 29 determines that the reproduction of the content has been instructed, and the process proceeds to step S42.
In step S42, the content reproduction unit 29 controls the image reproduction unit 131 in order to read the content data 111 corresponding to the content for which reproduction has been instructed, and allows the display unit 30 to display the content data 111 as, for example, an image such as that shown in FIG. 5. In FIG. 5, a window 201 is displayed, and within the window 201, an image display part 211, a film roll display part 212, a playlist display part 213, and a related information display part 214 are provided.
The image reproduction unit 131 causes the image of the content recorded as the content data 111 to be reproduced in the image display part 211. At this time, the content reproduction unit 29 displays the content name of the content data 111 that is currently being reproduced in the playlist display part 213. In FIG. 5, in the playlist display part 213, “ZZZ” is displayed as the content name that is being reproduced, and “00:59:57” is displayed as the reproduction period of time.
In step S43, the film roll reproduction unit 132 reads the film roll data 112 corresponding to the content data 111 of the content that is being reproduced, and displays a film roll screen on the film roll display part 212.
In step S44, on the basis of the film roll data 112, the film roll reproduction unit 132 determines whether or not the current reproduction time is a timing at which the thumbnail image of the image after a scene change should be displayed. When it is such a timing, in step S45, the thumbnail image is displayed on the film roll screen 221 of the film roll display part 212. When it is not such a timing, the process of step S45 is skipped.
That is, as shown in FIG. 5, in the upper area in the center of the film roll display part 212, a film roll screen 221 is displayed, and thumbnail images 231-1 to 231-3 at the times of scene changes are displayed while being moved from right to left in the figure in synchronization with the change of the time code of the image being reproduced. In FIG. 5, “0:18:33” and “0:18:36” are displayed as time codes at the reproduction time on the film roll screen 221. The position below each displayed time corresponds to the reproduction position of the video-recorded data at that time code, and the current reproduction position is the position indicated by the vertical straight line in the center of the film roll screen 221. Therefore, on the film roll screen 221, the thumbnail image of a scene change is displayed at the right end in the figure at a timing immediately before the image is displayed on the image display part 211, and as the reproduction on the image display part 211 progresses, the thumbnail moves to the left. At the timing at which the thumbnail reaches the position of the vertical straight line in the center, the same image as the image from which the thumbnail was created is displayed in the image display part 211. Thereafter, the thumbnail moves toward the left end and eventually disappears from the left end of the film roll screen 221.
On the film roll screen 221, it is possible to scroll the film roll screen 221 independently of the progress of the content data 111 reproduced in the image display part 211 by dragging the film roll screen 221 using a pointer or the like. At this time, when a button 222 designated as “present location” is operated, the film roll screen 221 is returned to the current reproduction position. Various operation buttons 223 shown in the lower left area in the figure enable playback start, pause, stop, fast rewind, forward jump, or backward jump to be designated so that the corresponding operation is performed. Furthermore, when a thumbnail image 231 on the film roll screen 221 is designated, the reproduction position jumps to the reproduction position of the corresponding time code, and playback is started from there.
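The way thumbnails appear, slide past the center line, and leave the film roll screen can be approximated with a small visibility test; the width of the visible time window is an assumption made for illustration.

```python
from typing import Iterable, List, Tuple

def visible_thumbnails(entries: Iterable[Tuple[float, object]],
                       current_time: float,
                       window_seconds: float = 6.0) -> List[Tuple[float, object]]:
    """Return the (time code, thumbnail) pairs whose time codes fall inside a
    window centered on the current reproduction position, i.e. the thumbnails
    that would currently be drawn somewhere on the film roll screen 221."""
    half = window_seconds / 2.0
    return [(tc, thumb) for tc, thumb in entries
            if current_time - half <= tc <= current_time + half]
```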
In step S46, the content metadata extended editing unit 27 performs a content metadata editing process, extends and edits content metadata, and allows the extended content metadata storage unit 28 to store the content metadata.
A description will now be given, with reference to the flowchart in FIG. 6, of a content metadata editing process.
In step S61, the content metadata extended editing unit 27 reads content metadata corresponding to the content that is currently being reproduced from the content metadata storage unit 22.
In step S62, the category identification unit 51 determines whether or not the content that is currently being reproduced is TV (television) content, that is, whether or not the content is so-called content constituting a program.
When it is determined in step S62 that the content is, for example, TV content, in step S63, the category identification unit 51 determines whether or not content metadata belonging to the category of program titles exists on the basis of the read content metadata. When it is determined in step S63 that the content metadata belonging to the category of program titles exists, in step S64, the category identification unit 51 reads the content metadata belonging to the category of program titles and supplies the content metadata to the TV content editing unit 61. The TV content editing unit 61 controls the program title editing unit 71. When the information on a program title is, for example, selected by being clicked using a pointer or the like, the program title editing unit 71 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of program titles as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and stores the content metadata in the extended content metadata storage unit 28.
When it is determined in step S63 that content metadata belonging to the category of program titles does not exist, the process of step S64 is skipped.
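The extension of a text-only metadata item into a selectable link, as performed above for program titles, can be sketched as below; the search URL format is a generic placeholder and not the actual interface of the search engine server 7.

```python
from urllib.parse import quote_plus

def extend_with_search_link(text: str, category: str,
                            search_base: str = "https://search.example.com/search?q=") -> dict:
    """Keep the original text data and attach a link destination so that
    selecting the text opens a page showing the search result for it."""
    return {
        "category": category,
        "text": text,
        "link": search_base + quote_plus(text),
    }

# For instance, a program title would be extended as
# extend_with_search_link("AAA", "program_title").
```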
In step S65, the category identification unit 51 determines whether or not the content metadata belonging to the category of channels exists on the basis of the read content metadata. When it is determined in step S65 that the content metadata belonging to the category of channels exists, in step S66, the category identification unit 51 reads the content metadata belonging to the category of channels and supplies the content metadata to the TV content editing unit 61. The TV content editing unit 61 controls the channel editing unit 72. When the information on the channel is selected, for example, by being clicked using a pointer or the like, the channel editing unit 72 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of channels as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
When the text data of the content metadata belonging to the category of channels represents, for example, a URL of the broadcast station of the channel, the TV content editing unit 61 may control the channel editing unit 72 so that the URL is accessed, and may edit the content metadata so that an HP of the broadcast station of the channel is displayed.
When it is determined in step S65 that the content metadata belonging to the category of channels does not exist, the process of step S66 is skipped.
In step S67, the category identification unit 51 determines whether or not content metadata belonging to the category of performers exists on the basis of the read content metadata. When it is determined in step S67 that the content metadata belonging to the category of performers exists, in step S68, the category identification unit 51 reads the content metadata belonging to the category of performers, and supplies the content metadata to the TV content editing unit 61. The TV content editing unit 61 controls the performer editing unit 73. When the information on the performers is selected, for example, by being clicked using a pointer or the like, the performer editing unit 73 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of performers as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
When it is determined in step S67 that the content metadata belonging to the category of performers does not exist, the process of step S68 is skipped.
On the other hand, when it is determined in step S62 that the content is not TV content but is, for example, CM content, in step S69, the category identification unit 51 determines whether or not the content metadata belonging to the category of product names exists on the basis of the read content metadata. When it is determined in step S69 that the content metadata belonging to the category of product names exists, in step S70, the category identification unit 51 reads the content metadata belonging to the category of product names and supplies the content metadata to the CM content editing unit 62. The CM content editing unit 62 controls the product name editing unit 81. When the information on the product name is selected, for example, by being clicked using a pointer or the like, the product name editing unit 81 accesses the corresponding server on the basis of the URL supplied as the text data of the content metadata belonging to the category of product names, edits the description of the content metadata so that an HP that introduces the product is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
When it is determined in step S69 that the content metadata belonging to the category of product names does not exist, the process of step S70 is skipped.
In step S71, the category identification unit 51 determines whether or not the content metadata belonging to the category of commercial sponsors exists on the basis of the read content metadata. When it is determined in step S71 that the content metadata belonging to the category of commercial sponsors exists, in step S72, the category identification unit 51 reads the content metadata belonging to the category of commercial sponsors, and supplies the content metadata to the CM content editing unit 62. The CM content editing unit 62 controls the commercial sponsor editing unit 82. When the information on the commercial sponsor is selected, for example, by being clicked using a pointer or the like, the commercial sponsor editing unit 82 accesses the corresponding server on the basis of the URL supplied as the text data of the content metadata belonging to the category of commercial sponsors, edits the description of the content metadata so that an HP that introduces the company that is the commercial sponsor is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
When it is determined in step S71 that the content metadata belonging to the category of commercial sponsors does not exist, the process of step S72 is skipped.
In step S73, the category identification unit 51 determines whether or not the content metadata belonging to the category of BGM titles exists on the basis of the read content metadata. When it is determined in step S73 that the content metadata belonging to the category of BGM titles exists, in step S74, the category identification unit 51 reads the content metadata belonging to the category of BGM titles and supplies the content metadata to the CM content editing unit 62. The CM content editing unit 62 controls the BGM title editing unit 83. When the information on the BGM title is selected, for example, by being clicked using a pointer or the like, the BGM title editing unit 83 allows the search engine server 7 to perform a search by using the text data of the content metadata belonging to the category of BGM titles as a keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
When it is determined in step S73 that the content metadata belonging to the category of BGM titles does not exist, the process of step S74 is skipped.
In step S75, the category identification unit 51 determines whether or not the content metadata belonging to the category of artists exists on the basis of the read content metadata. When it is determined in step S75 that the content metadata belonging to the category of artists exists, in step S76, the category identification unit 51 reads the content metadata belonging to the category of artists and supplies the content metadata to the CM content editing unit 62. The CM content editing unit 62 controls the artist editing unit 84. When the information on the artist is selected, for example, by being clicked using a pointer or the like, the artist editing unit 84 displays a selection screen for choosing between accessing a merchandise sales site where CDs, DVDs, and the like of the artist are sold and displaying the corresponding HP, and allowing the search engine server 7 to perform a search by using the content metadata belonging to the category of artists as a keyword and displaying one of the HPs that show the search result; the artist editing unit 84 then edits the description of the content metadata so that the HP selected on the selection screen is displayed. At this time, the selection display editing unit 63 creates the above-described selection screen, and the artist editing unit 84 edits the description of the content metadata so that the created selection screen is displayed and allows the extended content metadata storage unit 28 to store the content metadata.
When it is determined in step S75 that the content metadata belonging to the category of artists does not exist, the process of step S76 is skipped.
In step S77, the category identification unit 51 determines whether or not the content metadata belonging to the category of performers exists on the basis of the read content metadata. When it is determined in step S77 that the content metadata belonging to the category of performers exists, in step S78, the category identification unit 51 reads the content metadata belonging to the category of performers and supplies the content metadata to the CM content editing unit 62. The CM content editing unit 62 controls the performer editing unit 85. When the information on the performers is selected, for example, by being clicked using a pointer or the like, the performer editing unit 85 displays a screen from which a keyword can be input, in a state in which the text data of the content metadata belonging to the category of performers has been input in advance; when the keyword input is confirmed, the performer editing unit 85 allows the search engine server 7 to perform a search by using the confirmed keyword, edits the description of the content metadata so that an HP that shows the search result is displayed, and allows the extended content metadata storage unit 28 to store the content metadata.
When it is determined in step S77 that the content metadata belonging to the category of performers does not exist, the process of step S78 is skipped.
In step S79, the category identification unit 51 controls the same information determination unit 51 a so as to determine whether or not content metadata exists with which the same URL would be displayed when the content metadata is selected. For example, when the URL of the product name is the same as the URL of the commercial sponsor in step S79, it is determined that the same URL would be displayed. In step S80, the category identification unit 51 supplies the fact that the URL of the product name is the same as the URL of the commercial sponsor to the CM content editing unit 62. The CM content editing unit 62 controls the commercial sponsor editing unit 82 so that, when the information on the commercial sponsor is selected, for example, by being clicked using a pointer or the like, the search engine server 7 performs a search by using the text data of the content metadata belonging to the category of commercial sponsors as a keyword; the description of the content metadata is edited again so that an HP that shows the search result is displayed, and the extended content metadata storage unit 28 stores the content metadata. That is, the same URL would otherwise be displayed for both the information on the product name and the information on the commercial sponsor. Therefore, regarding the information on the commercial sponsor, the URL is not displayed directly; instead, the HP that shows the result of a search in which the text data was used as a keyword is displayed, so that, when the commercial sponsor is selected, the same information is not displayed twice.
When it is determined in step S79 that the content metadata with which the same URL is displayed does not exist, the process of step S80 is skipped.
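A sketch of this same-URL check, reusing the dictionary layout of the earlier illustrative extension sketch, might be:

```python
from urllib.parse import quote_plus

def deduplicate_links(product: dict, sponsor: dict,
                      search_base: str = "https://search.example.com/search?q=") -> None:
    """If the product name and the commercial sponsor would open the same URL,
    rewrite the sponsor entry as a keyword search on its text data so that
    selecting either item yields different information."""
    if product.get("link") and product.get("link") == sponsor.get("link"):
        sponsor["link"] = search_base + quote_plus(sponsor["text"])
```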
In step S81, the extended editing unit 52 stores other content metadata that has not been extended, as it is, in the extended content metadata storage unit 28. That is, when the content is a TV (television) program, such content metadata includes the video recording date and time, the broadcast time, the genre, the subgenre, and the like. When the content is a CM (commercial message), content metadata such as the title is not particularly extended here, and is therefore stored as it is in the extended content metadata storage unit 28.
As a result of the above processing, since the content metadata of the content that is being reproduced is converted from simple text data to extended content metadata, when the content metadata is selected, it becomes possible to not only display the content metadata simply as text data, but also display corresponding extended related information for each category.
Here, the description returns to the flowchart in FIG. 4.
In step S47, the extended content metadata reproduction unit 133 accesses the extended content metadata storage unit 28 and determines whether or not the content metadata to be displayed exists. For example, as a result of the processing of step S46, when the extended content metadata is stored in the extended content metadata storage unit 28, it is assumed that the content metadata to be displayed exists. In step S48, the extended content metadata reproduction unit 133 reads the extended content metadata stored in the extended content metadata storage unit 28 and displays the extended content metadata on the display unit 30, as shown in the related information display part 214 in FIG. 5.
In FIG. 5, in the related information display part 214, content metadata of CM content is displayed. In the topmost area, “Tantan label (draft)” is displayed as content metadata that fits into the category of product names, indicating that the product name introduced by the CM content that is currently being reproduced is “Tantan label (draft)”.
Below the product name, the content metadata that fits into the category of titles is displayed as “Tantan label (draft) picnic version”, indicating that the title of the CM content is “Tantan label (draft) picnic version”.
Below the title, as content metadata that fits into the category of commercial sponsors, “Karin beer” is displayed, indicating that the commercial sponsor of the CM content is “Karin beer”.
Below the commercial sponsor, as content metadata that fits into the category of BGM titles, “Opening march” is displayed, indicating that the piece of music that is being broadcast as a BGM of the CM content is “Opening march”.
Below the BGM title, as content metadata that fits into the category of artists, “XXX (composed)” is displayed, indicating that the composer of the BGM of the CM content is “XXX”.
Below the artist, as the content metadata that fits into the category of performers, “ABC”, “DEF”, and “GHI” are displayed, indicating that the performers for the CM content are “ABC”, “DEF”, and “GHI”.
For these pieces of related information, only the text data of the content metadata is displayed. When the pointer is moved to the position at which “Tantan label”, which is the product name according to the above-described extended content metadata, “Karin beer”, which is the commercial sponsor, “Opening march”, which is the BGM title, “XXX”, which is the artist, or “ABC”, “DEF”, or “GHI” of the performers is displayed, and that item is selected by the pointer being clicked, a link display process (to be described later) causes the above-described extended link display to be performed.
When it is determined in step S47 that the content metadata does not exist in the extended content metadata storage unit 28, the process of step S48 is skipped.
In step S49, the content reproduction unit 29 determines whether or not the stopping of the reproduction of the content has been instructed. When it is determined that the stopping of the reproduction of the content has not been instructed, the process returns to step S42. That is, as long as the reproduction is continued, the processing of steps S42 to S49 is repeated.
Then, when it is determined in step S49 that the stopping of the reproduction has been instructed, the processing is completed.
As a result of the above processing, when the reproduction of the video-recorded content is started, as shown in FIG. 5, an image shown on the window 201 including the image display part 211, the film roll display part 212, the playlist display part 213, and the related information display part 214 continues to be displayed.
Next, a description will be given, with reference to the flowchart in FIG. 7, of a link display process.
In step S91, the link instruction detector 32 queries the content reproduction unit 29 in order to determine whether or not one item of the content data 111 is currently being read and the corresponding content is being reproduced, and repeats the process until a state in which content is being reproduced is reached. For example, in step S91, when content is being reproduced by the process described above with reference to FIG. 4, the process proceeds to step S92.
In step S92, the link instruction detector 32 queries the content reproduction unit 29 in order to determine whether or not the content that is currently being reproduced is TV content. For example, when the content is TV content, the process proceeds to step S93.
In step S93, the link instruction detector 32 determines whether or not the information on the program title has been selected on the basis of a signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 8, in the case that the area where “AAA” in the topmost area is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the program title has been selected, and the process proceeds to step S94.
In step S94, the link instruction detector 32 reads extended content metadata corresponding to the program title of the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34. The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6, allows the search engine server 7 to perform a search by using the text data “AAA” belonging to the category of program titles of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
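The reaction to such a selection can be sketched as building a search-result URL from the selected text data and opening it in a separate window; the standard-library webbrowser module stands in for the browser 31, and the URL format is illustrative.

```python
import webbrowser
from urllib.parse import quote_plus

def open_search_result(keyword: str,
                       search_base: str = "https://search.example.com/search?q=") -> None:
    """Open a search-result page for the selected text data in a new window,
    separate from the window in which the content is being reproduced."""
    webbrowser.open_new(search_base + quote_plus(keyword))

# e.g. open_search_result("AAA") when the program title in FIG. 8 is clicked.
```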
In the topmost area in FIG. 8, “AAA” is displayed as the program title, indicating that the program title is “AAA”. Below the program title, the channel is displayed as “XCH BBB television”, indicating that the content that is being reproduced is content broadcast by the BBB television of channel X. Below the channel, as video recording date and time information containing the video recording time, “2006/4/17 (Mon.) 21:00-22:09 (1 hour 9 minutes)” is displayed, indicating that the content is content of 1 hour 9 minutes from 21:00 to 22:09 and that the date on which the content was recorded is Monday, Apr. 17, 2006. Below the video recording date and time, as the genre, “Variety Show/Others” is displayed, indicating that the genre is “Variety Show/Others”. Below the genre, as performers, “CCC”, “DDD”, and “EEE” are displayed, indicating that the performers are “CCC”, “DDD”, and “EEE”.
When the program title is not selected in step S93, the process of step S94 is skipped.
In step S95, the link instruction detector 32 determines whether or not the information on the channel has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 8, in the case that an area where “BBB television” in the second area from the top is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the channel has been selected, and the process proceeds to step S96.
In step S96, the link instruction detector 32 reads the extended content metadata corresponding to the channel of the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34. The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6, allows the search engine server 7 to perform a search by using the text data “BBB television” belonging to the category of channels of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
When the channel is not selected in step S95, the process of step S96 is skipped.
In step S97, the link instruction detector 32 determines whether or not the information on the performers has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 8, in the case that the area where “DDD” among “CCC”, “DDD”, and “EEE” in the lowest area is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the performer has been selected, and the process proceeds to step S98.
In step S98, the link instruction detector 32 reads the extended content metadata corresponding to the performer in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34. The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6, allows the search engine server 7 to perform a search by using the text data “DDD” belonging to the category of performers of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
When the performer has not been selected in step S97, the process of step S98 is skipped.
On the other hand, when it is determined in step S92 that the content that is currently being reproduced is not TV content but is CM content, the process proceeds to step S99.
In step S99, the link instruction detector 32 determines whether or not the information on the product name has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 9, in the case that an area where “Lifeguard” in the topmost area is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the product name has been selected, and the process proceeds to step S100.
In step S100, the link instruction detector 32 reads the extended content metadata corresponding to the product name in the selected area and supplies the extended content metadata to the link display controller 34. The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access a server (not shown) corresponding to the URL via the network 6, and displays the HP specified by the URL on another window (not shown) differing from the window 201 on the display unit 30.
In the topmost area in FIG. 9, as the product name, “Lifeguard” is displayed, indicating that the product name is “Lifeguard”. Below the product name, as a title, “Lifeguard “Financial conglomerate version”” is displayed, indicating that the title of the CM content is “Lifeguard “Financial conglomerate version””. Below the title, as a BGM title, “AAAA” is displayed, indicating that the BGM of the CM content is “AAAA”. Below the BGM title, as an artist, “BBBB” is displayed, indicating that the artist is “BBBB”. Below the artist, as performers, “CCCC”, “DDDD”, and “EEEE” are displayed, indicating that the performers are “CCCC”, “DDDD”, and “EEEE”.
When the product name has not been selected in step S99, the process of step S100 is skipped.
In step S101, the link instruction detector 32 determines whether or not the information on the commercial sponsor has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 has been displayed as shown in FIG. 9, in the case that an area where “Lifeguard” in the third area is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the commercial sponsor has been selected, and the process proceeds to step S102.
In step S102, the link instruction detector 32 reads the extended content metadata corresponding to the commercial sponsor in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34. The browser controller 141 of the link display controller 34 either starts up the browser 31, controls the communication unit 24 in order to access a server (not shown) corresponding to the URL via the network 6, and displays the HP specified by the URL on another window (not shown) differing from the window 201 on the display unit 30, or starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “Lifeguard” belonging to the category of commercial sponsors of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
That is, regarding the process of step S102, a process that is set by one of the processes of steps S72 and S80 is performed depending on whether or not the URL specified in the category of product names in the content metadata matches the URL specified by the commercial sponsor.
When the commercial sponsor has not been selected in step S101, the process of step S102 is skipped.
In step S103, the link instruction detector 32 determines whether or not the information on the BGM title has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 9, in the case that an area where “AAAA” in the fourth area is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the BGM title has been selected, and the process proceeds to step S104.
In step S104, the link instruction detector 32 reads the extended content metadata corresponding to the BGM title in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34. The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “AAAA” belonging to the category of BGM titles of the supplied extended content metadata as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
When the BGM title has not been selected in step S103, the process of step S104 is skipped.
In step S105, the link instruction detector 32 determines whether or not the information on the artist has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 has been displayed as shown in FIG. 9, in the case that an area where “BBBB” in the fifth area is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the artist has been selected, and the process proceeds to step S106.
In step S106, the link instruction detector 32 reads the extended content metadata corresponding to the artist in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34. The selection display controller 142 of the link display controller 34 displays, for example, a selection screen shown in FIG. 10. In FIG. 10, a selection screen 261 is displayed. In the top area, “Regarding selected artist” is displayed. In the next area, “HP of merchandise sales is displayed” is displayed. In the area further below, “HP for search result in which artist name was used as keyword is displayed” is displayed. Check buttons 262-1 and 262-2 used when making a selection are provided to the left of the respective choices.
In step S107, on the basis of the signal from the light-receiving unit 33, the link instruction detector 32 determines whether or not “HP of merchandise sales is displayed” among “HP of merchandise sales is displayed” and “HP for search result in which artist name was used as keyword is displayed” has been selected. For example, when the check box 262-1 is checked and a set button 263 is operated, “HP of merchandise sales is displayed” is assumed to be selected. In step S108, the link instruction detector 32 supplies the selection result to the link display controller 34. The browser controller 141 of the link display controller 34 starts up the browser 31, and controls the communication unit 24 so that the HP for the predetermined merchandise sales site is displayed on another window (not shown) differing from the window 201 on the display unit 30 via the network 6.
On the other hand, when it is determined in step S107 that the check box 262-2 has been checked and the set button 263 has been operated, “HP for search result in which artist name was used as keyword is displayed” is assumed to be selected. In step S109, the link instruction detector 32 supplies the selection result to the link display controller 34. The browser controller 141 of the link display controller 34 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “BBBB” belonging to the category of artists of the supplied extended content metadata as a keyword, and displays the HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
In step S105, when the artist has not been selected, the processing of steps S106 to S109 is skipped.
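The dispatch on the choice made on the selection screen of FIG. 10 can be sketched as follows; both URLs are placeholders, since neither the merchandise sales site nor the search engine interface is specified in the embodiment.

```python
import webbrowser
from urllib.parse import quote_plus

def handle_artist_selection(artist: str, choice: str,
                            sales_base: str = "https://shop.example.com/artist/",
                            search_base: str = "https://search.example.com/search?q=") -> None:
    """Open either the merchandise sales page for the artist or a search-result
    page in which the artist name was used as the keyword."""
    if choice == "merchandise":
        webbrowser.open_new(sales_base + quote_plus(artist))
    else:
        webbrowser.open_new(search_base + quote_plus(artist))
```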
In step S110, the link instruction detector 32 determines whether or not the information on the performer has been selected on the basis of the signal from the light-receiving unit 33. For example, when the related information display part 214 is displayed as shown in FIG. 9, in the case that the area where one of “CCCC”, “DDDD”, and “EEEE” in the lowest area, for example, “EEEE”, is displayed is selected by being clicked by operating the operation unit 2 b of the remote controller 2, the link instruction detector 32 assumes that the information on the performer has been selected, and the process proceeds to step S111.
In step S111, the link instruction detector 32 reads the extended content metadata corresponding to the performer in the selected area from the content reproduction unit 29 and supplies the extended content metadata to the link display controller 34. As shown in FIG. 11, the link display controller 34 displays an input screen 281 for a search keyword, in which “EEEE”, which is the name of the selected performer, has been input in advance. Also, when the input of the keyword is completed and a search button 292 is operated, the browser controller 141 starts up the browser 31, controls the communication unit 24 in order to access the search engine server 7 via the network 6 so that a search is performed by using the text data “EEEE” input via the input screen 281 as a keyword, and displays an HP that shows the search result on another window (not shown) differing from the window 201 of the display unit 30.
Accessing a desired search engine server and performing a search using a desired keyword can be easily realized by using an API (Application Program Interface), which is a collection of commands and functions prescribed by the search engine server 7.
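As one possible illustration of such an API call, the endpoint and the response shape below are pure assumptions; an actual implementation would follow whatever commands and functions the chosen search engine server prescribes.

```python
import json
from urllib.parse import quote_plus
from urllib.request import urlopen

def search_via_api(keyword: str,
                   api_base: str = "https://api.search.example.com/v1/search?q=") -> list:
    """Issue an HTTP request with the keyword and return the decoded result list."""
    with urlopen(api_base + quote_plus(keyword)) as response:
        payload = json.load(response)
    return payload.get("results", [])
```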
As a result of the above processing, not only is the text data of the content metadata simply displayed, but a search result obtained by a search engine using the text of the content metadata can also be displayed. Therefore, it is possible to extend the content metadata into information that can be searched for.
The information to be displayed can be changed for each category of content metadata. Furthermore, for example, by directly displaying the page at a URL that has been set in advance, and by displaying merchandise sales sites for CDs, DVDs, and the like with regard to an artist, it is possible to quickly provide information having a comparatively high possibility of being demanded. In a similar manner, with regard to other categories, such as BGM titles and product names, a site that is most appropriate for the category, such as a merchandise sales site, may be displayed.
Furthermore, even for the same keyword, by providing a selection screen for choosing between displaying a merchandise sales site and simply performing a search, a variety of information can be quickly switched and provided merely by making a selection.
Regarding a keyword search, for example, as shown in FIG. 11, a keyword input screen is presented so that the information that the user really desires to search for can be input, making it possible to provide information that precisely matches the request. Also, since the screen is displayed in a state in which a keyword has been input in advance, only the minimum necessary input is required when related information is to be entered. Therefore, it is possible to save time and effort and to quickly provide the necessary information.
In order to perform a keyword search on the site most appropriate for each category, a different search engine server may be accessed for each category, and a search may be performed by using the text data input via the input screen 281 as a keyword.
Accessing a desired search engine server and performing a search using a keyword can be easily realized by using an API (Application Program Interface), which is a collection of commands and functions prescribed for each search engine server.
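Choosing a different search endpoint per category could be expressed as a simple lookup table; every entry here is a placeholder for whichever site is judged most appropriate for the category.

```python
# Hypothetical mapping from metadata category to a category-specific endpoint.
CATEGORY_SEARCH_ENDPOINTS = {
    "bgm_title": "https://music-search.example.com/search?q=",
    "product_name": "https://shopping-search.example.com/search?q=",
    "performer": "https://people-search.example.com/search?q=",
}

def endpoint_for(category: str,
                 default: str = "https://search.example.com/search?q=") -> str:
    """Pick the per-category endpoint, falling back to a general search engine."""
    return CATEGORY_SEARCH_ENDPOINTS.get(category, default)
```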
The above-described series of processing can be performed by hardware and can also be performed by software. When the series of processing is to be performed by software, a program constituting the software is installed from a recording medium into a computer incorporated in specialized hardware, or into a general-purpose computer capable of performing various processes by installing various programs.
FIG. 12 shows an example of the configuration of a general-purpose personal computer. The personal computer incorporates a CPU (Central Processing Unit) 1001. An input/output interface 1005 is connected to the CPU 1001 via a bus 1004. A ROM (Read Only Memory) 1002 and a RAM (Random Access Memory) 1003 are connected to the bus 1004.
An input unit 1006 including input devices composed of a keyboard, a mouse, and the like, via which the user inputs an operation command; an output unit 1007 for outputting a processing operation screen and an image of a processing result to a display unit; a storage unit 1008 including a hard disk drive for storing programs and various kinds of data, and the like; and a communication unit 1009 including a LAN (Local Area Network) adaptor and the like, which performs a communication process via a network typified by the Internet, are connected to the input/output interface 1005. A drive 1010 for reading and writing data from and to a removable medium 1011, such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini Disc)), or a semiconductor memory, is connected to the input/output interface 1005.
The CPU 1001 performs various kinds of processing in accordance with a program stored in the ROM 1002 or a program, which is read from the removable medium 1011, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, which is installed into the storage unit 1008, and which is loaded from the storage unit 1008 into the RAM 1003. In the RAM 1003, data necessary for the CPU 1001 to perform various kinds of processing, and the like, are also stored as appropriate.
In this specification, steps describing a program recorded on a recording medium include not only processes that are performed in a time-series manner according to the written order, but also processes that are performed in parallel or individually without necessarily being performed in a time-series manner.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (9)

1. An information processing apparatus comprising:
content obtaining means for obtaining content from a source;
content metadata obtaining means for obtaining previously generated content metadata which includes text data corresponding to the obtained content;
a storage unit that stores the obtained content and the content metadata;
determination means for determining whether the obtained text data is included in a predetermined content metadata category;
editing means for editing, when the obtained text data is determined to be included in the predetermined content metadata category, the obtained content metadata to create and store extended content metadata which includes the obtained text data in association with information on a link destination corresponding to the category of the obtained text data to be displayed when a predetermined operation is performed on the text data of the obtained content metadata being reproduced, and for storing, when the obtained text data is determined to not be included in the predetermined content metadata category, the obtained content metadata in an unmodified format;
reproduction means for reproducing, at the information processing apparatus, the text data of the extended content metadata in synchronization with the reproduction of associated content; and
display means for displaying, at the information processing apparatus, the information on the link destination included in the extended content metadata created by the editing means when the predetermined operation is performed on the text data of the extended content metadata that is reproduced by the reproduction means,
wherein the determination means automatically determines whether the obtained text data read from the storage unit and corresponding to content being currently reproduced by the reproduction means and displayed on the display means is included in the predetermined content metadata category, and the editing means automatically edits the obtained content metadata corresponding to the content being currently reproduced by the reproduction means and displayed on the display means to create and store the extended content metadata.
2. The information processing apparatus according to claim 1, wherein the editing means accesses a search engine in response to the predetermined operation being performed on the obtained text data, performs a search based on the text data of the content metadata, obtains a search result containing the information on the link destination and a corresponding homepage, and edits the obtained content metadata so that the obtained information on the link destination and corresponding homepage are included in the extended content metadata and may be displayed.
3. The information processing apparatus according to claim 1, wherein the editing means accesses a sales site of a commodity related to the text data in response to the predetermined operation being performed on the obtained text data, and edits the obtained content metadata so that the sales site may be displayed.
4. The information processing apparatus according to claim 1, wherein the editing means displays, on the display means, a screen for inputting text data to be searched for by a search engine in a state in which the text data has been input in advance in response to the predetermined operation being performed on the obtained text data, accesses the search engine after the input of the text data is completed, performs a search based on the input text data, obtains a search result containing a homepage, and edits the obtained content metadata so that the homepage is included in the extended content metadata and may be displayed.
5. The information processing apparatus according to claim 1, wherein, when the text data contains a predetermined link destination, the editing means accesses a server of the predetermined link destination in response to the predetermined operation being performed on the obtained text data, and edits the obtained content metadata so that a homepage that exists at the predetermined link destination may be displayed.
6. The information processing apparatus according to claim 1, wherein, when the text data contains a predetermined link destination, in response to the predetermined operation being performed on the obtained text data, the editing means displays a selection screen for selecting either accessing a server indicated by the predetermined link destination and displaying a homepage of the predetermined link destination or accessing a search engine and displaying a homepage that shows a search result of the text data of the content metadata, and edits the obtained content metadata so that the homepage of the selected content may be displayed.
7. An information processing method, implemented on an information processing apparatus, comprising:
obtaining content from a source;
obtaining, at the information processing apparatus, previously generated content metadata which includes text data corresponding to the obtained content;
storing, in a storage unit of the information processing apparatus, the obtained content and the content metadata;
determining whether the obtained text data is included in a predetermined content metadata category;
editing, when the obtained text data is determined to be included in the predetermined content metadata category, the obtained content metadata to create and store extended content metadata which includes the obtained text data in association with information on a link destination corresponding to the category of the obtained text data to be displayed when a predetermined operation is performed on the text data of the obtained content metadata being reproduced, and storing, when the obtained text data is determined to not be included in the predetermined content metadata category, the obtained content metadata in an unmodified format;
reproducing, at the information processing apparatus, the text data of the extended content metadata in synchronization with the reproduction of associated content; and
displaying, at the information processing apparatus, the information on the link destination included in the extended content metadata created in the editing step when the predetermined operation is performed on the text data of the extended content metadata that is reproduced in the reproduction step,
wherein the determining includes automatically determining whether the obtained text data read from the storage unit and corresponding to content being currently reproduced and displayed is included in the predetermined content metadata category, and the editing includes automatically editing the obtained content metadata corresponding to the content being currently reproduced and displayed to create and store the extended content metadata.
8. A non-transitory computer readable storage medium for storing therein a computer program that includes instructions which when executed on an information processing apparatus including at least one processor, causes the at least one processor to execute a method comprising:
obtaining, at the information processing apparatus, content from a source;
obtaining, at the information processing apparatus, previously generated content metadata which includes text data corresponding to the obtained content;
storing, in a storage unit of the information processing apparatus, the obtained content and the content metadata;
determining whether the obtained text data is included in a predetermined content metadata category;
editing, when the obtained text data is determined to be included in the predetermined content metadata category, the obtained text data to create and store extended content metadata which includes the obtained content metadata in association with information on a link destination corresponding to the category of the obtained text data to be displayed when a predetermined operation is performed on the text data of the obtained content metadata being reproduced, and storing, when the obtained text data is determined to not be included in the predetermined content metadata category, the obtained content metadata in an unmodified format;
reproducing, at the information processing apparatus, the text data of the extended content metadata in synchronization with the reproduction of associated content; and
displaying, at the information processing apparatus, the information on the link destination included in the extended content metadata created in the editing step when the predetermined operation is performed on the text data of the extended content metadata that is reproduced in the reproduction step,
wherein the determining includes automatically determining whether the obtained text data read from the storage unit and corresponding to content being currently reproduced and displayed is included in the predetermined content metadata category, and the editing includes automatically editing the obtained content metadata corresponding to the content being currently reproduced and displayed to create and store the extended content metadata.
9. An information processing apparatus including at least one processor comprising:
a content obtaining unit that obtains content from a source;
a content metadata obtaining unit that obtains previously generated content metadata which includes text data corresponding to the obtained content;
a storage unit that stores the obtained content and the content metadata;
a determination unit that determines whether the obtained text data is included in a predetermined content metadata category;
an editing unit that edits, when the obtained text data is determined to be included in the predetermined content metadata category by the determination unit, the obtained content metadata to create and store extended content metadata which includes the obtained text data in association with information on a link destination corresponding to the category of the obtained text data to be displayed when a predetermined operation is performed on the text data of the obtained content metadata being reproduced, and that stores, when the obtained text data is determined to not be included in the predetermined content metadata category, the obtained content metadata in an unmodified format;
a reproduction unit that reproduces, at the information processing apparatus, the text data of the extended content metadata in synchronization with the reproduction of associated content; and
a display unit that displays, at the information processing apparatus, the information on the link destination included in the extended content metadata created by the editing unit when the predetermined operation is performed on the text data of the extended content metadata that is reproduced by the reproduction unit,
wherein the determination unit automatically determines whether the obtained text data read from the storage unit and corresponding to content being currently reproduced by the reproduction unit and displayed on the display unit is included in the predetermined content metadata category, and the editing unit automatically edits the obtained content metadata corresponding to the content being currently reproduced by the reproduction unit and displayed on the display unit to create and store the extended content metadata,
wherein the at least one processor includes the content obtaining unit, the content metadata obtaining unit, the determination unit, and the editing unit.
US11/969,468 2007-01-05 2008-01-04 Information processing apparatus and method, and program Active 2030-01-05 US8316395B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007000345A JP2008167363A (en) 2007-01-05 2007-01-05 Information processor and information processing method, and program
JP2007-000345 2007-01-05

Publications (2)

Publication Number Publication Date
US20080168499A1 US20080168499A1 (en) 2008-07-10
US8316395B2 true US8316395B2 (en) 2012-11-20

Family

ID=39595410

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/969,468 Active 2030-01-05 US8316395B2 (en) 2007-01-05 2008-01-04 Information processing apparatus and method, and program

Country Status (3)

Country Link
US (1) US8316395B2 (en)
JP (1) JP2008167363A (en)
CN (1) CN101247494B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080065925A1 (en) * 2006-09-08 2008-03-13 Oliverio James C System and methods for synchronizing performances of geographically-disparate performers
US8237807B2 (en) 2008-07-24 2012-08-07 Apple Inc. Image capturing device with touch screen for adjusting camera settings
JP4807386B2 (en) * 2008-08-29 2011-11-02 ソニー株式会社 Display device and display method
KR101550886B1 (en) 2009-03-27 2015-09-08 삼성전자 주식회사 Apparatus and method for generating additional information of moving picture contents
JP5424798B2 * 2009-09-30 2014-02-26 株式会社日立ソリューションズ Metadata setting method, metadata setting system, and program
JP2011145813A (en) * 2010-01-13 2011-07-28 Ntt Docomo Inc Search support apparatus and search support method
TW201207643A (en) * 2010-08-09 2012-02-16 Hon Hai Prec Ind Co Ltd System and method for searching information of images
JP6042596B2 (en) 2011-01-21 2016-12-14 ソニー株式会社 Information processing apparatus, television receiver, information processing method, program, and information processing system
US10078695B2 (en) 2011-04-11 2018-09-18 Evertz Microsystems Ltd. Methods and systems for network based video clip generation and management
JP5994974B2 * 2012-05-31 2016-09-21 Saturn Licensing LLC Information processing apparatus, program, and information processing method
US9871842B2 (en) 2012-12-08 2018-01-16 Evertz Microsystems Ltd. Methods and systems for network based video clip processing and management
CN104185080B * 2014-03-24 2018-05-08 无锡天脉聚源传媒科技有限公司 Method and device for generating a digital television program list
EP3240296B1 (en) * 2014-12-26 2023-04-05 Sony Group Corporation Information processing device, information processing method, and program
CN104703019A (en) * 2015-03-25 2015-06-10 京东方科技集团股份有限公司 TV display method and device
CN111125565A (en) * 2019-11-01 2020-05-08 上海掌门科技有限公司 Method and equipment for inputting information in application

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6005565A (en) * 1997-03-25 1999-12-21 Sony Corporation Integrated search of electronic program guide, internet and other information resources
US6256631B1 (en) * 1997-09-30 2001-07-03 International Business Machines Corporation Automatic creation of hyperlinks
US20020042920A1 (en) * 2000-10-11 2002-04-11 United Video Properties, Inc. Systems and methods for supplementing on-demand media
US20030163815A1 (en) * 2001-04-06 2003-08-28 Lee Begeja Method and system for personalized multimedia delivery service
US6665870B1 (en) * 1999-03-29 2003-12-16 Hughes Electronics Corporation Narrative electronic program guide with hyper-links
JP2004023345A (en) 2002-06-14 2004-01-22 Sony Corp Information searching method, information searching system, receiver, information processing apparatus
JP2005004663A (en) 2003-06-13 2005-01-06 Sony Corp Content providing method and retrieval device
US20050015815A1 (en) * 1996-03-29 2005-01-20 Microsoft Corporation Interactive entertainment system for presenting supplemental interactive content together with continuous video programs
US20050240964A1 (en) * 2004-04-27 2005-10-27 Microsoft Corporation Specialized media presentation via an electronic program guide (EPG)
JP2006246064A (en) 2005-03-03 2006-09-14 Ntt Comware Corp Digital broadcasting system and method
US20060230021A1 (en) * 2004-03-15 2006-10-12 Yahoo! Inc. Integration of personalized portals with web content syndication
WO2006123744A1 (en) 2005-05-18 2006-11-23 Nec Corporation Content display system and content display method
US20060277167A1 (en) * 2005-05-20 2006-12-07 William Gross Search apparatus having a search result matrix display
US20070073704A1 (en) * 2005-09-23 2007-03-29 Bowden Jeffrey L Information service that gathers information from multiple information sources, processes the information, and distributes the information to multiple users and user communities through an information-service interface

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0687109B1 (en) * 1994-06-10 2000-04-12 Matsushita Electric Industrial Co., Ltd. Multimedia data presentation device and editing device
JPH08115312A (en) * 1994-10-14 1996-05-07 Fuji Xerox Co Ltd Multimedia document reproducing device, multimedia document editing device, and multimedia document editing and reproducing device
JP2004234157A (en) * 2003-01-29 2004-08-19 Sony Corp Information processor and method, and computer program

Also Published As

Publication number Publication date
CN101247494A (en) 2008-08-20
US20080168499A1 (en) 2008-07-10
JP2008167363A (en) 2008-07-17
CN101247494B (en) 2011-04-13

Similar Documents

Publication Publication Date Title
US8316395B2 (en) Information processing apparatus and method, and program
US7917553B2 (en) System and methods for enhanced metadata entry
JP5009906B2 (en) Preview mode for content
JP4406848B2 (en) Information processing apparatus, information processing method, and program
US20040175159A1 (en) Searchable DVD incorporating metadata
US20030084460A1 (en) Method and apparatus reproducing contents from information storage medium in interactive mode
US20150301995A1 (en) Method for editing and processing contents file and navigation information
JP2007104312A (en) Information processing method using electronic guide information and apparatus thereof
JP2006339794A (en) Information processor, processing method and program
US20080235729A1 (en) User interface apparatus, display method, and computer program product
US8213764B2 (en) Information processing apparatus, method and program
JP2011034394A (en) Content providing device, content provision program, and content providing method
US20140193136A1 (en) Information processing apparatus and information processing method
US20090100470A1 (en) Information processing device
JP2009116845A (en) Information processing apparatus, information presentation apparatus, information presentation method, information presentation program, and computer-readable recording medium with the program recorded
US20060126471A1 (en) Information recording apparatus, information recording method, information playback apparatus, information playback method, and information recording/playback apparatus
US8493320B2 (en) Display control device, content output system, method for controlling display of image used in retrieval operation, and program
JP4945497B2 (en) Content information display method
EP2144240B1 (en) Method of searching for meta data
US7075861B2 (en) Reproduction apparatus capable of selecting pieces of information for reproducing
JP2012004687A (en) Content playback device, content output device, and content playback system
JP4549282B2 (en) Search support method and content playback apparatus
JP4592491B2 (en) Video recording device
JP2009044212A (en) Information processor, information processing method, and program
JP2006129349A (en) Television broadcast recorder, database server, and cm information providing system utilizing the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUROIWA, TATSUO;SASAKI, MASACHIKA;FUJIMURA, SATOSHI;AND OTHERS;REEL/FRAME:020319/0615;SIGNING DATES FROM 20071126 TO 20071205

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUROIWA, TATSUO;SASAKI, MASACHIKA;FUJIMURA, SATOSHI;AND OTHERS;SIGNING DATES FROM 20071126 TO 20071205;REEL/FRAME:020319/0615

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8