US20040117830A1 - Receiving apparatus and method - Google Patents

Receiving apparatus and method

Info

Publication number
US20040117830A1
US20040117830A1 (application number US10/717,560)
Authority
US
United States
Prior art keywords
data
information
unit
contents
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/717,560
Other languages
English (en)
Inventor
Tomoyuki Ohno
Yoshikazu Shibamiya
Yuichi Matsumoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MATSUMOTO, YUICHI, OHNO, TOMOYUKI, SHIBAMIYA, YOSHIKAZU
Publication of US20040117830A1 publication Critical patent/US20040117830A1/en
Priority to US12/179,689 priority Critical patent/US8074244B2/en
Abandoned legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 - Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/61 - Network physical structure; Signal processing
    • H04N21/6106 - Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
    • H04N21/6125 - Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 - Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23406 - Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving management of server-side video buffer
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434 - Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4347 - Demultiplexing of several video streams
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438 - Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
    • H04N21/4383 - Accessing a communication channel
    • H04N21/4384 - Accessing a communication channel involving operations to reduce the access time, e.g. fast-tuning for reducing channel switching latency
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44004 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video buffer management, e.g. video decoder buffer or video display buffer
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443 - OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438 - Window management, e.g. event handling following interaction with the user interface
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/426 - Internal components of the client; Characteristics thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/04 - Synchronising

Definitions

  • the present invention relates to a receiving apparatus and a receiving method used with a computer, a television set, a set-top box, an image recording/replaying device, etc., that are connected to a network, receive various data from the network, and output image data relating to the received data on a display. More particularly, the present invention relates to a receiving apparatus and a receiving method for streaming data through a network.
  • streaming replay, in which moving picture data and voice data on a server on the Internet (hereinafter referred to as streaming contents) are sequentially replayed while being received over the network, has made remarkable progress.
  • with streaming replay, a user can enjoy the streaming contents on the server at any time.
  • the present invention aims at providing a viewer with a receiving apparatus and a receiving method capable of displaying smooth images by maintaining the continuity of replayed images when two pieces of image data related to each other are consecutively displayed on the same display.
  • an information processing unit which processes the received data, generates image data corresponding to the received data, and outputs the image data to a display unit;
  • a controller which controls output of the information processing unit such that the image data displayed on the display unit is switched from image data corresponding to the partial data to image data corresponding to the information data after reception of the information data is started.
  • the foregoing object is also attained by providing a receiving method comprising:
  • a receiving apparatus comprising:
  • a reception unit which receives a plurality of information data streams through a network
  • an information processing unit which generates image data corresponding to the information data stream by processing the information data, and outputs the image data to a display unit;
  • a generation unit which receives partial data of a plurality of information data streams through the reception unit, and generates a composite stream formed by plural pieces of received partial data
  • an accumulation unit which accumulates the composite stream generated by the generation unit
  • an instruction unit which selects one of the plurality of information data streams, and designates reception of the selected information data stream
  • a controller which reads partial data corresponding to the information data stream designated by the instruction unit from the composite stream accumulated in the accumulation unit, outputs the read data to the information processing unit, and outputs the processed partial data to the display unit.
  • FIG. 1 is a block diagram illustrating the entire configuration of the data receiving apparatus according to a first embodiment of the present invention;
  • FIG. 2 shows the configuration of the entire system according to the first embodiment of the present invention
  • FIG. 3 shows the configuration of the operation unit shown in FIG. 1;
  • FIG. 4 shows the configuration of the remote controller shown in FIG. 1;
  • FIG. 5 is a flowchart of the operation of the data receiving apparatus according to the first embodiment of the present invention.
  • FIG. 6 is a flowchart of the data distributing operation to the data receiving apparatus in a Web server
  • FIG. 7 is a flowchart of the process of configuring the zapping stream according to the first embodiment of the present invention.
  • FIG. 8 is a flowchart of the data distributing operation to the data receiving apparatus in a contents distribution server
  • FIG. 9 shows an example of the portal screen according to the first embodiment of the present invention.
  • FIG. 10 shows an example of the contents menu screen according to the first embodiment of the present invention
  • FIG. 11 shows the concept of configuration of the zapping stream according to the first embodiment of the present invention
  • FIG. 12 shows an example of the zapping stream supplementary information data according to the first embodiment of the present invention
  • FIG. 13 shows an example of the contents menu screen displayed when the zapping stream is configured according to the first embodiment of the present invention
  • FIG. 14 shows an example of the contents menu screen displayed after the zapping stream has been configured according to the first embodiment of the present invention
  • FIG. 15 is a flowchart for explanation of the contents replay process according to the first embodiment of the present invention.
  • FIG. 16 is a flowchart for explanation of the contents replay process according to the first embodiment of the present invention.
  • FIG. 17 shows the concept of the zapping stream and the state of the content A data of the contents distribution server according to the first embodiment of the present invention
  • FIG. 18 is a schematic chart of the window display control status immediately before matching of a time-stamp according to the first embodiment of the present invention.
  • FIG. 19 shows another concept of configuration of the zapping stream according to the first embodiment of the present invention.
  • FIG. 20 shows the configuration of an operation unit according to a second embodiment of the present invention.
  • FIG. 21 shows the configuration of a remote controller according to the second embodiment of the present invention.
  • FIG. 22 is a flowchart of the operation of the data receiving apparatus according to the second embodiment of the present invention.
  • FIG. 23 is a flowchart of the process of configuring the zapping stream according to a third embodiment of the present invention.
  • FIG. 24 is a block diagram of the entire configuration of the data receiving apparatus according to a fourth embodiment of the present invention.
  • FIG. 25 is a flowchart of the contents replay process according to the fourth embodiment of the present invention.
  • FIG. 26 is a block diagram of the entire configuration of the data receiving apparatus according to a fifth embodiment of the present invention.
  • FIG. 27 is a flowchart of the contents replay process according to the fifth embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating the entire configuration of the data receiving apparatus according to the first embodiment of the present invention.
  • a network 200 (the Internet according to the first embodiment of the present invention) transmits streaming contents information, streaming contents, etc.
  • a data receiving apparatus 201 comprises a communications control unit 100 for communications with the network 200 , a buffer 101 , an accumulation unit 102 , a streaming contents reconfiguration unit 103 , a CPU 104 , a video decoder 105 , a voice decoder 106 , a screen configuration unit 107 , a display control unit 108 , a voice control unit 109 , an operation unit 110 , a photoreceiver 111 , an image display unit 113 , a voice output unit 114 , a control bus 115 , a cursor control unit 116 , a decoder switch control unit 117 , a synchronous control unit 118 , and a buffer control unit 121 .
  • the video decoder 105 comprises a first video decoder 105 a and a second video decoder 105 b .
  • the image display unit 113 comprises a first window 113 a and a second window 113 b .
  • a remote controller 112 is for remotely operating a data receiving apparatus 201 .
  • FIG. 2 shows the configuration of the entire system according to the first embodiment of the present invention.
  • the Internet 200 transmits streaming contents information, streaming contents, etc. as described above.
  • the data receiving apparatus 201 having the configuration shown in FIG. 1 receives streaming contents information and streaming contents through the Internet 200 .
  • a Web server 202 distributes streaming contents information to the data receiving apparatus 201 through the Internet 200 .
  • a contents distribution server 203 distributes streaming contents to the data receiving apparatus 201 through the Internet 200 .
  • a plurality of Web servers 202 and contents distribution servers 203 are arranged on the Internet 200 .
  • the communications control unit 100 communicates data with the Web server 202 and the contents distribution server 203 through the Internet 200 .
  • a buffer control unit 121 controls buffering to the buffer 101 of plural pieces of streaming contents data from the communications control unit 100 and the accumulation unit 102 .
  • the buffer 101 temporarily buffers plural pieces of streaming contents data. It should be noted that the communications control unit 100 , the buffer control unit 121 , the buffer 101 , and the accumulation unit 102 can collectively form a communications unit.
  • the decoder switch control unit 117 switches the destination (the first video decoder 105 a or the second video decoder 105 b) of the data sent from the buffer 101 to the video decoder 105.
  • the video decoder 105 has the capability of simultaneously decoding a plurality of streaming contents using the first video decoder 105 a and the second video decoder 105 b .
  • the display control unit 108 controls display of plural pieces of decoded contents data from the video decoder 105 in a plurality of display windows (the first window 113 a and the second window 113 b) of the image display unit 113.
  • the video decoder 105 , the display control unit 108 , the decoder switch control unit 117 , etc. form an information processing unit.
  • the control bus 115 is a bus line for use by the CPU 104 controlling each function block shown in FIG. 1.
  • the accumulation unit 102 accumulates a zapping stream generated by the streaming contents reconfiguration unit 103 .
  • the streaming contents reconfiguration unit 103 configures a zapping stream from a plurality of received streaming contents.
  • the CPU 104 as a control unit controls each block in the data receiving apparatus 201 through the control bus 115 .
  • the synchronous control unit 118 is described later.
  • the operation unit 110 is provided with a button, etc. for performing an operation of the data receiving apparatus by a user, and is explained below by referring to FIG. 3.
  • FIG. 3 shows a main power button 301 , a power button 302 , a set button 303 , cursor buttons 304 , a replay button 306 , a stop button 307 , a portal button 311 , a channel up/down button 312 , and a ten-key 313 .
  • FIG. 4 shows a transmission unit 401 , a power button 402 , a set button 403 , cursor buttons 404 , a replay button 406 , a stop button 407 , a portal button 411 , a channel up/down button 412 , and a ten-key 413 .
  • in FIGS. 3 and 4, buttons with the same names have the same functions.
  • a signal indicating each button operation using the remote controller 112 by a user is received by the photoreceiver 111 of the data receiving apparatus 201 through the transmission unit 401 shown in FIG. 4.
  • the main power button 301 shown in FIG. 3 is a button for control of the energization of each block in the data receiving apparatus 201 shown in FIG. 1.
  • when the main power button 301 is turned on, the communications control unit 100, the buffer control unit 121, the buffer 101, the accumulation unit 102, the streaming contents reconfiguration unit 103, the CPU 104, the operation unit 110, and the photoreceiver 111 are energized.
  • the power button 302 shown in FIG. 3 and the power button 402 shown in FIG. 4 are also the buttons for control of the energization of each block in the data receiving apparatus 201 shown in FIG. 1.
  • when the power button 302 or 402 is turned on with the main power button 301 placed in the on state, the decoder switch control unit 117, the video decoder 105, the voice decoder 106, the screen configuration unit 107, the display control unit 108, the voice control unit 109, the image display unit 113, the voice output unit 114, the cursor control unit 116, and the synchronous control unit 118 are energized.
  • the remote controller 112 is powered independently of the data receiving apparatus 201 by power supply means such as batteries.
  • FIG. 5 is a flowchart of the operation of the data receiving apparatus 201 according to the first embodiment of the present invention.
  • FIG. 6 is a flowchart of the data distributing operation to the data receiving apparatus 201 in a Web server 202 .
  • FIG. 7 is a flowchart of the process of configuring a zapping stream.
  • FIG. 8 is a flowchart of the data distributing operation to the data receiving apparatus 201 in a contents distribution server 203 .
  • the operations and processes are described below by following the flowcharts in FIGS. 5 to 8 , and by referring to FIGS. 1 to 4 .
  • when a user presses the portal button 311 or 411 with the power button 302 or 402 placed in the on state (step S 501), the CPU 104 controls the communications control unit 100 through the control bus 115, connects it to the Web server 202 on the Internet 200, and issues a data request for the portal screen data of the streaming contents (step S 502).
  • the Web server 202 receives the data request from the data receiving apparatus 201 (YES in step S 601 shown in FIG. 6), recognizes it as a request for portal screen data (YES in step S 602 ), and transmits the requested portal screen data in step S 603 .
  • the data distributed from the Web server 202 is described in a page description language such as xHTML.
  • the data receiving apparatus 201 awaits the portal screen data in step S 505, and receives the portal screen data transmitted from the Web server 202.
  • Control is passed to step S 506 when the portal screen data is received, and the CPU 104 transmits the received data to the screen configuration unit 107 which forms the screen data to be displayed on the image display unit 113 , and the formed screen data is displayed on the image display unit 113 through the display control unit 108 .
  • FIG. 9 shows an example of the display screen based on the screen data formed by the screen configuration unit 107 . While watching the portal screen to the streaming contents, the user can select a desired category of contents.
  • a cursor 601 is configured and controlled by the cursor control unit 116 .
  • the cursor 601 can be moved by the operation of the cursor buttons 304 or 404 . For example, by pressing the down arrow button of the cursor buttons 304 or 404 when the displayed screen image is as shown in FIG. 9, the cursor 601 can be moved from the current “recommended contents” position to the “movie” position.
  • in step S 507, upon receipt of an operation event by a user on the cursor buttons 304 or 404 and the set button 303 or 403 of the operation unit 110 or the remote controller 112, the CPU 104 discriminates the received event, that is, recognizes it as a movement of the cursor in the vertical or horizontal direction, a press of the set button 303 or 403, etc., and displays the image after the cursor movement on the image display unit 113 through the cursor control unit 116, the screen configuration unit 107, and the display control unit 108.
  • in step S 508, it is determined whether or not the received event is a press of the set button 303 or 403. If NO, control is returned to step S 507 to await the next event. If YES, control is passed to step S 509.
  • the Web server 202 receives the data request from the data receiving apparatus 201 (YES in step S 601 shown in FIG. 6). If it recognizes the received request as a request for the contents menu screen data (YES in step S 604), it transmits the requested contents menu screen data in step S 605.
  • the data receiving apparatus 201 awaits the contents menu screen data in step S 512 , and receives the contents menu screen data transmitted from the Web server 202 .
  • a cursor 701 is configured and controlled by the cursor control unit 116 .
  • a still image 702 corresponds to one scene of each content.
  • a content name 703 is the name of each content.
  • Introductory text 704 describes each content.
  • a user can roughly select desired contents while watching the menu screen of the streaming contents.
  • a channel number 705 is assigned to each streaming content in the data receiving apparatus 201 . For example, content A is assigned ch 1 , content B is assigned ch 3 , content I is assigned ch 9 , and content J is assigned ch 10 .
  • a user can select a desired streaming content by pressing the set button 303 or 403 of the operation unit 110 or the remote controller 112 when the cursor is at the corresponding position.
  • the desired streaming contents can be also selected by pressing the ten-key 313 or 413 corresponding to the channel number assigned to the content.
  • in step S 516, the related information about the contents displayed on the contents menu screen is requested.
  • the contents related information transmitted from the Web server 202 always includes the URL (uniform resource locator) indicating where the streaming contents are located.
  • a URL can include information such as:
  • the CPU 104 of the data receiving apparatus 201 analyzes all of the acquired contents related information (step S 520) and detects the URL information indicating the location of each streaming content. After all contents related information has been analyzed, the zapping stream, which is the feature of the present invention, is configured according to the detected URL information (step S 525), as sketched below.
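  • As a rough, non-authoritative illustration of this analysis step, the Python sketch below assumes that the contents related information can be reduced to a small XML document with one content element per stream; the element and attribute names (contents, content, channel, url) are hypothetical and not taken from the patent.

```python
import xml.etree.ElementTree as ET

# Hypothetical contents related information for two streams; the schema is
# illustrative only and not defined by the patent.
CONTENTS_INFO = """
<contents>
  <content channel="1" name="Content A" url="rtsp://server.example.com/contentA"/>
  <content channel="3" name="Content B" url="rtsp://server.example.com/contentB"/>
</contents>
"""

def extract_stream_urls(xml_text):
    """Return a {channel: url} map detected from the contents related information."""
    urls = {}
    for node in ET.fromstring(xml_text).iter("content"):
        urls[int(node.get("channel"))] = node.get("url")
    return urls

if __name__ == "__main__":
    print(extract_stream_urls(CONTENTS_INFO))
    # {1: 'rtsp://server.example.com/contentA', 3: 'rtsp://server.example.com/contentB'}
```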
  • the process of configuring the zapping stream performed in step S 525 is explained below by referring to the flowchart shown in FIG. 7.
  • the configuration of the zapping stream is performed by the streaming contents reconfiguration unit 103 .
  • in step S 700 shown in FIG. 7, the streaming contents reconfiguration unit 103 selects the contents data to be acquired according to the URL information detected in the analysis of the contents related information performed by the CPU 104 in step S 520, and issues a data acquisition instruction to the CPU 104.
  • a data acquisition instruction is issued for the contents A to J.
  • These contents are located in an accumulation unit 800 of the contents distribution server 203 as shown in FIG. 11.
  • in step S 701, the CPU 104 issues a contents data request to the contents distribution server 203 in response to the data acquisition instruction from the streaming contents reconfiguration unit 103.
  • the streaming contents reconfiguration unit 103 of the data receiving apparatus 201 awaits the contents data from the contents distribution server 203 in step S 702 , and control is passed to step S 703 when it receives the data.
  • Data request in step S 701 and data reception in step S 702 are repeatedly performed until the amount of received data of each streaming content reaches a predetermined amount (until YES in step S 703 ).
  • the amount of data acquired from each streaming content for configuration of the zapping stream is, for example, 6 Mbytes for each content.
  • ten contents A to J are displayed, and the amount of data is 6 Mbytes for each of the contents A to J.
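  • The request/receive loop of steps S 701 to S 705 can be pictured with the following sketch, which concatenates a fixed leading portion (6 Mbytes here) of every content into one composite zapping stream; fetch_chunk is only a placeholder for the real server request, which the patent does not specify at this level of detail.

```python
PARTIAL_SIZE = 6 * 1024 * 1024   # 6 Mbytes of partial data per content
CHUNK = 64 * 1024                # size of one simulated network read

def fetch_chunk(url, offset, size):
    """Placeholder for a ranged request to the contents distribution server."""
    return b"\x00" * size        # dummy payload, for illustration only

def build_zapping_stream(urls):
    """Concatenate the leading PARTIAL_SIZE bytes of every content (steps S701-S705)."""
    composite = bytearray()
    offsets = {}                                  # start of each content in the composite
    for channel, url in sorted(urls.items()):
        offsets[channel] = len(composite)
        received = 0
        while received < PARTIAL_SIZE:            # repeat request/receive until 6 MB arrives
            data = fetch_chunk(url, received, min(CHUNK, PARTIAL_SIZE - received))
            composite.extend(data)
            received += len(data)
    return bytes(composite), offsets
```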
  • FIG. 11 shows this concept.
  • the contents A to J are stored in the accumulation unit 800 of the contents distribution server 203 .
  • supplementary information 812 relating to the configured zapping stream is generated in step S 706 .
  • FIG. 12 shows an example of the zapping stream supplementary information.
  • the supplementary information 812 is formed from the contents menu data acquired in step S 512 shown in FIG. 5, the contents related information acquired in step S 519, and the header information about the streaming contents acquired in step S 702 shown in FIG. 7, and is described in XML (Extensible Markup Language). From the supplementary information 812, it is possible to identify the following information about the configured zapping stream:
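  • One way such XML supplementary information could be assembled from the data gathered while building the zapping stream is sketched below; the zapping_stream/entry schema is invented for illustration and is not the format used by the patent.

```python
import xml.etree.ElementTree as ET

def build_supplementary_info(urls, offsets, partial_size):
    """Describe the composite zapping stream in XML (illustrative schema only)."""
    root = ET.Element("zapping_stream")
    for channel, url in sorted(urls.items()):
        ET.SubElement(root, "entry", {
            "channel": str(channel),
            "url": url,                        # where the full content can be requested later
            "offset": str(offsets[channel]),   # byte position of the partial data
            "length": str(partial_size),       # amount of partial data accumulated
        })
    return ET.tostring(root, encoding="unicode")
```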
  • FIG. 13 shows an example of the displayed screen.
  • the progress of the configuration of the zapping stream and the time taken to complete the configuration of the zapping stream are displayed.
  • a message 1200 notifying that the configuration of the zapping stream has been completed is displayed as shown in FIG. 14.
  • the notification to the user is made using a text message, but an icon representing a similar meaning can be displayed instead.
  • the contents replay process performed in step S 526 is described below by referring to the flowcharts shown in FIGS. 15 and 16.
  • in step S 101, when the user operates a button of the operation unit 110 or the remote controller 112 while watching a contents menu screen such as the one shown in FIG. 10 or 14, the data receiving apparatus 201 receives an operation event, and the CPU 104 determines in step S 102 whether the received event is a press of the set button 303 or 403, the replay button 306 or 406, the ten-key 313 or 413, or the channel up/down button 312 or 412.
  • in step S 103, counting of the continuous service hour for the selected contents is started.
  • in step S 104, the CPU 104 reads the supplementary information 812 and the zapping stream 811 configured as described above and stored in the accumulation unit 102.
  • from the supplementary information 812 stored in the accumulation unit 102, the file position of the desired contents stream within the accumulation unit 102 is obtained, and replay of the contents at that position is started, as sketched below.
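  • A minimal sketch of that lookup, reusing the hypothetical supplementary-information schema introduced above (not the patent's actual format):

```python
import xml.etree.ElementTree as ET

def find_partial_data(supplementary_xml, channel):
    """Return (offset, length, url) of the selected channel's partial data so that
    replay can start from that file position in the accumulated zapping stream."""
    for entry in ET.fromstring(supplementary_xml).iter("entry"):
        if int(entry.get("channel")) == channel:
            return int(entry.get("offset")), int(entry.get("length")), entry.get("url")
    raise KeyError(f"channel {channel} not found in supplementary information")
```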
  • the zapping stream data read from the accumulation unit 102 is transmitted to the video decoder 105 and the voice decoder 106 , and the data is decoded.
  • the video data is output through the display control unit 108 and the image display unit 113
  • the voice data is output through the voice control unit 109 and the voice output unit 114 (step S 105 ).
  • in step S 107, it is determined whether or not any button of the operation unit 110 or the remote controller 112 has been operated. If not, control is returned to step S 104, and replay of the contents being replayed is continued. On the other hand, if any operation has been performed, it is determined in step S 108 whether or not the operation is a press of the stop button 307 or 407. If YES, the replay is stopped, the screen shown in FIG. 10 or 14 is displayed, and then control is passed to step S 527 shown in FIG. 5.
  • it is determined in step S 109 whether or not the operation is a press of the corresponding channel number on the ten-key 313 or 413, or a press of the channel up/down button 312 or 412. If NO, control is returned to step S 104, and replay of the contents being replayed is continued. If the operation is a press of any of the above-mentioned buttons, control is returned to step S 103, the count of the continuous service hour is restarted, and the processes in and after step S 104 are repeated.
  • if one of the first video decoder 105 a and the second video decoder 105 b of the video decoder 105 (let us assume the first video decoder 105 a) is decoding the partial data A′ of the zapping stream 811 and the other (the second video decoder 105 b) is not decoding data, then it is determined in step S 203 that data can be decoded additionally.
  • the CPU 104 instructs the communications control unit 100 to start receiving data in step S 204 .
  • the communications control unit 100 issues a transmission request for the content A (information data) from the point corresponding to the partial data A′ being replayed, to the contents distribution server 203 .
  • the contents distribution server 203 starts transmitting data to the data receiving apparatus 201 , and the data receiving apparatus 201 receives data by the communications control unit 100 .
  • FIG. 17 is a view showing the concept of the zapping stream 811 and the state of the data of the content A on the contents distribution server 203, and shows the state in which distribution of the content A is started when the time T 1 has passed.
  • the data received by the communications control unit 100 is temporarily buffered, under control of the buffer control unit 121, in an area of the buffer 101 assigned to contents data additionally received from the contents distribution server 203 for continuous replay of the remaining data (information data) following the partial data accumulated in the zapping stream.
  • the buffered data is transmitted to the video decoder 105 through the decoder switch control unit 117 .
  • in step S 205, the decoder switch control unit 117 switches the input data for the plurality of decoders (the first video decoder 105 a and the second video decoder 105 b) in the video decoder 105.
  • in step S 206, the synchronous control unit 118 acquires the time-stamp information contained in each piece of contents data decoded by the decoders 105 a and 105 b of the video decoder 105, and compares the time-stamp values.
  • a time stamp refers to time information used in synchronously outputting video data and voice data in streaming contents.
  • the operations in steps S 204 to S 207 are repeated until the time-stamp values match (until YES in step S 207 ).
  • when they match, control is passed to step S 208; in FIG. 17, this point is represented by the time T 2, as in the sketch below.
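  • A simplified, single-threaded model of the comparison loop in steps S 206 and S 207 is sketched below; in the apparatus the two decoders run concurrently, but the idea is simply to find the first presentation time stamp that both decoded streams share.

```python
def find_switch_point(zapping_timestamps, live_timestamps):
    """Return the first time-stamp value present in both decoded streams (the
    switch point T2), or None if the supplied data never matches."""
    live = set(live_timestamps)
    for ts in zapping_timestamps:
        if ts in live:          # time stamps match: switch the display here
            return ts
    return None

# Example: the zapping stream is at stamps 100..103 while the live stream,
# requested from the same point, starts delivering stamps 102 onwards.
print(find_switch_point([100, 101, 102, 103], [102, 103, 104]))   # 102
```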
  • FIG. 18 is a schematic chart showing a window display control state immediately before the matching of time-stamp values in step S 207 .
  • FIG. 18 shows a displayed picture 900 in the first window 113 a , a displayed picture 901 in the second window 113 b , coordinates 902 of the upper left corner of the first window 113 a , coordinates 903 of the upper left corner of the second window 113 b , coordinates 904 of the upper right corner of the first window 113 a , coordinates 905 of the upper right corner of the second window 113 b , coordinates 906 of the lower left corner of the first window 113 a , coordinates 907 of the lower left corner of the second window 113 b , and coordinates 908 of the lower right corner of the first window 113 a .
  • the coordinate positions on the display screen of the coordinates 902 of the upper left corner of the first window 113 a and the coordinates 903 of the upper left corner of the second window 113 b are the same.
  • the coordinate positions on the display screen of the coordinates 904 of the upper right corner of the first window 113 a and the coordinates 905 of the upper right corner of the second window 113 b are the same.
  • the positions of the coordinates 908 of the lower right corner of the first window 113 a and the coordinates of the lower right corner (not shown) of the second window 113 b are the same.
  • the display position of the displayed picture 900 of the first window 113 a completely overlaps the displayed picture 901 of the second window 113 b, and the priority of the first window 113 a is set higher at this stage. Therefore, the user can watch only the displayed picture 900 of the first window 113 a, that is, the image obtained by decoding the data of the zapping stream 811.
  • when the time-stamp values match each other, control is passed to step S 208, and the CPU 104 controls the decoder switch control unit 117.
  • the display priorities are exchanged between the displayed picture 900 of the first window 113 a and the displayed picture 901 of the second window 113 b .
  • the user can watch only the displayed picture 901 of the second window 113 b .
  • the data of the zapping stream 811 (for example, the partial data such as the picture at the commencement of a program) and the data of the streaming contents being received from the contents distribution server 203 and replayed (information data related to the partial data, for example, the information data such as a picture in and after the commencement of the program) can be smoothly exchanged. Subsequently, the data of the streaming contents from the contents distribution server 203 is displayed.
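  • The window exchange can be pictured as a simple swap of display priorities between two exactly overlapping windows; the Window class and its priority convention below are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class Window:
    name: str
    priority: int          # convention assumed here: larger value is drawn on top

def swap_display_priority(first: Window, second: Window) -> None:
    """Exchange display priorities so the picture hidden underneath becomes visible."""
    first.priority, second.priority = second.priority, first.priority

# Window 113a (zapping stream) starts on top of window 113b (live stream);
# once the time stamps match, the swap makes 113b the visible window.
w113a = Window("first window 113a", priority=2)
w113b = Window("second window 113b", priority=1)
swap_display_priority(w113a, w113b)
print(w113a.priority, w113b.priority)   # 1 2
```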
  • next, the case in which it is determined in the decoder load check in step S 202 that data cannot be additionally decoded is described.
  • this can occur, for example, when another application in the data receiving apparatus 201 according to the first embodiment is decoding other contents stored in the accumulation unit 102 using the first video decoder 105 a, and the zapping stream is being decoded by the second video decoder 105 b.
  • in step S 209, a request to transmit the data of the content A is issued to the contents distribution server 203; the contents distribution server 203 starts transmitting data to the data receiving apparatus 201, and the data receiving apparatus 201 receives the data by the communications control unit 100.
  • the data received by the communications control unit 100 is temporarily stored in the buffer 101 through the buffer control unit 121 .
  • in step S 210, the synchronous control unit 118 acquires the time-stamp information contained in the data of the zapping stream and the time-stamp information contained in the contents data input to the buffer control unit 121, and compares the time-stamp values.
  • the operations in steps S 209 to S 211 are repeated until the time-stamp values match (until YES in step S 211 ).
  • control is passed to step S 212 .
  • in step S 212, the decoder switch control unit 117 switches the data transmitted to the video decoder 105 from the data of the zapping stream 811 to the contents data input from the buffer 101, that is, the data of the content A received from the contents distribution server 203 and buffered.
  • thus the input data to the second video decoder 105 b is switched to the data of the content A, and the display can be switched smoothly from the zapping stream to the streaming contents received from the contents distribution server 203 and being replayed, as in the sketch below.
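  • For this single-decoder path, the source switch of step S 212 can be modelled as below: one decoder is fed zapping-stream data up to the matching time stamp and the buffered content-A data from that point on. This is only a schematic model of the buffer control unit 121 and the decoder switch control unit 117 working together.

```python
def decoder_input(zapping_data, buffered_live_data, switch_ts):
    """Yield (timestamp, payload) pairs for a single decoder: zapping-stream data
    until switch_ts is reached, then the buffered content-A data from there on."""
    for ts, payload in zapping_data:
        if ts >= switch_ts:
            break
        yield ts, payload
    for ts, payload in buffered_live_data:
        if ts >= switch_ts:            # drop any live data older than the switch point
            yield ts, payload
```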
  • when the ten-key 313 or 413 or the channel up/down button 312 or 412 is pressed, control is returned to step S 103 shown in FIG. 15.
  • the streaming contents data after the switching of contents is not requested from the Web server 202 or the contents distribution server 203; instead, the zapping stream 811 is replayed from the file position where the partial data of the desired content is located, according to the information in the zapping stream supplementary information 812 stored in the accumulation unit 102. Therefore, since no access time for data communications with the Web server 202 and the contents distribution server 203 is required, the contents to be watched can be switched smoothly and easily, thereby reducing user discomfort.
  • even when the streaming contents contained in a category (FIG. 9) of the streaming portal are scattered over a plurality of contents distribution servers, the address to which each contents request should be sent can easily be discriminated according to the URL information acquired in step S 520 shown in FIG. 5. Therefore, the zapping stream can be configured in the same procedure shown in FIG. 7, and data can be replayed with the same effect as when acquiring the data from a single contents distribution server.
  • FIGS. 20 and 21 show the configuration of the operation unit 110 and the remote controller 112 . They are different from FIGS. 3 and 4 in that the zapping buttons 305 and 405 are added, but other components are the same as those shown in FIGS. 3 and 4. Therefore, the same reference numerals are assigned to the corresponding components, and the detailed explanation is omitted here.
  • in the first embodiment, control is passed directly to step S 516 and the zapping stream configuring operation is started immediately after the contents menu screen is configured and displayed in step S 513 shown in FIG. 5.
  • in the second embodiment, a press of the zapping button 305 or 405 is awaited instead.
  • when the zapping button 305 or 405 is pressed, control is passed to step S 516, and the zapping stream configuring operation is performed.
  • FIG. 23 is a flowchart of the zapping stream configuring process (process in step S 525 shown in FIG. 5) performed by the data receiving apparatus 201 according to the third embodiment of the present invention.
  • the third embodiment aims at making the replay time of the partial data acquired from each of the streaming contents A to J constant.
  • in FIG. 23, processes similar to those shown in FIG. 7 are assigned the same reference numerals, and their detailed explanation is omitted here.
  • upon receipt of data from the contents distribution server 203 in step S 702, the streaming contents reconfiguration unit 103 repeats the data request in step S 701 and the data reception in step S 702 until the amount of received data of each streaming content reaches the amount corresponding to a predetermined replay time (step S 903).
  • the amount of received data can be determined according to the bit rate information contained in the received contents data or the bit rate information possibly contained in the contents related information data acquired in advance. For example, when the amount of data acquired from each of the streaming contents A to J for configuration of a zapping stream is set to three minutes, the replay time is multiplied by the bit rate to calculate the necessary amount of data, and the determination in step S 903 is made accordingly, as in the worked example below.
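  • A worked example of that calculation, under the assumption of a constant bit rate reported with the content:

```python
def required_bytes(bit_rate_bps, replay_seconds):
    """Amount of data needed to hold replay_seconds of a stream encoded at
    bit_rate_bps (the determination made in step S 903)."""
    return bit_rate_bps // 8 * replay_seconds

# Three minutes of a 2 Mbit/s stream needs about 45 Mbytes of partial data.
print(required_bytes(2_000_000, 180))   # 45000000
```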
  • in the fourth embodiment, another form of switching control is performed from the partial data of the zapping stream to the streaming contents distributed by the contents distribution server 203.
  • a frame or field with a low brightness level is selected from the decoded zapping stream video data, and the timing of switching from the zapping stream to the streaming contents received from the contents distribution server 203 is determined accordingly.
  • FIG. 25 shows the contents replay process according to the fourth embodiment of the present invention. Since the process performed when the continuous service hour is T 1 or less is the same as that explained above by referring to FIG. 15, the explanation is omitted here.
  • as explained above, if it is determined in step S 106 shown in FIG. 15 that the continuous service hour has exceeded a predetermined time T 1, control is passed to the process shown in FIG. 25 on the assumption that the streaming contents beyond the data portion (for example, the partial data A′ shown in FIG. 11) of the zapping stream 811 will continue to be watched.
  • the CPU 104 checks the load of the decoding operation of the video decoder 105 in step S 202 . In step S 202 , whether or not the video decoder 105 can decode the streaming contents of the content A to be received hereinafter is checked.
  • if one of the first video decoder 105 a and the second video decoder 105 b of the video decoder 105 (let us assume the first video decoder 105 a) is decoding the partial data A′ of the zapping stream 811 and the other (the second video decoder 105 b) is not decoding data, then it is determined in step S 203 that data can be decoded additionally.
  • in step S 1101, a frame or field of decoded video data is detected using the picture data detection/comparison unit 119 and is compared with the video data of the preceding frame or field.
  • the comparison of the video data is performed to detect a frame or a field of a lower brightness level.
  • in step S 1102, if it is determined that the data of the newly detected frame or field is lower in brightness level than the picture data of the preceding frame or field, then the time-stamp value of that frame or field is stored in step S 1103, and control is passed to step S 1104. If the brightness level is not determined to be lower, control is passed directly to step S 1104.
  • it is determined in step S 1104 whether or not the process of detecting a frame or field and comparing it with the picture data of the previous frame or field has been completed for the partial data (for example, the partial data A′) of the zapping stream 811 at and after the contents switch probable point T 2. If not, control is returned to step S 1101 and the processes in steps S 1101 to S 1104 are repeated. If YES, control is passed to step S 1105, as in the sketch below.
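  • The frame-to-frame brightness comparison of steps S 1101 to S 1104 can be sketched as follows; frames are represented as (timestamp, mean luma) pairs, whereas the real picture data detection/comparison unit 119 would work on decoded pixel data.

```python
def dark_switch_timestamp(frames):
    """Keep the time stamp of a frame whose brightness dropped below that of the
    preceding frame, so the window switch can be scheduled on a dark frame."""
    switch_ts = None
    prev_luma = None
    for ts, luma in frames:
        if prev_luma is not None and luma < prev_luma:   # darker than the preceding frame
            switch_ts = ts                               # remember this candidate point
        prev_luma = luma
    return switch_ts

# Example: the drop at t=120 is the last one seen, so the switch targets t=120.
print(dark_switch_timestamp([(118, 90.0), (119, 95.5), (120, 12.3), (121, 60.0)]))  # 120
```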
  • in step S 1105, the contents distribution server 203 is requested to transmit the data of the content A from the point (for example, T 1 shown in FIG. 17) corresponding to the partial data A′ being replayed.
  • the contents distribution server 203 starts transmitting data to the data receiving apparatus 201 ′, and the data receiving apparatus 201 ′ receives data by the communications control unit 100 .
  • the data received by the communications control unit 100 is temporarily buffered in an area of the buffer 101 assigned for contents data.
  • in step S 1106, the decoder switch control unit 117 switches the input data for the plurality of decoders (the first video decoder 105 a and the second video decoder 105 b) in the video decoder 105.
  • in step S 1107, the synchronous control unit 118 acquires the time-stamp information contained in each piece of contents data decoded by the decoders 105 a and 105 b of the video decoder 105, and compares it with the time-stamp value stored in step S 1103.
  • control is passed to step S 208 , and the display is switched.
  • the switching control in step S 208 is the same as that according to the first embodiment, and the detailed explanation is omitted here.
  • if it is determined in step S 203 that the data cannot be further decoded, then the processes in steps S 209 to S 212 are performed as in the first embodiment explained by referring to FIG. 16.
  • display windows can thus be switched more smoothly and less conspicuously, at the timing of a frame or field with a low brightness level.
  • FIG. 26 is a block diagram of the entire configuration of the data receiving apparatus 201 ′′ according to the fifth embodiment of the present invention. It is different from FIG. 1 in that a movement detection/comparison unit 120 is added, but other configuration is the same as that shown in FIG. 1. Therefore, the same reference numerals are assigned, and the explanation is omitted here.
  • FIG. 27 shows the contents replay process according to the fifth embodiment. Since the process in which the continuous service hour is T 1 or less is the same as that shown in FIG. 15, the explanation is omitted here.
  • as explained above, if it is determined in step S 106 shown in FIG. 15 that the continuous service hour has exceeded the predetermined time T 1, it is considered that the streaming contents (information data) beyond the data portion (the partial data A′ shown in FIG. 11) of the zapping stream 811 will continue to be watched, and control is passed to the process in FIG. 27.
  • the CPU 104 checks the load of the decoding operation of the video decoder 105 in step S 202 . In step S 202 , whether or not the video decoder 105 can decode the streaming contents of the content A to be received hereinafter is checked.
  • if one of the first video decoder 105 a and the second video decoder 105 b of the video decoder 105 (let us assume the first video decoder 105 a) is decoding the partial data A′ of the zapping stream 811 and the other (the second video decoder 105 b) is not decoding data, then it is determined in step S 203 that data can be decoded additionally.
  • decoding of the partial data A′ of the zapping stream 811 is started using the second video decoder 105 b from the contents display switch probable point (for example, T 2 shown in FIG. 17) in step S 1300 .
  • in step S 1301, a movement vector is detected by the movement detection/comparison unit 120 using a frame or field of decoded video data and its preceding frame or field, and is compared with the movement vector obtained in the preceding detection process.
  • the comparison of the video data is performed to detect a frame or field with a smaller movement vector.
  • in step S 1302, if it is determined that the newly detected movement vector is smaller than the movement vector obtained in the preceding detection process, then the time-stamp value of that frame or field is stored in step S 1303, and control is passed to step S 1304. If the new movement vector is not determined to be smaller, control is passed directly to step S 1304.
  • it is determined in step S 1304 whether or not the process of detecting and comparing movement vectors has been completed for the partial data (for example, the partial data A′) of the zapping stream 811 at and after the contents switch probable point T 2. If not, control is returned to step S 1301 and the processes in steps S 1301 to S 1304 are repeated. If YES, control is passed to step S 1305, as in the sketch below.
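  • The analogous motion-based comparison of steps S 1301 to S 1304 can be sketched in the same way; frames are given as (timestamp, motion magnitude) pairs, whereas the real movement detection/comparison unit 120 would derive the magnitude from detected movement vectors.

```python
def low_motion_switch_timestamp(frames):
    """Keep the time stamp of a frame whose movement is smaller than that found in
    the preceding detection, so the switch can happen on a nearly still frame."""
    switch_ts = None
    prev_motion = None
    for ts, motion in frames:
        if prev_motion is not None and motion < prev_motion:   # less movement than before
            switch_ts = ts
        prev_motion = motion
    return switch_ts

# Example: motion keeps falling until t=202, so the switch targets t=202.
print(low_motion_switch_timestamp([(200, 8.0), (201, 5.0), (202, 1.5), (203, 4.0)]))  # 202
```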
  • in step S 1305, the contents distribution server 203 is requested to transmit the data of the content A from the point (for example, T 1 shown in FIG. 17) corresponding to the partial data A′ being replayed.
  • the contents distribution server 203 starts transmitting data to the data receiving apparatus 201 ′′, and the data receiving apparatus 201 ′′ receives data by the communications control unit 100 .
  • the data received by the communications control unit 100 is temporarily buffered in an area of the buffer 101 assigned for contents data.
  • in step S 1306, the decoder switch control unit 117 switches the input data for the plurality of decoders (the first video decoder 105 a and the second video decoder 105 b) in the video decoder 105.
  • in step S 1307, the synchronous control unit 118 acquires the time-stamp information contained in each piece of contents data decoded by the decoders 105 a and 105 b of the video decoder 105, and compares it with the time-stamp value stored in step S 1303.
  • control is passed to step S 208 , and the display is switched.
  • the switching control in step S 208 is the same as that according to the first embodiment, and the detailed explanation is omitted here.
  • if it is determined in step S 203 that the data cannot be further decoded, then the processes in steps S 209 to S 212 are performed as in the first embodiment explained by referring to FIG. 16.
  • display windows can thus be switched more smoothly and less conspicuously, at the timing of a frame or field with little movement.
  • the invention can be implemented by supplying a software program, which implements the functions of the foregoing embodiments, directly or indirectly to a system or apparatus, reading the supplied program code with a computer of the system or apparatus, and then executing the program code. In this case, so long as the system or apparatus has the functions of the program, the mode of implementation need not rely upon a program.
  • the program may be executed in any form, e.g., as object code, a program executed by an interpreter, or script data supplied to an operating system.
  • Examples of storage media that can be used for supplying the program are a floppy disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, magnetic tape, a non-volatile memory card, a ROM, and a DVD (DVD-ROM and DVD-R).
  • a client computer can be connected to a website on the Internet using a browser of the client computer, and the computer program of the present invention or an automatically-installable compressed file of the program can be downloaded to a recording medium such as a hard disk.
  • the program of the present invention can be supplied by dividing the program code constituting the program into a plurality of files and downloading the files from different websites.
  • a WWW (World Wide Web)
  • in the foregoing description, partial data and information data are data of a program series having portions that are continuous in time, such as a part of the pictures in a program and the remaining pictures.
  • any partial data and information data which are related to each other can be applied to the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
US10/717,560 2002-11-29 2003-11-21 Receiving apparatus and method Abandoned US20040117830A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/179,689 US8074244B2 (en) 2002-11-29 2008-07-25 Receiving apparatus and method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2002-348722 2002-11-29
JP2002348722 2002-11-29
JP2003-349769 2003-10-08
JP2003349769A JP4408677B2 (ja) 2002-11-29 2003-10-08 Receiving apparatus and receiving method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/179,689 Division US8074244B2 (en) 2002-11-29 2008-07-25 Receiving apparatus and method

Publications (1)

Publication Number Publication Date
US20040117830A1 true US20040117830A1 (en) 2004-06-17

Family

ID=32510592

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/717,560 Abandoned US20040117830A1 (en) 2002-11-29 2003-11-21 Receiving apparatus and method
US12/179,689 Expired - Fee Related US8074244B2 (en) 2002-11-29 2008-07-25 Receiving apparatus and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/179,689 Expired - Fee Related US8074244B2 (en) 2002-11-29 2008-07-25 Receiving apparatus and method

Country Status (2)

Country Link
US (2) US20040117830A1 (ja)
JP (1) JP4408677B2 (ja)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050044112A1 (en) * 2003-08-19 2005-02-24 Canon Kabushiki Kaisha Metadata processing method, metadata storing method, metadata adding apparatus, control program and recording medium, and contents displaying apparatus and contents imaging apparatus
US20050091700A1 (en) * 2003-10-22 2005-04-28 Canon Kabushiki Kaisha Data receiving-processing apparatus
US20050130613A1 (en) * 2003-12-11 2005-06-16 Canon Kabushiki Kaisha Program selecting apparatus
US20050160462A1 (en) * 2003-12-11 2005-07-21 Canon Kabushiki Kaisha Signal generating method, program, and storing apparatus for automatically storing broadcast programs
US20050166242A1 (en) * 2003-12-15 2005-07-28 Canon Kabushiki Kaisha Visual communications system and method of controlling the same
US20060239299A1 (en) * 2004-01-08 2006-10-26 Albrecht Scheid Extra error correcting method for zapping stream ts packet
US20060284810A1 (en) * 2005-06-15 2006-12-21 Canon Kabushiki Kaisha Image Display Method and Image Display Apparatus
US20060285034A1 (en) * 2005-06-15 2006-12-21 Canon Kabushiki Kaisha Image Display Method and Image Display Apparatus
US20070188659A1 (en) * 2006-02-13 2007-08-16 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20080178217A1 (en) * 2003-06-20 2008-07-24 Canon Kabushiki Kaisha Image Display Method and Program
US20090116821A1 (en) * 2005-08-15 2009-05-07 Canon Kabushiki Kaisha Reproduction control method, reproduction apparatus, and television set
US20090168902A1 (en) * 2005-04-06 2009-07-02 Matsushita Electric Industrial Co., Ltd. Method for arranging zapping streams in mpe-fec frame and receiver
US7716696B2 (en) 2003-12-15 2010-05-11 Canon Kabushiki Kaisha Television receiver, information processing method and program
US20100238996A1 (en) * 2007-11-16 2010-09-23 Panasonic Corporation Mobile terminal and video output method
CN104035674A (zh) * 2014-05-23 2014-09-10 小米科技有限责任公司 Picture display method and device
CN106648320A (zh) * 2016-12-20 2017-05-10 天脉聚源(北京)传媒科技有限公司 Method and device for dynamic prompting in picture polling display

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006027846A1 (ja) * 2004-09-10 2006-03-16 Matsushita Electric Industrial Co., Ltd. Zapping stream generation apparatus and method thereof
JP4534997B2 (ja) * 2006-02-13 2010-09-01 Sony Corporation Transmission/reception system, receiving apparatus, and receiving method
FR2898236A1 (fr) * 2006-03-03 2007-09-07 Thomson Licensing SAS Method for transmitting audiovisual streams by anticipating user commands, and receiver and transmitter for implementing the method
WO2008069032A1 (ja) * 2006-11-28 2008-06-12 Nec Corporation Moving image distribution system, moving image distribution apparatus, and moving image distribution method
JP2008252476A (ja) * 2007-03-30 2008-10-16 Nec Personal Products Co Ltd Digital broadcast receiver
JP5101938B2 (ja) * 2007-07-05 2012-12-19 NTI Co., Ltd. Client apparatus, content receiving method, management apparatus, content distribution management method, and program
JP5230267B2 (ja) * 2008-05-27 2013-07-10 Canon Kabushiki Kaisha Device control apparatus and control method
JP5243871B2 (ja) * 2008-07-18 2013-07-24 Sharp Corporation Video reproduction apparatus
JP2010187206A (ja) * 2009-02-12 2010-08-26 Canon Inc Recording/reproducing apparatus and control method thereof
JP6253036B2 (ja) * 2016-03-31 2017-12-27 Sharp Corporation Content processing apparatus, television receiving apparatus, information processing method in content processing apparatus, and program

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5933192A (en) * 1997-06-18 1999-08-03 Hughes Electronics Corporation Multi-channel digital video transmission receiver with improved channel-changing response
US6115080A (en) * 1998-06-05 2000-09-05 Sarnoff Corporation Channel selection methodology in an ATSC/NTSC television receiver
US20020019982A1 (en) * 2000-08-10 2002-02-14 Shuntaro Aratani Data processing apparatus, data processing system, television signal receiving apparatus, and printing apparatus
US20020021373A1 (en) * 2000-08-04 2002-02-21 Yoshikazu Shibamiya Signal receiving apparatus, remote controller, signal receiving system, and apparatus to be controlled
US20020089610A1 (en) * 2000-12-26 2002-07-11 Tomoyuki Ohno Broadcast receiver, broadcast reception method, digital TV broadcast receiver, external terminal, broadcast receiver control system, and storage medium
US20020138829A1 (en) * 2001-03-06 2002-09-26 Canon Kabushiki Kaisha Receiving apparatus and method thereof, and storage medium therefor
US20050081244A1 (en) * 2003-10-10 2005-04-14 Barrett Peter T. Fast channel change

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2989680B2 (ja) * 1991-02-15 1999-12-13 Yamada Manufacturing Co., Ltd. Tilt and telescopic steering apparatus
US6611607B1 (en) * 1993-11-18 2003-08-26 Digimarc Corporation Integrating digital watermarks in multimedia content
JP3371186B2 (ja) 1995-11-27 2003-01-27 Sony Corporation Video data distribution system and video data receiving apparatus
ATE331390T1 (de) * 1997-02-14 2006-07-15 Univ Columbia Objektbasiertes audiovisuelles endgerät und entsprechende bitstromstruktur
WO1998056176A1 (en) * 1997-06-03 1998-12-10 Koninklijke Philips Electronics N.V. Navigating through television programs
US6295646B1 (en) * 1998-09-30 2001-09-25 Intel Corporation Method and apparatus for displaying video data and corresponding entertainment data for multiple entertainment selection sources
US6204887B1 (en) * 1998-12-11 2001-03-20 Hitachi America, Ltd. Methods and apparatus for decoding and displaying multiple images using a common processor
US7142777B1 (en) 1999-02-23 2006-11-28 Canon Kabushiki Kaisha Recording and reproducing apparatus and method generating recording location table for plurality of programs received in multiplexed data train
US6425129B1 (en) * 1999-03-31 2002-07-23 Sony Corporation Channel preview with rate dependent channel information
JP4046886B2 (ja) 1999-04-02 2008-02-13 Canon Kabushiki Kaisha Recording apparatus and method of controlling recording apparatus
JP3721009B2 (ja) 1999-05-31 2005-11-30 Ricoh Co., Ltd. Recording and viewing system
US6931660B1 (en) * 2000-01-28 2005-08-16 Opentv, Inc. Interactive television system and method for simultaneous transmission and rendering of multiple MPEG-encoded video streams
JP2002091863A (ja) 2000-09-12 2002-03-29 Sony Corp Information providing method
JP4132630B2 (ja) 2000-10-11 2008-08-13 NEC Corporation Broadcast system and method therefor
US20020066101A1 (en) * 2000-11-27 2002-05-30 Gordon Donald F. Method and apparatus for delivering and displaying information for a multi-layer user interface
US7174512B2 (en) * 2000-12-01 2007-02-06 Thomson Licensing S.A. Portal for a communications system
US6931449B2 (en) * 2001-03-22 2005-08-16 Sun Microsystems, Inc. Method migrating open network connections
US20020194608A1 (en) * 2001-04-26 2002-12-19 Goldhor Richard S. Method and apparatus for a playback enhancement system implementing a "Say Again" feature
US7873972B2 (en) * 2001-06-01 2011-01-18 Jlb Ventures Llc Method and apparatus for generating a mosaic style electronic program guide
US20040049788A1 (en) 2002-09-10 2004-03-11 Canon Kabushiki Kaisha Receiving apparatus, receiving method, and method of predicting audience rating
JP4298282B2 (ja) 2002-12-13 2009-07-15 Canon Kabushiki Kaisha Display control apparatus and control method thereof
KR100998899B1 (ko) * 2003-08-30 2010-12-09 LG Electronics Inc. Thumbnail image service method and broadcast receiver

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5933192A (en) * 1997-06-18 1999-08-03 Hughes Electronics Corporation Multi-channel digital video transmission receiver with improved channel-changing response
US6115080A (en) * 1998-06-05 2000-09-05 Sarnoff Corporation Channel selection methodology in an ATSC/NTSC television receiver
US20020021373A1 (en) * 2000-08-04 2002-02-21 Yoshikazu Shibamiya Signal receiving apparatus, remote controller, signal receiving system, and apparatus to be controlled
US20020019982A1 (en) * 2000-08-10 2002-02-14 Shuntaro Aratani Data processing apparatus, data processing system, television signal receiving apparatus, and printing apparatus
US20020089610A1 (en) * 2000-12-26 2002-07-11 Tomoyuki Ohno Broadcast receiver, broadcast reception method, digital TV broadcast receiver, external terminal, broadcast receiver control system, and storage medium
US20020138829A1 (en) * 2001-03-06 2002-09-26 Canon Kabushiki Kaisha Receiving apparatus and method thereof, and storage medium therefor
US20050081244A1 (en) * 2003-10-10 2005-04-14 Barrett Peter T. Fast channel change

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080178217A1 (en) * 2003-06-20 2008-07-24 Canon Kabushiki Kaisha Image Display Method and Program
US7620910B2 (en) 2003-06-20 2009-11-17 Canon Kabushiki Kaisha Image display method and program with usage of numeric keys and cursor keys
US20050044112A1 (en) * 2003-08-19 2005-02-24 Canon Kabushiki Kaisha Metadata processing method, metadata storing method, metadata adding apparatus, control program and recording medium, and contents displaying apparatus and contents imaging apparatus
US7599960B2 (en) 2003-08-19 2009-10-06 Canon Kabushiki Kaisha Metadata processing method, metadata storing method, metadata adding apparatus, control program and recording medium, and contents displaying apparatus and contents imaging apparatus
US7817301B2 (en) 2003-10-22 2010-10-19 Canon Kabushiki Kaisha Data receiving-processing apparatus
US20050091700A1 (en) * 2003-10-22 2005-04-28 Canon Kabushiki Kaisha Data receiving-processing apparatus
US20050160462A1 (en) * 2003-12-11 2005-07-21 Canon Kabushiki Kaisha Signal generating method, program, and storing apparatus for automatically storing broadcast programs
US20050130613A1 (en) * 2003-12-11 2005-06-16 Canon Kabushiki Kaisha Program selecting apparatus
US20050166242A1 (en) * 2003-12-15 2005-07-28 Canon Kabushiki Kaisha Visual communications system and method of controlling the same
US7716696B2 (en) 2003-12-15 2010-05-11 Canon Kabushiki Kaisha Television receiver, information processing method and program
US7536707B2 (en) 2003-12-15 2009-05-19 Canon Kabushiki Kaisha Visual communications system and method of controlling the same
US20060239299A1 (en) * 2004-01-08 2006-10-26 Albrecht Scheid Extra error correcting method for zapping stream ts packet
US20090168902A1 (en) * 2005-04-06 2009-07-02 Matsushita Electric Industrial Co., Ltd. Method for arranging zapping streams in mpe-fec frame and receiver
US20060284810A1 (en) * 2005-06-15 2006-12-21 Canon Kabushiki Kaisha Image Display Method and Image Display Apparatus
US7586491B2 (en) 2005-06-15 2009-09-08 Canon Kabushiki Kaisha Image display method and image display apparatus
US7808555B2 (en) 2005-06-15 2010-10-05 Canon Kabushiki Kaisha Image display method and image display apparatus with zoom-in to face area of still image
US20060285034A1 (en) * 2005-06-15 2006-12-21 Canon Kabushiki Kaisha Image Display Method and Image Display Apparatus
US20090116821A1 (en) * 2005-08-15 2009-05-07 Canon Kabushiki Kaisha Reproduction control method, reproduction apparatus, and television set
US8532473B2 (en) 2005-08-15 2013-09-10 Canon Kabushiki Kaisha Reproduction control method, reproduction apparatus, and television set
US20070188659A1 (en) * 2006-02-13 2007-08-16 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US8701006B2 (en) 2006-02-13 2014-04-15 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20100238996A1 (en) * 2007-11-16 2010-09-23 Panasonic Corporation Mobile terminal and video output method
CN104035674A (zh) * 2014-05-23 2014-09-10 Xiaomi Inc. Picture display method and device
CN106648320A (zh) * 2016-12-20 2017-05-10 TVMining (Beijing) Media Technology Co., Ltd. Method and device for dynamic prompting in polled picture display

Also Published As

Publication number Publication date
US8074244B2 (en) 2011-12-06
JP2004194294A (ja) 2004-07-08
JP4408677B2 (ja) 2010-02-03
US20080282292A1 (en) 2008-11-13

Similar Documents

Publication Publication Date Title
US8074244B2 (en) Receiving apparatus and method
US10462530B2 (en) Systems and methods for providing a multi-perspective video display
US10547880B2 (en) Information processor, information processing method and program
US9055328B2 (en) Network-based service to provide on-demand video summaries of television programs
US8918800B2 (en) Receiving apparatus and receiving method, broadcasting apparatus and broadcasting method, information processing apparatus and information processing method, bidirectional communication system and bidirectional communication method, and providing medium
EP1528809B1 (en) Interactivity with audiovisual programming
US8015584B2 (en) Delivering interactive content to a remote subscriber
KR100950111B1 (ko) MPEG-4 remote communication device
KR100575995B1 (ko) Receiving apparatus
KR101965806B1 (ko) Content reproduction apparatus, content reproduction method, content reproduction program, and content providing system
US20090178092A1 (en) Video picture information delivering apparatus and receiving apparatus
JP2004357184A (ja) Information processing apparatus, information processing method, and computer program
JPH11103452A (ja) Interaction and screen control method in interactive programs
KR100406631B1 (ko) Method and apparatus for providing product information via a broadcast signal and acquiring product information therefrom
US8166503B1 (en) Systems and methods for providing multiple video streams per audio stream
US20100088724A1 (en) Broadcast program display apparatus and method
JP4421666B1 (ja) Content receiving apparatus and content reproduction method
JPH11205708A (ja) Digital broadcast receiving system
JP2005278123A (ja) Video receiving apparatus, program for causing a computer to function as the video receiving apparatus, video providing apparatus, and program for causing a computer to function as the video providing apparatus
JP2004363914A (ja) Video viewing control system, video viewing control method, video-related information management server, viewing scene selection terminal, programs therefor, and recording media storing the programs
JP3886892B2 (ja) Moving image storage apparatus
JP2010109993A (ja) Content receiving apparatus and content reproduction method
KR20110115837A (ko) Electronic program guide display apparatus and method

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OHNO, TOMOYUKI;SHIBAMIYA, YOSHIKAZU;MATSUMOTO, YUICHI;REEL/FRAME:014729/0650

Effective date: 20031113

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION