US20120201515A1 - Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method


Info

Publication number
US20120201515A1
US20120201515A1
Authority
US
United States
Prior art keywords
program
video
information
display
descriptor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/327,224
Other languages
English (en)
Inventor
Takashi Kanemaru
Sadao Tsuruga
Satoshi Otsuka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Consumer Electronics Co Ltd
Original Assignee
Hitachi Consumer Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Consumer Electronics Co Ltd
Assigned to HITACHI CONSUMER ELECTRONICS CO., LTD. reassignment HITACHI CONSUMER ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TSURUGA, SADAO, OTSUKA, SATOSHI, KANEMARU, TAKASHI
Publication of US20120201515A1
Priority to US14/023,319 (published as US9094668B2)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/005Reproducing at a different information rate from the information rate of recording
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/327Table of contents
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/161Encoding, multiplexing or demultiplexing different image signal components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/178Metadata, e.g. disparity information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334Recording operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g 3D video
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/775Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/78Television signal recording using magnetic recording
    • H04N5/782Television signal recording using magnetic recording on tape
    • H04N5/783Adaptations for reproducing at a rate different from the recording rate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H04N9/8211Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a sound signal

Definitions

  • the technical field of the present invention relates to a broadcast receiving apparatus, a receiving method, and a receiving/transmitting method for three-dimensional (hereinafter, “3D”) video.
  • Patent Document 1 describes, as a problem to be solved, “to provide a digital broadcast receiving apparatus enabling the user to be actively notified that a program desired by the user will start on a certain channel, etc.” (see [0005] of the Patent Document 1), and as a solving means thereof, “comprising a means for extracting program information included in a digital broadcast wave and for selecting a target program to be announced using selection information registered by the user, and a means for displaying a message informing of the existence of the selected target program, cutting into the screen being presently displayed” (see [0006] of the Patent Document 1).
  • Patent Document 2 describes, as a problem to be solved, “to provide a video data producing apparatus allowing the video data the flexibility required for 3D display, and a video data reproducing apparatus for reproducing that data” (see the Patent Document 2), and as a solving means thereof, “a video data recording apparatus comprises a 3D display control information producing portion for inputting and encoding a parameter representing a condition at the time of photographing 3D pictures, and a file producing portion for producing a multimedia information file including both the photographing condition information and the 3D video data, or at least one of the 3D video data and the 2D video data” (see [0015] of the Patent Document 2), etc.
  • Patent Document 3 describes, as a problem to be solved, “to provide a reproducing apparatus for preserving stereoscopic display of graphics when random access is performed” (see the abstract of the Patent Document 3), and as a solving means thereof, that “each of a left-view graphic stream and a right-view graphic stream, which are included in a digital stream received from a distributing apparatus, includes one (1) or more display set(s) (hereinafter, called “DS”), and each DS is a group of data to be used for displaying a graphic object for one (1) screen.
  • One (1) or more DS(s) included in the left-view graphic stream mentioned above correspond(s) one-to-one to one (1) or more DS(s) included in the right-view graphic stream mentioned above, and in each pair of corresponding DSs the same reproduction time is set up on the reproduction time axis of the video stream mentioned above.
  • the above-mentioned DS includes therein either all of the data necessary for displaying the graphic objects for one (1) screen, or condition information indicating whether or not it differs from the DS just before, and the condition information included in the corresponding DSs indicates the same contents” (see the abstract of the Patent Document 3).
  • In the Patent Document 1 there is no disclosure relating to viewing/listening of 3D content. For that reason, there is a problem to be solved, i.e., it cannot be recognized whether the program, which a receiver is receiving at present or will receive in the future, is a 3D program or not.
  • In the Patent Document 2 there is no disclosure, in particular, from the viewpoint of how to present the information of the recorded 3D method to the user. For that reason, there is a problem to be solved, i.e., it is impossible to fully enjoy the advantages of the receiving apparatus.
  • FIG. 1 is a block diagram for showing an example of the structure of a system
  • FIG. 2 is a block diagram for showing the structure of a transmitting apparatus 1 ;
  • FIG. 3 is a view for showing an example of assignment for each stream format classification
  • FIG. 4 is a view for showing the structure of a component descriptor
  • FIG. 5A is a view for showing an example of component content and component classification, as the constituent elements of the component descriptor
  • FIG. 5B is a view for showing an example of component content and component classification, as the constituent elements of the component descriptor
  • FIG. 5C is a view for showing an example of component content and component classification, as the constituent elements of the component descriptor
  • FIG. 5D is a view for showing an example of component content and component classification, as the constituent elements of the component descriptor
  • FIG. 5E is a view for showing an example of component content and component classification, as the constituent elements of the component descriptor
  • FIG. 6 is a view for showing an example of the structure of a component group descriptor
  • FIG. 7 is a view for showing an example of a component group classification
  • FIG. 8 is a view for showing an example of component group identification
  • FIG. 9 is a view for showing an example of counting unit identification
  • FIG. 10A is a view for showing an example of the structure of a 3D program details descriptor
  • FIG. 10B is a view for showing an example of a 3D/2D classification
  • FIG. 11 is a view for showing an example of a format classification of 3D
  • FIG. 12 is a view for showing an example of the structure of a service descriptor
  • FIG. 13 is a view for showing an example of a service format classification
  • FIG. 14 is a view for showing an example of the structure of a service list descriptor
  • FIG. 15 is a view for showing an example of a transmission operation regulation of the component descriptor in the transmitting apparatus 1 ;
  • FIG. 16 is a view for showing an example of a transmission operation regulation of the component group descriptor in the transmitting apparatus 1 ;
  • FIG. 17 is a view for showing an example of a transmission operation regulation of the 3D program details descriptor in the transmitting apparatus 1 ;
  • FIG. 18 is a view for showing an example of a transmission operation regulation of the service descriptor in the transmitting apparatus 1 ;
  • FIG. 19 is a view for showing an example of a transmission operation regulation of the service list descriptor in the transmitting apparatus 1 ;
  • FIG. 20 is a view for showing an example of a process for each field of the component descriptor, in a receiving apparatus 4 ;
  • FIG. 21 is a view for showing an example of a process for each field of the component group descriptor, in a receiving apparatus 4 ;
  • FIG. 22 is a view for showing an example of a process for each field of the 3D program details descriptor, in a receiving apparatus 4 ;
  • FIG. 23 is a view for showing an example of a process for each field of the service descriptor, in a receiving apparatus 4 ;
  • FIG. 24 is a view for showing an example of a process for each field of the service list descriptor, in a receiving apparatus 4 ;
  • FIG. 25 is a view for showing an example of the structure of a receiving apparatus according to the present invention.
  • FIG. 26 is a view for showing an example of an outlook block diagram of internal functions inside a CPU in the receiving apparatus according to the present invention.
  • FIG. 27 is a view for showing an example of a flowchart of a 2D/3D video displaying process upon basis of whether a next program is 3D content or not;
  • FIG. 28 is a view for showing an example of a message display
  • FIG. 29 is a view for showing an example of a message display
  • FIG. 30 is a view for showing an example of a message display
  • FIG. 31 is a view for showing an example of a message display
  • FIG. 32 is a view for showing an example of a flowchart of a system controller portion when the next program starts;
  • FIG. 33 is a view for showing an example of a message display
  • FIG. 34 is a view for showing an example of a message display
  • FIG. 35 is a block diagram for showing an example of the structure of a system
  • FIG. 36 is a block diagram for showing an example of the structure of a system
  • FIGS. 37A and 37B are views for explaining an example of a 3D reproducing/outputting/displaying process of the 3D content
  • FIG. 38 is a view for explaining an example of a 2D reproducing/outputting/displaying process of the 3D content
  • FIGS. 39A and 39B are views for explaining an example of a 3D reproducing/outputting/displaying process of the 3D content
  • FIGS. 40A to 40D are views for explaining an example of a 2D reproducing/outputting/displaying process of the 3D content
  • FIG. 41 is a view for showing an example of a flowchart of a 2D/3D video displaying process upon basis of whether a present program is 3D content or not;
  • FIG. 42 is a view for showing an example of a message display
  • FIG. 43 is a view for showing an example of a flowchart of a display process after user selection
  • FIG. 44 is a view for showing an example of a message display
  • FIG. 45 is a view for showing an example of a flowchart of a 2D/3D video displaying process upon basis of whether a present program is 3D content or not;
  • FIG. 46 is a view for showing an example of a message display
  • FIG. 47 is a view for showing an example of combination of streams when transmitting the 3D video
  • FIG. 48 is a view for showing an example of display of a program table
  • FIG. 49 is a view for showing an example of display of a program table
  • FIG. 50 is a view for showing an example of a message display
  • FIG. 51 is a view for showing a flowchart when displaying a message for a 3D method that is not yet supported;
  • FIGS. 52A to 52C are views for showing an example of a message display
  • FIG. 53 is a view for showing an example of display of a program
  • FIG. 54 is a view for showing an example of the structure of a content descriptor
  • FIG. 55 is a view for showing an example of an encode table for program genres
  • FIG. 56 is a view for showing an example of an encode table for program characteristics
  • FIG. 57 is a view for showing an example of an encode table for program characteristics
  • FIGS. 58A and 58B are block diagrams for showing an example of a record/reproduce portion 27 ;
  • FIG. 59 is a view for showing an example of the structure of a folder file of data to be recorded onto a recording medium 26 ;
  • FIG. 60 is a view for showing an example of the structure of a folder file of data to be recorded onto a recording medium 26 ;
  • FIG. 61 is a view for showing an example of a flowchart when recording 2D/3D identification information
  • FIG. 62 is a view for showing an example of a flowchart when recording 2D/3D identification information
  • FIG. 63 is a view for showing an example of a flowchart when recording 2D/3D identification information
  • FIG. 64A is a view for showing an example of the structure of a folder file of data to be recorded onto a recording medium 26 ;
  • FIG. 64B is a view for showing an example of the structure of a folder file of data to be recorded onto a recording medium 26 ;
  • FIGS. 65A and 65B are views for showing an example of 3D information display when displaying a list of the programs, which are recorded on the recording medium 26 ;
  • FIG. 66 is a view for showing an example of 3D information display when displaying a list of the programs, which are recorded on the recording medium 26 ;
  • FIG. 67 is a view for showing an example of 3D information display when displaying a list of the programs, which are recorded on the recording medium 26 ;
  • FIGS. 68A and 68B are views for showing an example of 3D information display when displaying a list of the programs, which are recorded on the recording medium 26 ;
  • FIG. 69 is a view for showing an example of 3D information display when displaying a list of the programs, which are recorded on the recording medium 26 ;
  • FIG. 70 is a view for showing an example of 3D information display when producing a play list
  • FIG. 71 is a view for showing an example of the structure of a local network
  • FIGS. 72A and 72B are views for showing an example of 3D information display when displaying a list of the programs, which are recorded on the recording medium 26 ;
  • FIG. 73 is a view for showing an example of a flowchart when transmitting program information of a 3D program to network-connected equipment;
  • FIGS. 74A and 74B are views for showing an example of 3D information display when executing dubbing
  • FIG. 75 is a view for showing an example of picture structure of the video data and video display when executing a high-speed search
  • FIG. 76A is a view for showing an example of the structure of a folder file of data to be recorded onto a recording medium 26 ;
  • FIG. 76B is a view for showing an example of the structure of a folder file of data to be recorded onto a recording medium 26 ;
  • FIG. 77 is a view for showing an example of picture structure of the video data and video display when executing a high-speed search
  • FIG. 78 is a view for showing an example of a flowchart for determining a method for controlling a special reproduction
  • FIG. 79 is a view for showing an example of a flowchart when starting a special reproduction of 3D video
  • FIG. 80 is a view for showing an example of a flowchart for starting the method for controlling the special reproduction
  • FIG. 81 is a view for showing an example of a flowchart when starting a special reproduction of 3D video
  • FIG. 82 is a block diagram for showing an example of the structure of a system.
  • FIG. 83 is a view for showing an example of a flowchart for determining a method for controlling a special reproduction.
  • FIG. 1 is a block diagram for showing an example of the structure of a system according to the present embodiment.
  • in the present embodiment, information is transmitted/received through broadcasting.
  • however, this should not be limited to broadcasting; it may also be VOD through communication, and both may be called “distributing”, collectively.
  • a reference numeral 1 depicts a transmitting apparatus provided in an information providing station, such as a broadcasting station, etc.; 2 a relay apparatus provided in a relay station or a satellite for use of broadcasting, etc.; 3 a network, which may be a public network connecting a general home and a broadcasting station, such as the Internet, etc., or may be provided within the house of a user; and 10 a receiving record reproducing portion, which is built into the receiving apparatus 4. With the receiving record reproducing portion 10, it is possible to record/reproduce the broadcast information, or to reproduce content from an external removable medium, etc.
  • the transmitting apparatus 1 transmits a modulated signal radio wave through the relay apparatus 2 .
  • besides the modulated signal radio wave, it is also possible to use, for example, transmission using a cable, transmission using a telephone line, transmission using terrestrial broadcasting, and transmission through the network 3, such as the Internet, including the public network.
  • This signal radio wave, received by the receiving apparatus 4 as will be mentioned later, is demodulated into an information signal and then recorded on a recording medium depending on a necessity thereof.
  • when the signal is transmitted through the network 3, it is converted into a format, such as a data format (IP packets) in accordance with a protocol suitable for the public network (for example, TCP/IP), etc.
  • the receiving apparatus 4 receiving the data mentioned above decodes it into the information signal, converts it into a signal suitable for recording depending on a necessity thereof, and records it on the recording medium.
  • the user can view/listen to the video/audio represented by the information signal on a display, if the display is built into the receiving apparatus 4, or by connecting a display, not shown in the figure, to the receiving apparatus 4 if it is not built therein.
  • FIG. 2 is a block diagram for showing an example of the structure of the transmitting apparatus 1 of the system shown in FIG. 1.
  • a reference numeral 11 depicts a source generator portion, 12 an encode portion for conducting compression using a method, such as MPEG 2 or H.264, etc., and for adding program information or the like thereto, 13 a scramble portion, 14 a modulator portion, 15 a transmission antenna, and 16 a management information supply portion, respectively.
  • the information, such as video/audio, which is generated in the source generator portion 11 composed of a camera and/or a recording/reproducing apparatus, is compressed in data volume in the encode portion 12, so that it can be transmitted occupying less bandwidth. It is encrypted in the scramble portion 13 depending on a necessity thereof, so that it can be viewed/listened to only by a specific viewer.
  • the management information supply portion 16 supplies program identification information, such as properties of the content produced in the source generator portion 11 (for example, coded information of video, coded information of audio, the structure of the program, whether it is 3D video or not, etc.), and also supplies program arrangement information produced by the broadcasting station (for example, the structure of the present program and the next program, the format of service, information on the structures of programs for one (1) week, etc.).
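  The way a receiver can use the program identification information can be sketched as follows. This is a minimal illustration only: the field names are assumptions loosely modeled on the 3D program details descriptor (FIG. 10A) and the 3D/2D classification (FIG. 10B), not the actual descriptor syntax, which is defined by the transmission operation regulations.

```python
# Sketch: program identification information carrying a 2D/3D flag, and the
# check a receiver could make before the next program starts (cf. FIG. 27).
# All field names here are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ProgramInfo:
    event_name: str
    is_3d: bool                # derived from the 3D/2D classification
    format_3d: Optional[str]   # e.g. "side_by_side"; None for a 2D program


def needs_mode_change_notice(current: ProgramInfo, next_prog: ProgramInfo) -> bool:
    """True when the receiver should notify the viewer that the display
    will switch between 2D and 3D at the program boundary."""
    return current.is_3d != next_prog.is_3d


now = ProgramInfo("News", is_3d=False, format_3d=None)
nxt = ProgramInfo("Movie", is_3d=True, format_3d="side_by_side")
print(needs_mode_change_notice(now, nxt))  # 2D -> 3D switch: notice needed
```

  Such a check is what allows the message displays of FIGS. 28 to 31 to be raised before, rather than after, the program boundary.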
  • the signal produced in the encode portion 12 is encrypted in an encryption portion 17 depending on a necessity thereof, so that it can be viewed/listened to only by a specific viewer.
  • After being encoded in a communication path coding portion 18 into a signal suitable for transmission through the public network, it is transmitted from a network I/F (Interface) portion 19 toward the public network.
  • the existing MPEG 2 (Moving Picture Experts Group 2) or H.264 AVC is utilized as the video compression method, and the characteristic of this method lies in that it keeps compatibility with the existing broadcast, enables utilization of the existing relay infrastructure, and can be received by an existing receiver (such as an STB, etc.); however, the 3D video is transmitted at a resolution of a half (½) of the highest resolution of the existing broadcast (in the vertical direction or the horizontal direction). For example, as is shown in FIG.
  • a “Side-by-Side” method, i.e., dividing one (1) picture into left and right and storing them into the screen size of the 2D program, by reducing the video for use of the left-side eye (L) and the video for use of the right-side eye (R) to about half (1/2) of the 2D program in width in the horizontal direction, respectively, while keeping them equal thereto in width in the vertical direction
  • a “Top-and-Bottom” method, i.e., dividing one (1) picture into top and bottom and storing them into a screen size equal to that of the 2D program, by keeping the video for use of the left-side eye (L) and the video for use of the right-side eye (R) equal thereto in width in the horizontal direction, respectively, while reducing them to half (1/2) of the 2D program in width in the vertical direction
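The two frame-packing layouts described above can be sketched as follows. This is an illustrative example, not code from the patent: the frame is modeled as a list of rows of pixel values, and the function names are hypothetical.

```python
# Sketch: how a receiver might split a "Side-by-Side" or "Top-and-Bottom"
# packed frame into the left-side-eye (L) and right-side-eye (R)
# half-resolution pictures.

def split_side_by_side(frame):
    """L occupies the left half of each row, R the right half."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right

def split_top_and_bottom(frame):
    """L occupies the top half of the rows, R the bottom half."""
    half = len(frame) // 2
    return frame[:half], frame[half:]

# A 4x4 test frame: 'L' pixels packed on the left, 'R' pixels on the right.
frame_sbs = [["L", "L", "R", "R"] for _ in range(4)]
L, R = split_side_by_side(frame_sbs)
```

Note that each extracted picture has only half the original resolution in one direction, matching the half-resolution property of these transmission methods.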
  • the MPEG 2 or the H.264 AVC (excepting MVC), though not originally or inherently a method for coding multi-viewpoint pictures, can be applied as it is as the coding method itself, and therefore there is a merit of enabling 3D program broadcasting while effectively applying the existing broadcasting method for the 2D program.
  • the multi-viewpoints coding method means a coding method standardized for coding video of multiple viewpoints; with this, it is possible to encode the video of each viewpoint as a separate picture, without dividing one (1) piece of video among the viewpoints.
  • with the main-viewpoint picture, it is possible to keep compatibility with the existing broadcasting method of the 2D program.
  • the main-viewpoint picture can keep the compatibility with the 2D picture of the H.264 AVC, and the main-viewpoint picture can be displayed as the 2D picture.
  • in the “3D 2-viewpoints separated ES transmission method” is included a method of encoding the picture for use of the left-side eye as the main-viewpoint picture with the MPEG 2, while encoding the picture for use of the right-side eye as the other-viewpoint picture with the H.264 AVC, thereby obtaining separate streams, respectively.
  • with this, the main-viewpoint picture is compatible with the MPEG 2, i.e., it can be displayed as a 2D picture, and therefore it is possible to keep compatibility with the existing broadcasting method of the 2D program, in which pictures encoded by the MPEG 2 are widely spread.
  • in the “3D 2-viewpoints separated ES transmission method” is also included a method of encoding the picture for use of the left-side eye as the main-viewpoint picture with the MPEG 2, while encoding the picture for use of the right-side eye as the other-viewpoint picture with the MPEG 2, thereby obtaining separate streams, respectively.
  • the “3D 2-viewpoints separated ES transmission method” may also be a method of encoding the picture for use of the left-side eye as the main-viewpoint picture with the H.264 AVC or the H.264 MVC, while encoding the picture for use of the right-side eye as the other-viewpoint picture with the MPEG.
  • Program identification information and program arrangement information are collectively called “program information”.
  • the program identification information is also called “PSI” (Program Specific Information), and is composed of four (4) tables: a PAT (Program Association Table), being information necessary for selecting a desired program, which designates the packet identifier of the TS packets transmitting the PMT (Program Map Table) relating to a broadcast program; the PMT (Program Map Table), which designates the packet identifiers of the TS packets transmitting each of the encoded signals making up a broadcast program, as well as the packet identifier of the TS packets transmitting the common information among the information relating to pay broadcasting; a NIT (Network Information Table), which transmits information of the transmission path, such as the frequency, etc., and information associated therewith; and a CAT (Conditional Access Table), which designates the packet identifier of the packets transmitting the individual information among the information relating to pay broadcasting. It is regulated by the MPEG 2 regulation.
  • the program identification information includes the encoding information of the video, the encoding information of the audio, and the structure of the program.
  • in the present invention, information indicating whether the program is 3D video or not, etc., is newly included therein. The PSI is added within the management information supply portion 16 .
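The role of the PAT within the PSI can be sketched as follows. This is a minimal illustration under an assumed data model (the PAT is reduced to a list of pairs), not the patent's implementation: a receiver selecting a program consults the PAT to find the packet identifier (PID) of the TS packets carrying that program's PMT.

```python
# Sketch: resolving a program's PMT PID from a simplified PAT.

def find_pmt_pid(pat_entries, program_number):
    """pat_entries: list of (program_number, pmt_pid) tuples from the PAT."""
    for prog, pid in pat_entries:
        if prog == program_number:
            return pid
    return None  # program not present in this transport stream

# Hypothetical PAT: program_number 0 conventionally points to the NIT PID.
pat = [(0, 0x0010), (101, 0x0100), (102, 0x0110)]
```

Once the PMT PID is known, the receiver reads the PMT to learn the PIDs and stream format types of the elementary streams making up the program.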
  • the program arrangement information is also called “SI” (Service Information), and is one of various types of information regulated for convenience in selecting a program; the PSI information of the MPEG 2 system regulation is also included therein. It includes an EIT (Event Information Table), on which information relating to programs is described, such as the program name, the broadcasting time, the program content, etc., and an SDT (Service Description Table), on which information relating to organized channels (services) is described, such as the organized channel names, the broadcasting provider names, etc.
  • it includes the structure of the program broadcasted at present and/or the program to be broadcasted next, the format of the service, and information indicating the structure of the programs for one (1) week, etc., and is added within the management information supply portion 16 .
  • In the program information are included a component descriptor, a component group descriptor, a 3D program details descriptor, a service descriptor, and a service list descriptor, etc., each being a constituent element of the program information.
  • Those descriptors are described within tables such as the PMT, EIT [schedule basic/schedule extended/present/following], NIT, and SDT.
  • With the EIT [schedule basic/schedule extended], the information of programs up to seven (7) days in the future can be obtained, in addition to that of the program broadcasted at present; however, it has the following demerits: the time period until reception is completed is long, because its transmission interval is longer than that of the PMT; a larger memory region is needed for holding it; and the reliability is low, in the sense that it describes future events, which may be changed.
  • With the EIT [following], it is possible to obtain the information of the program of the next broadcasting time.
  • the PMT of the program identification information is able to show the format of the ES of the program being broadcasted, using the table structure regulated in ISO/IEC 13818-1, e.g., by means of “stream_type” (type of stream format), being 8-bit information described in the 2nd loop (the loop for each ES (Elementary Stream)) thereof.
  • To the base-view sub-bit stream (main viewpoint) of the multi-viewpoints video encoded stream, “0x1B” is assigned, being the same as that of the AVC video stream, which is regulated in the existing ITU-T recommendation H.264
  • To “0x20” is assigned the sub-bit stream (other viewpoint) of the multi-viewpoints video encoded stream (for example, the H.264 MVC), which can be applied to the 3D video program.
  • the base-view bit stream of the H.262 (MPEG 2), to be applied when transmitting the plural viewpoints of the 3D video by separate streams, is a stream in which only the video of the main viewpoint, among the videos of the plural viewpoints of the 3D video, is encoded with the H.262 (MPEG 2) method.
  • To “0x21” is assigned the bit stream of the other viewpoint of the H.262 (MPEG 2), to be applied when transmitting the plural viewpoints of the 3D video by separate streams.
  • To “0x22” is assigned the bit stream of the other viewpoint of the AVC stream method, which is regulated in the ITU-T recommendation H.264
  • In this example, the sub-bit stream of the multi-viewpoints video encoded stream, which can be applied to the 3D video program, is assigned to “0x20”; the bit stream of the other viewpoint of the H.262 (MPEG 2), to be applied when transmitting the plural viewpoints of the 3D video by separate streams, is assigned to “0x21”; and the ISO/IEC 14496-10 video, to be applied when transmitting the plural viewpoints of the 3D video by separate streams, is assigned to “0x22”; but each may be assigned to any one of “0x23” through “0x7E”.
  • the MVC video stream is only an example; it may be a video stream other than the H.264/MVC, as far as it is a multi-viewpoints video encoded stream that can be applied to the 3D video program.
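The stream format type assignments described above can be summarized in a small lookup table. This is an illustrative sketch following the values named in the embodiment (the helper names are hypothetical, and "0x23" through "0x7E" are noted in the text as alternative assignments); the key behavioral point is that the newly assigned sub-viewpoint types are unknown to, and hence neglected by, an existing 2D receiver.

```python
# Sketch of the stream_type ("type of stream format") values used in the text.

STREAM_TYPES = {
    0x02: "H.262 (MPEG 2) video / base-view bit stream (main viewpoint)",
    0x1B: "H.264 AVC video / MVC base-view sub-bit stream (main viewpoint)",
    0x20: "sub-bit stream for the other viewpoint (e.g., H.264 MVC)",
    0x21: "H.262 (MPEG 2) bit stream for the other viewpoint",
    0x22: "H.264 AVC bit stream for the other viewpoint",
}

# Types newly assigned for sub-viewpoint (right-side eye) streams.
SUB_VIEWPOINT_TYPES = {0x20, 0x21, 0x22}

def legacy_receiver_ignores(stream_type):
    """An existing 2D receiver does not know the newly assigned sub-viewpoint
    types, so it neglects such an ES and decodes only the main viewpoint."""
    return stream_type in SUB_VIEWPOINT_TYPES
```

This is exactly why the combination examples keep the main viewpoint in a conventional type ("0x02" or "0x1B"): an existing receiver then displays the 2D picture and silently skips the unknown sub-viewpoint ES.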
  • As the main viewpoint (for use of the left-side eye), the base-view sub-bit stream (the type of stream format: “0x1B”) of the multi-viewpoints video encoded (for example, H.264/MVC) stream is transmitted, while as the sub-viewpoint (for use of the right-side eye), the sub-bit stream (the type of stream format: “0x20”) for use of the other viewpoint of the multi-viewpoints video encoded (for example, H.264/MVC) stream is transmitted.
  • In this case, both the main viewpoint (for use of the left-side eye) and the sub-viewpoint (for use of the right-side eye) use the stream of the multi-viewpoints video encoded (for example, H.264/MVC) method.
  • the multi-viewpoints video encoded (for example, H.264/MVC) method is basically a method for transmitting multi-viewpoint video, and is able to transmit the 3D program with high efficiency among those combinations shown in FIG. 47 .
  • When displaying (or outputting) a 3D program in 3D, the receiving apparatus is able to process both the main-viewpoint (for use of the left-side eye) video stream and the sub-viewpoint (for use of the right-side eye) video stream, and thereby to reproduce the 3D program.
  • When displaying (or outputting) a 3D program as a 2D program, the receiving apparatus is able to do so by processing only the main-viewpoint (for use of the left-side eye) video stream.
  • Since the sub-viewpoint (for use of the right-side eye) video stream is assigned a type of stream format that does not exist conventionally, it is neglected in the existing receiving apparatus. With this, it is possible to prevent the sub-viewpoint (for use of the right-side eye) video stream from being displayed (or outputted) on the existing receiving apparatus in a manner that the broadcasting station side does not intend.
  • Therefore, even when broadcasting of the 3D program of the combination example 1 is newly started, it is possible to avoid the situation in which it cannot be displayed on an existing receiving apparatus having the function of displaying (or outputting) the video stream of the existing H.264/AVC (excepting MVC).
  • Even when broadcasting of that 3D program is newly started, in broadcasting managed by advertisement income, such as CM (commercial message) broadcasting, etc., it can be viewed/listened to even on a receiving apparatus that does not cope with the 3D displaying (or outputting) function; therefore, it is possible to avoid a lowering of the audience rate due to such restriction of the function of the receiving apparatus, which is also meritorious on the side of the broadcasting station.
  • As the main-viewpoint (for use of the left-side eye) video stream is transmitted the base-view bit stream (main viewpoint) (the stream format type: “0x02”) of the H.262 (MPEG 2), to be applied when transmitting the plural viewpoints of the 3D video by separate streams.
  • As the sub-viewpoint (for use of the right-side eye) video stream is transmitted the AVC stream (the stream format type: “0x22”), which is regulated by the ITU-T recommendation H.264
  • When displaying (or outputting) the 3D program in 3D, the receiving apparatus is able to reproduce the 3D program by processing both the main-viewpoint (for use of the left-side eye) video stream and the sub-viewpoint (for use of the right-side eye) video stream; also, when displaying (or outputting) the 3D program in 2D, the receiving apparatus is able to display (or output) it as a 2D program by processing only the main-viewpoint (for use of the left-side eye) video stream.
  • By making the base-view bit stream (main viewpoint) of the H.262 (MPEG 2), to be applied when transmitting the plural viewpoints of the 3D video by separate streams, a stream having compatibility with the existing ITU-T recommendation H.262
  • Since receiving apparatuses having the function of displaying (or outputting) the ISO/IEC 13818-2 video stream are widely spread, it is possible to prevent the audience rate from being lowered due to the limitation of the receiving apparatus, and therefore broadcasting most preferable for the broadcasting station can be achieved.
  • By modifying the sub-viewpoint (for use of the right-side eye) into the AVC stream (the stream format type: “0x22”), which is regulated by the ITU-T recommendation H.264
  • As the main-viewpoint (for use of the left-side eye) video stream is transmitted the base-view stream (main viewpoint) (the stream format type: “0x02”) of the H.262 (MPEG 2), to be applied when transmitting the plural viewpoints of the 3D video by separate streams.
  • As the sub-viewpoint (for use of the right-side eye) video stream is transmitted the bit stream (the stream format type: “0x21”) of the other viewpoint of the H.262 (MPEG 2), to be applied when transmitting the plural viewpoints of the 3D video by separate streams.
  • Similarly to the combination example 3, it is possible for the receiving apparatus to display (or output) the program, as far as it has the function of displaying (or outputting) the existing ITU-T recommendation H.262
  • In the combination example 4, it is also possible to transmit the base-view stream (main view) (the stream format type: “0x1B”) of the multi-viewpoints video encoded (for example, H.264/MVC) stream as the main-viewpoint (for use of the left-side eye) video stream, while transmitting the bit stream of the other viewpoint (the stream format type: “0x21”) of the H.262 (MPEG 2) method, to be applied when transmitting the plural viewpoints of the 3D video by separate streams, as the sub-viewpoint (for use of the right-side eye).
  • A similar effect can be obtained if applying the AVC video stream (the stream format type: “0x1B”), which is regulated by the ITU-T recommendation H.264
  • A similar effect can be obtained if applying the ITU-T recommendation H.262
  • FIG. 4 shows an example of the structure of a component descriptor, as one of the program information.
  • the component descriptor indicates the type of a component (an element building up the program; for example, video, audio, letters, various kinds of data, etc.), and is also used for presenting the elementary stream in a letter format.
  • This descriptor is disposed in PMT and/or EIT.
  • “descriptor_tag” is a field of 8 bits, into which is described such a value that this descriptor can be identified as the component descriptor.
  • “descriptor_length” is a field of 8 bits, into which is described a size of this descriptor.
  • “stream_content” (content of the component) is a field of 4 bits, presenting the type of the stream (e.g., video, audio and data), and is encoded in accordance with FIG. 4 .
  • “component_type” is a field of 8 bits, defining the type of the component, such as, video, audio and data, for example, and is encoded in accordance with FIG. 4 .
  • “component_tag” is a field of 8 bits. A component stream of the service can be referred to by means of this field of 8 bits, i.e., the described contents ( FIG. 5 ) thereof, which are indicated by the component descriptor.
  • the component tag is a label for identifying the component stream, and has the same value as that of the component tag within a stream identification descriptor (but only when the stream identification descriptor exists within the PMT).
  • a field of 24 bits of “ISO_639_language_code” (language code) identifies the language of the component (audio or data) and the language of the letter description, which are included in this descriptor.
  • the language code is presented by a code of 3 alphabetical letters regulated in ISO 639-2 (22). Each letter is encoded in accordance with ISO 8859-1 (24), and is inserted into a field of 24 bits in that order.
  • The Japanese language is “jpn” by the 3-letter alphabetical code, and is encoded as follows: “0110 1010 0111 0000 0110 1110”. “text_char” (component description) is a field of 8 bits. A series of component description fields regulates the letter description of the component stream.
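The 24-bit language code encoding described above can be reproduced directly. This sketch (the function name is hypothetical) packs the three ISO 8859-1 letters into the 24-bit field in order, matching the "jpn" bit string given in the text.

```python
# Sketch: forming the 24-bit ISO_639_language_code field from a 3-letter code.

def encode_language_code(code):
    """Pack three ISO 8859-1 letters into a 24-bit integer, first letter in
    the most significant 8 bits (for ASCII letters, ord() gives the byte)."""
    assert len(code) == 3
    value = 0
    for ch in code:
        value = (value << 8) | ord(ch)
    return value

jpn = encode_language_code("jpn")          # 'j'=0x6A, 'p'=0x70, 'n'=0x6E
jpn_bits = format(jpn, "024b")             # the 24-bit binary string
```

The resulting bit string "011010100111000001101110" is the value "0110 1010 0111 0000 0110 1110" shown in the text.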
  • FIGS. 5A to 5E show examples of “stream_content” (content of the component) and “component_type” (type of the component), being the constituent elements of the component descriptor. “0x01” of the content of the component shown in FIG. 5A tells about various video formats of the video stream, being compressed in accordance with the MPEG 2 format.
  • “0x05” of the component content shown in FIG. 5B tells about various video formats of the video stream, being compressed in a H.264 AVC format.
  • “0x06” of the component content shown in FIG. 5C tells about various video formats of the video stream, being compressed by the multi-viewpoints video encoding (for example, a H.264 MVC format).
  • “0x07” of the component content shown in FIG. 5D tells about various video formats of the stream of a “Side-by-Side” format of 3D video, being compressed in the format of MPEG 2 or H.264 AVC.
  • the same value is shown for the component contents in the MPEG 2 format and the H.264 AVC format, but different values may be set for the MPEG 2 and the H.264 AVC, respectively.
  • “0x08” of the component content shown in FIG. 5E tells about various video formats of the stream of a “Top-and-Bottom” format of 3D video, being compressed in the format of MPEG 2 or H.264 AVC.
  • the same value is shown for the component contents in the MPEG 2 format and the H.264 AVC format, but different values may be set for the MPEG 2 and the H.264 AVC, respectively.
  • It is possible to identify or discriminate between various kinds of video methods, including identification of whether the program is a 2D program or a 3D program, depending on the combination of the “stream_content” (content of the component) and the “component_type” (type of the component).
  • Due to distribution, by means of the EIT, of the component descriptor relating to the program(s) broadcasted at present or to be broadcasted in the future, it is possible to produce an EPG (a program table) within the receiving apparatus by obtaining the EIT, and to produce, as information of the EPG, whether the program is 3D video or not, the method of the 3D video, the resolution, and the aspect ratio.
  • the receiving apparatus has a merit that it can display (or output) that information on the EPG.
  • Since the receiving apparatus observes the “stream_content” and the “component_type”, there can be obtained an effect of enabling it to recognize that the program received at present, or to be received in the future, is a 3D program.
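The 2D/3D determination from the component descriptor can be sketched as a simple classifier. This is an illustrative example using only the stream_content values named in FIGS. 5A through 5E; the per-format component_type values are tabulated in the figures rather than in the text, so they are passed through untouched here, and the function name is hypothetical.

```python
# Sketch: classifying a program as 2D or 3D from the component descriptor's
# (stream_content, component_type) combination.

STREAM_CONTENT_3D = {
    0x06: "multi-viewpoints video coding (e.g., H.264 MVC)",
    0x07: "Side-by-Side 3D (MPEG 2 or H.264 AVC)",
    0x08: "Top-and-Bottom 3D (MPEG 2 or H.264 AVC)",
}

def classify_program(stream_content, component_type):
    """Return ("3D"/"2D"/"unknown", format description, component_type)."""
    if stream_content in STREAM_CONTENT_3D:
        return ("3D", STREAM_CONTENT_3D[stream_content], component_type)
    if stream_content in (0x01, 0x05):   # MPEG 2 / H.264 AVC conventional video
        return ("2D", "conventional video", component_type)
    return ("unknown", None, component_type)
```

A receiver producing an EPG could run such a check over the component descriptors carried in the EIT to mark 3D programs, their method, resolution, and aspect ratio.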
  • FIG. 6 shows an example of the structure of the component group descriptor, being one of the program information.
  • the component group descriptor defines and identifies the combination of components within an event. In other words, it describes grouping information for a plural number of components. This descriptor is disposed in the EIT.
  • “descriptor_tag” is a field of 8 bits, into which is described such a value that this descriptor can be identified as the component group descriptor.
  • descriptor_length is a field of 8 bits, in which the size of this descriptor is described.
  • component_group_type (type of the component group) is a field of 3 bits, and this presents the type of the group of components.
  • the multi-view TV service is a TV service enabling display of multi-viewpoint 2D video, switching it for each viewpoint, respectively.
  • Also in the multi-view TV program, the transmission can be made by including the plural viewpoints in one (1) screen, with the multi-viewpoints video encoded video stream, or with a stream of an encoding method that is not originally regulated as a multi-viewpoints video encoding method.
  • total_bit_rate_flag (flag of total bit rate) is a flag of 1 bit, and indicates a condition of description of the total bit rate within the component group among an event. When this bit is “0”, it indicates there is no total rate field within the component group in that descriptor. When this bit is “1”, it indicates that there is a total rate field within the component group in that descriptor.
  • number_of_group (number of groups) is a field of 4 bits, and indicates a number of the component groups within the event.
  • component_group_id (identification of the component group) is a field of 4 bits, into which the identification of the component group is described, in accordance with FIG. 8 .
  • number_of_CA_unit (number of units of charging) is a field of 4 bits, and indicates a number of unit(s) of charging/non-charging within the component groups.
  • CA_unit_id (identification of a unit of charging) is a field of 4 bits, into which the identification of a unit of charging is described, in accordance with FIG. 9 .
  • “number_of_component” (number of components) is a field of 4 bits, and indicates the number of components that belong to that component group and also to the unit of charging/non-charging indicated by the “CA_unit_id” just before.
  • component_tag (component tag) is a field of 8 bits, and indicates a number of the component tag belonging to the component group.
  • “total_bit_rate” (total bit rate) is a field of 8 bits, into which is described the total bit rate of the components within the component group, rounding the transmission rate of the transport stream packets in units of 1/4 Mbps.
  • text_length (length of description of the component group) is a field of 8 bits, and indicates byte length of component group description following thereto.
  • text_char (component group description) is a field of 8 bits. A series of the letter information fields describes an explanation in relation to the component group.
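The quantization of "total_bit_rate" into 1/4-Mbps units can be sketched as follows. This is an illustrative example: the text says the rate is rounded in units of 1/4 Mbps, but the exact rounding direction is not stated here, so the rounding-up choice below is an assumption, as are the function names.

```python
# Sketch: quantizing a rate in Mbps into the 8-bit total_bit_rate field
# (units of 1/4 Mbps), and converting back.

def encode_total_bit_rate(mbps):
    """Quantize to 1/4-Mbps units (rounding up is assumed), 8-bit field."""
    units = -(-int(mbps * 100) // 25)   # ceil(mbps / 0.25) via integer math
    if units > 0xFF:
        raise ValueError("rate exceeds the 8-bit total_bit_rate field")
    return units

def decode_total_bit_rate(units):
    """Convert the field value back to Mbps."""
    return units * 0.25
```

For example, a component group totaling 12.3 Mbps would be carried as 50 units and read back as 12.5 Mbps.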
  • FIG. 10A shows an example of the structure of a 3D program details descriptor, being one of the program information.
  • the 3D program details descriptor shows detailed information in case where the program is the 3D program, and may be used for determining the 3D program within the receiving apparatus. This descriptor is disposed in the PMT and/or the EIT.
  • the 3D program details descriptor may coexist together with the “stream_content” (content of the component) and the “component_type” (type of the component) for use of the 3D video program, which are shown in FIGS. 5C through 5E .
  • “3d_2d_type” (type of 3D/2D) is a field of 8 bits, and indicates the type, 3D video or 2D video, within the 3D program, in accordance with FIG. 10B .
  • This field is information for identifying whether the video is 3D or 2D, in a 3D program structured in such a manner that, for example, the main program is 3D video but a commercial advertisement, etc., inserted on the way is 2D video; it is arranged for the purpose of protecting the receiving apparatus from an erroneous operation (e.g., a problem of display (or output) generated due to the fact that the broadcasted program is 2D video in spite of the 3D processing executed by the receiving apparatus).
  • “0x01” indicates the 3D video, while “0x02” the 2D video, respectively.
  • 3d_method_type (type of the 3D method) is a field of 8 bits, and indicates the type of the 3D method, in accordance with FIG. 11 .
  • “0x01” indicates the “3D 2-viewpoints separated ES transmission method”
  • “0x02” indicates the “Side-by-Side method”
  • “0x03” indicates the “Top-and-Bottom method”, respectively.
  • “stream_type” (type of the stream format) is a field of 8 bits, and indicates the type of ES of the program, in accordance with FIG. 3 .
  • the 3D program details descriptor is transmitted only in case of the 3D video program, but not during the 2D video program.
  • component_tag (component tag) is a field of 8 bits.
  • A component stream of the service can be referred to, by means of this field of 8 bits, as to the described content (in FIG. 5 ), which is indicated by the component descriptor.
  • values of the component tags, being given to each stream should be determined to be different from each other.
  • the component tag is a label for identifying the component stream, and has the same value as that of the component tag(s) within a stream identification descriptor (however, only in a case where the stream identification descriptor exists within the PMT).
  • Due to observation of the 3D program details descriptor by the receiving apparatus 4 , there can be obtained an effect of enabling it to identify that the program received at present, or to be received in the future, is a 3D program, if this descriptor exists. In addition thereto, it is also possible to identify the type of the 3D transmission method, if the program is a 3D program, and, if the 3D video and the 2D video exist mixed together, to identify that fact.
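A minimal parsing sketch for the 3D program details descriptor fields named above (3d_2d_type, 3d_method_type, stream_type, component_tag) is shown below. The byte layout assumed here, one byte per field in the order the text lists them, is an assumption for illustration; the actual wire format is defined by the figures.

```python
# Sketch: parsing the 3D program details descriptor body (assumed layout).

THREE_D_METHODS = {
    0x01: "3D 2-viewpoints separated ES transmission",
    0x02: "Side-by-Side",
    0x03: "Top-and-Bottom",
}

def parse_3d_program_details(payload):
    """payload: descriptor bytes after descriptor_tag / descriptor_length."""
    d = {
        "3d_2d_type": payload[0],      # 0x01 = 3D video, 0x02 = 2D video
        "3d_method_type": payload[1],  # per FIG. 11
        "stream_type": payload[2],     # per FIG. 3
        "component_tag": payload[3],
    }
    d["is_3d"] = d["3d_2d_type"] == 0x01
    d["method"] = THREE_D_METHODS.get(d["3d_method_type"])
    return d

desc = parse_3d_program_details(bytes([0x01, 0x02, 0x1B, 0x00]))
```

A receiver would use such a result both to decide 3D/2D display control and to pick the decoding path matching the 3D method type.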
  • FIG. 12 shows an example of the structure of a service descriptor, being one of the program information.
  • the service descriptor presents a name of the programmed channel and a name of the provider, by the letters/marks, together with the type of the service formats thereof. This descriptor is disposed in the SDT.
  • Due to observation of the “service_type” by the receiving apparatus 4 , there can be obtained an effect of enabling it to identify that the service (e.g., the programmed channel) is a channel of 3D programs.
  • However, even in a service mainly broadcasting the 3D video program, there may be a case where the 2D video must be broadcasted, such as a case where the source of an advertisement video is only the 2D video, etc.
  • In addition to the identification of the 3D video program by means of the “service_type” (type of the service format) of that descriptor, it is preferable to apply in common therewith the identification of the 3D video program by combining the “stream_content” (content of the component) and the “component_type” (type of the component), the identification by means of the “component_group_type” (type of the component group), or the identification by means of the 3D program details descriptor, which were already explained previously.
  • If identifying by combining plural pieces of information, it is possible to make such an identification that the program belongs to the 3D video broadcasting service but a part thereof is 2D video, etc.
  • In the receiving apparatus, it is possible to expressly indicate that the service is the “3D video broadcasting service”, for example on the EPG, and also to exchange the display control, etc., between the 3D video program and the 2D video program when receiving a program and so on, even if a 2D video program is mixed into that service, other than the 3D video programs.
  • FIG. 14 shows an example of the structure of a service list descriptor, being one of the program information.
  • the service list descriptor provides a list of services, upon the basis of the service identification and the service format type. Thus, it describes therein a list of the programmed channels and the types thereof. This descriptor is disposed in the NIT.
  • Since it is possible to identify whether the service is the “3D video broadcasting service” or not, it is possible, for example, to conduct a display grouping only the “3D video broadcasting service” on the EPG display, etc., using the list of the programmed channels and the types thereof shown in that service list descriptor.
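The EPG grouping described above can be sketched as a simple filter over the (service_id, service_type) pairs carried in the service list descriptor. This is an illustrative example: the service_type value 0x11 used for the "3D video broadcasting service" is a placeholder, since the text does not give the actual assigned value, and the function name is hypothetical.

```python
# Sketch: selecting only 3D broadcasting services for a grouped EPG display.

SERVICE_TYPE_3D = 0x11  # placeholder value for the 3D video broadcasting service

def services_for_3d_epg(service_list):
    """service_list: iterable of (service_id, service_type) tuples taken
    from the service list descriptor in the NIT."""
    return [sid for sid, stype in service_list if stype == SERVICE_TYPE_3D]

# Hypothetical channel list: one conventional service, two 3D services.
channels = [(0x0101, 0x01), (0x0102, 0x11), (0x0103, 0x11)]
```

An EPG screen could then render the returned service_ids in a dedicated "3D" group, separate from conventional channels.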
  • the component descriptor, the component group descriptor, the 3D program details descriptor, the service descriptor, and the service list descriptor of the program information are information to be produced and added in the management information supply portion 16 , for example, stored in the PSI (for example, the PMT, etc.) or the SI (for example, the EIT, SDT or NIT, etc.) of the MPEG-TS, and thereby transmitted from the transmitting apparatus 1 .
  • FIG. 15 shows an example of the transmission management regulation of the component descriptors within the transmitting apparatus 1 .
  • In “descriptor_tag” is described “0x50”, which means the component descriptor.
  • In “descriptor_length” is described the descriptor length of the component descriptor. The maximum value of the descriptor length is not regulated.
  • In “stream_content” is described “0x01” (video).
  • In “component_type” is described the video component type of that component. The component type is determined from among those shown in FIG. 5 .
  • In “component_tag” is described such a value of the component tag as to be unique in said program.
  • In “ISO_639_language_code” is described “jpn (“0x6A706E”)”.
  • In “text_char” are described characters of 16 bytes (8 full-size characters) or less, as the name of the video type, when there are plural video components. No line feed code can be used. This field can be omitted when the component description is the default letter (or character) string.
  • The receiving apparatus 4 can observe the “stream_content” and the “component_type”, and therefore there can be obtained an effect of enabling it to recognize that the program received at present, or to be received in the future, is a 3D program.
  • FIG. 16 shows an example of the transmission management regulation of the component group descriptor within the transmitting apparatus 1 .
  • In “descriptor_tag” is described “0xD9”, which means the component group descriptor.
  • In “descriptor_length” is described the descriptor length of the component group descriptor. No regulation is made on the maximum value of the descriptor length. In “component_group_type”, “000” indicates a multi-view TV and “001” a 3D TV, respectively.
  • “total_bit_rate_flag” indicates “0” when all of the total bit rates within a group in an event are at the default value, which is regulated, or “1” when any one of the total bit rates within a group in an event exceeds the regulated default value.
  • In “number_of_group” is described a number of the component group(s) in an event.
  • “component_group_id” is an identification of the component group. “0x0” is assigned when it is a main group, and in case of each sub-group, IDs are assigned in such a manner that broadcasting providers can identify them, uniquely.
  • “number_of_CA_unit” is a number of unit(s) of charging/non-charging within the component group. It is assumed that the maximum value is two (2). It is “0x1” when no charging component is included within that component group at all.
  • “CA_uni_id” is an identification of the unit of charging. Assignment is made in such a manner that broadcasting providers can identify it, uniquely. In “num_of_component” is described a number of the components which belong to that component group and also belong to the charging/non-charging unit indicated by the “CA_uni_id” just before. It is assumed that the maximum value is fifteen (15).
  • In “component_tag” is described a component tag value, which belongs to the component group.
  • In “total_bit_rate” is described a total bit rate within the component group. However, “0x00” is described therein, when it is the default value.
  • In “text_length” is described a byte length of the description of the component group following thereto. It is assumed that the maximum value is 16 (or, 8 full-size characters).
  • In “text_char” must necessarily be described an explanation in relation to the component group. No regulation is made on a default letter (or character) string.
  • the receiving apparatus 4 can observe the “component_group_type”, and therefore there can be obtained an effect of enabling recognition that the program, which is received at present or will be received in future, is the 3D program.
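The component_group_type judgement described above can be sketched as a small helper; only the two values given in the text are mapped, and all other 3-bit values are treated as reserved here (an assumption).

```python
def classify_component_group(component_group_type):
    # Per the regulation above: "000" indicates a multi-view TV service
    # and "001" a 3D TV service; other 3-bit values are treated as
    # reserved in this sketch.
    mapping = {0b000: "multi-view TV", 0b001: "3D TV"}
    return mapping.get(component_group_type & 0b111, "reserved")
```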
  • FIG. 17 shows an example of the transmission management regulation of the 3D program details descriptor within the transmitting apparatus 1 .
  • In “descriptor_tag” is described “0xE1”, which means the 3D program details descriptor.
  • In “descriptor_length” is described the descriptor length of the 3D program details descriptor. In “3d_2d_type” is described the 3D/2D identification, which is determined from among those shown in FIG. 10B .
  • In “3d_method_type” is described the type of the 3D method. It is determined from among those shown in FIG. 11 .
  • In “stream_type” is described a format of ES of the program. It is determined from among those shown in FIG. 3 .
  • component_tag is described such a value of component tag that it can be identified uniquely within that program.
  • the receiving apparatus 4 can observe the 3D program details descriptor, and therefore, if this descriptor exists, there can be obtained an effect of enabling recognition that the program, which is received at present or will be received in future, is the 3D program.
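A sketch of serializing the 3D program details descriptor under the regulation above. The byte-aligned field order and the helper name are assumptions; the value sets for 3d_method_type and stream_type come from FIGS. 11 and 3, which are not reproduced here, so the arguments are left to the caller.

```python
def build_3d_program_detail_descriptor(method_type, stream_type, component_tag):
    # Assumed layout: descriptor_tag (0xE1), descriptor_length,
    # 3d_method_type (a value from FIG. 11), stream_type (a value from
    # FIG. 3), component_tag (identifiable uniquely within the program).
    body = bytes([method_type, stream_type, component_tag])
    return bytes([0xE1, len(body)]) + body
```

Because the receiver judges a program to be 3D by the mere presence of this descriptor, the serialized bytes only need to be placed into the descriptor loop of the PMT or EIT.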
  • FIG. 18 shows an example of the transmission management regulation of the service descriptor within the transmitting apparatus 1 .
  • In “descriptor_tag” is described “0x48”, which means the service descriptor.
  • In “descriptor_length” is described the descriptor length of the service descriptor.
  • In “service_provider_name_length” is described a length of the name of the service provider, if the service is a BS/CS digital TV broadcasting. It is assumed that the maximum value is 20. Since the service provider's name is not managed in the terrestrial digital TV broadcasting, “0x00” is described therein.
  • In “char” is described the provider's name when the service is the BS/CS digital TV broadcasting. It is 10 full-size characters at maximum. Nothing is described therein in case of the terrestrial digital TV broadcasting.
  • In “service_name_length” is described a name length of a programmed channel. It is assumed that the maximum value is 20.
  • In “char” is described a name of the programmed channel. It can be written within 20 bytes or within 10 full-size characters. However, only one (1) piece is disposed for a programmed channel targeted.
  • the receiving apparatus 4 can observe the “service_type”, and therefore there can be obtained an effect of enabling recognition that the programmed channel is the channel of 3D program.
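The BS/CS versus terrestrial rule for the provider name can be sketched as below. The helper name and byte-aligned layout are illustrative assumptions, and the service_type value would in practice be taken from FIG. 13, which is not reproduced here.

```python
def build_service_descriptor(service_type, provider_name, channel_name,
                             terrestrial=False):
    # Sketch of the service descriptor (descriptor_tag 0x48). For the
    # terrestrial digital TV broadcasting the provider name is not managed,
    # so service_provider_name_length is written as 0x00 with no name bytes;
    # names are limited to 20 bytes (10 full-size characters).
    provider = b"" if terrestrial else provider_name[:20]
    channel = channel_name[:20]
    body = bytes([service_type, len(provider)]) + provider
    body += bytes([len(channel)]) + channel
    return bytes([0x48, len(body)]) + body
```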
  • FIG. 19 shows an example of the transmission management regulation of the service list descriptor within the transmitting apparatus 1 .
  • In “descriptor_tag” is described “0x41”, which means the service list descriptor.
  • In “descriptor_length” is described the descriptor length of the service list descriptor.
  • In the loop is described a loop of the number of services, which are included within the transport stream targeted.
  • In “service_id” is described the “service_id”, which is included in that transport stream.
  • In “service_type” is described the service type of an object service. It is determined from among those shown in FIG. 13 . However, it must necessarily be disposed for a TS loop within NIT.
  • the receiving apparatus 4 can observe the “service_type”, and therefore there can be obtained an effect of enabling recognition that the programmed channel is the channel of 3D program.
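The service loop of the service list descriptor can be read back as sketched below, assuming the common 3-byte entry of a 16-bit service_id followed by an 8-bit service_type. A receiver could then filter the resulting pairs by the 3D service type of FIG. 13 to group 3D channels on the EPG display.

```python
def parse_service_list(body):
    # Walk the loop of services included in the targeted transport stream:
    # each entry is assumed to be a 16-bit service_id followed by an
    # 8-bit service_type (3 bytes per service).
    services = []
    for i in range(0, len(body) - len(body) % 3, 3):
        service_id = (body[i] << 8) | body[i + 1]
        service_type = body[i + 2]
        services.append((service_id, service_type))
    return services
```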
  • FIG. 25 is a hardware structure view for showing an example of the structure of the receiving apparatus 4 , among the system shown in FIG. 1 .
  • a reference numeral 21 depicts a CPU (Central Processing Unit) for controlling the receiver as a whole, 22 a common bus for transmitting control and information between the CPU 21 and each portion within the apparatus, 23 a tuner for receiving a radio signal broadcasted from the transmitting apparatus 1 through a broadcast transmission network, such as, a radio wave (satellite, terrestrial), a cable, etc., for example, to execute selection at a specific frequency, demodulation, an error correction process, etc., and thereby outputting a multiplexed packet, such as, MPEG2-Transport Stream (hereinafter, also may be called “TS”), 24 a descrambler for decoding a scramble made by the scrambler portion 13 , and 25 a network I/F (Interface) for transmitting/receiving information with the network, and thereby transmitting/receiving various kinds of information.
  • ES means each of the video/audio data being compressed/encoded, respectively.
  • a reference numeral 30 depicts a video decoder portion for decoding the video ES into the video signal, 31 an audio decoder portion for decoding the audio ES into the audio signal, and thereby outputting it to a speaker 48 or outputting it from an audio output 42 , and 32 a video conversion processor portion, executing a process for converting the video signal decoded in the video decoder portion 30 , or the video signal of 3D or 2D through a converting process, which will be mentioned later, into a predetermined format, in accordance with an instruction of the CPU mentioned above, and/or a process for multiplexing a display, such as, an OSD (On Screen Display), etc., which is produced by the CPU 21 , onto the video signal, etc., and thereby outputting the video signal after processing to a display 47 or a video signal output portion 41 , while outputting a synchronization signal and/or a control signal (to be used in control of equipment).
  • FIG. 35 shows an example of the system configuration when the receiving apparatus and the viewing/listening device are unified into one body, while FIG. 36 shows an example of the structure when the receiving apparatus and the viewing/listening device are separated in the structures.
  • a reference numeral 3501 depicts a display device, including the structure of the receiving apparatus 4 mentioned above therein and being able to display the 3D video and to output the audio, 3503 a 3D view/listen assisting device control signal (for example, IR), being outputted from the display device 3501 mentioned above, and 3502 the 3D view/listen assisting device, respectively.
  • the video signal is outputted from a video display, being equipped on the display apparatus 3501 mentioned above, while the audio signal is outputted from a speaker, being equipped on the display device 3501 mentioned above.
  • the display device 3501 is equipped with an output terminal for outputting the equipment control signal 44 or the 3D view/listen assisting device control signal, which is outputted from the output portion for the control signal 43 .
  • in case where the display device 3501 displays the 3D video through the polarization separation, the 3D view/listen assisting device 3502 may be such a one of being able to achieve the polarization separation, so as to enter different pictures into the left-side eye and the right-side eye, and it is not necessary to output the 3D view/listen assisting device control signal 3503 from the display device 3501 to the 3D view/listen assisting device 3502 .
  • a reference numeral 3601 depicts a video/audio output device including the structure of the receiving apparatus 4 mentioned above, 3602 a transmission path (for example, a HDMI cable) for transmitting a video/audio/control signal, and 3603 a display for displaying/outputting the video signal and/or the audio signal, which are inputted from the outside.
  • the video signal and the audio signal which are outputted from the video output 41 and the audio output 42 of the video/audio output device 3601 (e.g., the receiving apparatus 4 ), and also the control signal, which is outputted from the control signal output portion 43 , are converted into transmission signals, each being of a form in conformity with a format, which is regulated for the transmission path 3602 (for example, the format regulated by the HDMI standard), and they are inputted into the display 3603 passing through the transmission path 3602 .
  • the display 3603 decodes the above-mentioned transmission signals received thereon into the video signal, the audio signal and the control signal, and outputs the video and the audio therefrom, as well as, outputting the 3D view/listen assisting device control signal 3503 to the 3D view/listen assisting device 3502 .
  • the explanation was made upon an assumption that the display device 3603 and the 3D view/listen assisting device 3502 shown in FIG. 36 display through the active shutter method, which will be mentioned later; but in case where the display device 3603 and the 3D view/listen assisting device 3502 shown in FIG. 36 apply the method for displaying the 3D video through the polarization separation, the 3D view/listen assisting device 3502 may be such a one of being able to achieve the polarization separation, so as to enter different pictures into the left-side eye and the right-side eye, and it is not necessary to output the 3D view/listen assisting device control signal 3503 from the display device 3603 to the 3D view/listen assisting device 3502 .
  • a part of each of the constituent elements shown by 21 to 46 in FIG. 25 may be constructed with one (1) or plural numbers of LSI(s). Or, such a structure may be adopted that the function of a part of each of the constituent elements shown by 21 to 46 in FIG. 25 can be achieved in the form of software.
  • FIG. 26 shows an example of the function block structure of processing in an inside of the CPU 21 .
  • each function block exists, for example, in the form of a module of software to be executed by the CPU 21 , wherein delivery of information and/or data and an instruction of control are conducted by any means, between the respective modules (for example, a message passing, a function call, an event transmission, etc.).
  • each module also executes transmission/receiving of information with each piece of hardware within the receiving apparatus 4 through the common bus 22 .
  • a tuning control portion 59 obtains the program information necessary for tuning from a program information analyzer portion 54 , appropriately.
  • a system control portion 51 manages a condition of each module and/or a condition of instruction made by the user, etc., and also gives a control instruction to each module.
  • a user instruction receiver portion 52 receives and interprets an input signal of a user operation, which the control signal transmitter portion 33 receives, and transmits an instruction of the user to the system control portion 51 .
  • An equipment control signal transmitter portion 53 instructs the control signal transmitter portion 33 to transmit an equipment control signal, in accordance with an instruction from the system control portion 51 or other module(s).
  • the program information analyzer portion 54 obtains the program information from the multiplex/demultiplex portion 29 , to analyze it, and provides necessary information to each module.
  • a time management portion 55 obtains time correction information (TOT: Time Offset Table), which is included in TS, from the program information analyzer portion 54 , thereby managing the present time, and it also gives a notice of an alarm (noticing an arrival of the time designated) and/or a one-shot timer (noticing an elapse of a preset time), in accordance with request(s) of each module, with using the counter that the timer 34 has.
  • a network control portion 56 controls the network I/F 25 , and thereby obtains various kinds of information from a specific URL (Uniform Resource Locator) and/or IP (Internet Protocol) address.
  • a decode control portion 57 controls the video decoder portion 30 and the audio decoder portion 31 , and conducts start or stop of decoding, obtaining the information included in the stream, etc.
  • a recording/reproducing control portion 58 controls the record/reproduce portion 27 , so as to read out a signal from the recording medium 26 , from a specific position of a specific content, and in an arbitrary readout format (normal reproduction, fast-forward, rewind, pause). It also executes a control for recording the signal inputted into the record/reproduce portion 27 onto the recording medium 26 .
  • a tuning control portion 59 controls the tuner 23 , the descrambler 24 , the multiplex/demultiplex portion 29 and the decode control portion 57 , and thereby conducts receiving of broadcast and recording of the broadcast signal. Or, it conducts reproducing from the recording medium, and also controlling until when the video signal and the audio signal are outputted. Details of the operation of broadcast receiving and the recording operation of the broadcast signal and the reproducing operation from the recording medium will be given later.
  • An OSD produce portion 60 produces OSD data, including a specific message therein, and instructs a video conversion control portion 61 to pile up or superimpose that OSD data produced on the video data, thereby to be outputted.
  • when displaying the message in 3D, the OSD produce portion 60 produces the OSD data for use of the left-side eye and for use of the right-side eye, having parallax therebetween, and requests the video conversion control portion 61 to do the 3D display upon basis of the OSD data for use of the left-side eye and for use of the right-side eye, thereby executing display of the message in 3D.
  • the video conversion control portion 61 controls the video conversion processor portion 32 , and superimposes the video, which is converted into 3D or 2D in accordance with the instruction from the system control portion 51 mentioned above, and the OSD inputted from the OSD produce portion 60 , onto the video signal, which is inputted into the video conversion processor portion 32 from the video decoder portion 30 , and further processes the video (for example, scaling, PinP, 3D display, etc.) depending on the necessity thereof, thereby displaying it on the display 47 or outputting it to an outside. Details of the converting method of the 2D video into a predetermined format will be mentioned later. Each function block provides such functions as those.
  • the system control portion 51 receiving an instruction of the user indicating to receive broadcast at a specific channel (CH) (for example, pushdown of a CH button on the remote controller) from the user instruction receiver portion 52 instructs tuning at the CH, which the user instructs (hereinafter, “designated CH”), to the tuning control portion 59 .
  • the tuning control portion 59 receiving the instruction mentioned above instructs a control for receiving the designated CH (e.g., a tuning at a designated frequency band, a process for demodulating the broadcast signal, an error correction process) to the tuner 23 , so that it outputs TS to the descrambler 24 .
  • the tuning control portion 59 instructs the descrambler 24 to descramble the TS mentioned above and to output it to the multiplex/demultiplex portion 29 , while it instructs the multiplex/demultiplex portion 29 to demultiplex the TS inputted and to output the video ES demultiplexed to the video decoder portion 30 , and also to output the audio ES to the audio decoder portion 31 .
  • the tuning control portion 59 instructs the decode control portion 57 to decode the video ES and the audio ES, which are inputted into the video decoder portion 30 and the audio decoder portion 31 .
  • the decode control portion 57 controls the video decoder portion 30 to output the video signal decoded into the video conversion processor portion 32 , and controls the audio decoder portion 31 to output the audio signal decoded to the speaker 48 or the audio output 42 . In this manner is executed the control of outputting the video and the audio of the CH, which the user designates.
  • the system control portion 51 instructs the OSD produce portion 60 to produce and output the CH banner.
  • the OSD produce portion 60 receiving the instruction mentioned above transmits the data of the banner produced to the video conversion control portion 61 , and the video conversion control portion 61 receiving the data mentioned above superimposes the CH banner on the video signal, and thereby to output it. In this manner is executed the display of the message when tuning, etc.
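The control sequence above can be summarized as an ordered trace. The function below is a hypothetical model of the message flow for illustration only (the real portions communicate by message passing or function calls over the common bus 22); the step labels mirror the portions named in the text but are not an interface of the apparatus.

```python
def tune_and_display(designated_ch):
    # Hypothetical trace of the broadcast receiving control sequence,
    # as driven by the tuning control portion 59 and the system control
    # portion 51 in the text above.
    steps = [
        f"tuner 23: receive CH {designated_ch} "
        "(tune at designated frequency band, demodulate, error-correct) -> TS",
        "descrambler 24: descramble TS -> multiplex/demultiplex portion 29",
        "multiplex/demultiplex portion 29: demultiplex TS -> video ES, audio ES",
        "video decoder portion 30: decode video ES -> video conversion processor portion 32",
        "audio decoder portion 31: decode audio ES -> speaker 48 / audio output 42",
        "OSD produce portion 60: produce CH banner -> video conversion control portion 61",
    ]
    return steps
```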
  • the system control portion 51 instructs the tuning control portion 59 to tune up to the specific CH and to output a signal to the record/reproduce portion 27 .
  • the tuning control portion 59 receiving the instruction mentioned above, similar to the broadcast receiving process mentioned above, instructs the tuner 23 to receive the designated CH, instructs the descrambler 24 to descramble the MPEG2-TS received from the tuner 23 , and further instructs the multiplex/demultiplex portion 29 to output the input from the descrambler 24 to the record/reproduce portion 27 .
  • the system control portion 51 instructs the recording/reproducing control portion 58 to record the input TS into the record/reproduce portion 27 .
  • the record/reproduce control portion 58 receiving the instruction mentioned above executes a necessary process, such as, an encoding, etc., on the signal (TS) inputted into the record/reproduce portion 27 , and after executing production of additional information necessary when recording/reproducing (e.g., the program information of a recording CH, content information, such as, a bit rate, etc.) and recording into management data (e.g., an ID of recording content, a recording position on the recording medium 26 , a recording format, encryption information, etc.), it executes a process for writing the management data onto the recording medium 26 . In this manner is executed the recording of the broadcast signal.
  • the system control portion 51 instructs the recording/reproducing control portion 58 to reproduce the specific program.
  • to an instruction in this instance are given an ID of the content and a reproduction starting position (for example, at the top of the program, at the position of 10 minutes from the top, continuation from the previous time, at the position of 100 Mbytes from the top, etc.)
  • the recording/reproducing control portion 58 receiving the instruction mentioned above controls the record/reproduce portion 27 , and thereby executes processing so as to read out the signal (TS) from the recording medium 26 with using the additional information and/or the management information, and after treating a necessary process thereon, such as, decryption of encryption, etc., to output the TS to the multiplex/demultiplex portion 29 .
  • the system control portion 51 instructs the tuning control portion 59 to output the video/audio of the reproduced signal.
  • the tuning control portion 59 receiving the instruction mentioned above controls the input from the record/reproduce portion 27 to be outputted into the multiplex/demultiplex portion 29 , and instructs the multiplex/demultiplex portion 29 to demultiplex the TS inputted, and to output the video ES demultiplexed to the video decoder portion 30 , and also to output the audio ES demultiplexed to the audio decoder portion 31 .
  • the tuning control portion 59 instructs the decode control portion 57 to decode the video ES and the audio ES, which are inputted into the video decoder portion 30 and the audio decoder portion 31 .
  • the decode control portion 57 receiving the decode instruction mentioned above controls the video decoder portion 30 to output the video signal decoded to the video conversion processor portion 32 , and controls the audio decoder portion 31 to output the audio signal decoded to the speaker 48 or the audio output 42 . In this manner is executed the process for reproducing the signal from the recording medium.
  • As a method for displaying the 3D video, into which the present invention can be applied, there are several ones, i.e., producing videos, for use of the left-side eye and for use of the right-side eye, so that the left-side eye and the right-side eye can feel the parallax, and thereby inducing a person to perceive as if there exists a 3D object.
  • For example, there is an active shutter method of generating the parallax on the pictures appearing on the left and right eyes, by conducting the light shielding on the left-side and the right-side glasses with using liquid crystal shutters on the glasses which the user wears, and also displaying the videos for use of the left-side eye and for use of the right-side eye in synchronism with that.
  • the receiving apparatus 4 outputs the sync signal and the control signal, from the control signal output portion 43 and the equipment control signal 44 to the glasses of the active shutter method, which the user wears.
  • the video signal is outputted from the video signal output portion 41 to the external 3D video display device, so as to display the video for use of the left-side eye and the video for use of the right-side eye, alternately.
  • the similar 3D display is conducted on the display 47 , which the receiving apparatus 4 has. With doing in this manner, for the user wearing the glasses of the active shutter method, it is possible to view/listen the 3D video on that 3D video display device or the display 47 that the receiving apparatus 4 has.
  • There is also a polarization light method of generating the parallax between the left-side eye and the right-side eye, by separating the videos entering into the left-side eye and the right-side eye, respectively, depending on the polarizing condition, with sticking films crossing at a right angle in linear polarization thereof or treating a linear polarization coat, or sticking films having opposite rotation directions of a polarization axis in circular polarization or treating a circular polarization coat, on the left-side and right-side glasses of a pair of glasses that the user wears, while outputting the video for use of the left-side eye and the video for use of the right-side eye, simultaneously, as polarized lights differing from each other, corresponding to the polarizations of the left-side and the right-side glasses, respectively.
  • the receiving apparatus 4 outputs the video signal from the video signal output portion 41 to the external 3D video display device, and that 3D video display device displays the video for use of the left-side eye and the video for use of the right-side eye under the different polarization conditions. Or, the similar display is conducted by the display 47 , which the receiving apparatus 4 has. With doing in this manner, for the user wearing the glasses of the polarization method, it is possible to view/listen the 3D video on that 3D video display device or the display 47 that the receiving apparatus 4 has.
  • a color separation method may be applied of separating the videos for the left-side and the right-side eyes depending on the color.
  • There is also a parallax barrier method of producing the 3D video with utilizing a parallax barrier, which can be viewed by naked eyes.
  • the 3D display method according to the present invention should not be restricted to a specific method.
  • Determination is made on whether being the 3D program or not, by confirming the information for determining on whether being the 3D program or not, which is newly included in the component descriptor and/or the component group descriptor, described in the table, such as, the PMT and/or EIT [schedule basic/schedule extended/present/following], or confirming the 3D program details descriptor, being a new descriptor for use of determining the 3D program, or by confirming the information for determining on whether being the 3D program or not, which is newly included in the service descriptor and/or the service list descriptor, etc., described in the table, such as, NIT and/or SDT, and so on.
  • Those pieces of information are supplied or added to the broadcast signal in the transmitting apparatus mentioned previously, and are transmitted therefrom. In the transmitting apparatus, those pieces of information are added to the broadcast signal by the management information supply portion 16 .
  • With the PMT, since only the information of present programs is described therein, the information of future programs cannot be confirmed; however, it has a characteristic that the reliability thereof is high.
  • the EIT [schedule basic/schedule extended]
  • With the EIT [following], since it is possible to obtain the information of programs of the next coming broadcast hour(s), it is preferable to be applied into the present embodiment.
  • With the EIT [present], it can be used to obtain the present program information, and the information differing from the PMT can be obtained therefrom.
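The determination above can be sketched as a scan over the descriptor loop of the PMT or EIT. The tag values follow the regulations stated in this text (0x50 component descriptor, 0xD9 component group descriptor, 0xE1 3D program details descriptor), but the set of 3D component_type values belongs to FIG. 5, which is not reproduced here, so the constant below is only a placeholder; the position of component_group_type in the descriptor body is likewise an assumption.

```python
THREE_D_COMPONENT_TYPES = {0x92, 0x93}  # placeholder values, not from FIG. 5

def is_3d_program(descriptors):
    # descriptors: iterable of (descriptor_tag, body bytes) from PMT or EIT.
    for tag, body in descriptors:
        if tag == 0xE1:
            # Presence of the 3D program details descriptor itself
            # identifies the program as a 3D program.
            return True
        if (tag == 0x50 and len(body) >= 2 and body[0] == 0x01
                and body[1] in THREE_D_COMPONENT_TYPES):
            # Component descriptor: stream_content 0x01 (video) carrying
            # a 3D component_type.
            return True
        if tag == 0xD9 and body and (body[0] & 0b111) == 0b001:
            # Component group descriptor: component_group_type "001" = 3D TV.
            return True
    return False
```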
  • FIG. 20 shows an example of the process for each field of the component descriptor within the receiving apparatus 4 .
  • When “descriptor_tag” is “0x50”, it is determined that the said descriptor is the component descriptor. By “descriptor_length”, it is determined to be the descriptor length of the component descriptor. If “stream_content” is “0x01”, “0x05”, “0x06” or “0x07”, then it is determined that the said descriptor is valid (e.g., the video). When other than “0x01”, “0x05”, “0x06” and “0x07”, it is determined that the said descriptor is invalid. When “stream_content” is “0x01”, “0x05”, “0x06” or “0x07”, the following processes will be executed.
  • component_type is determined to be the component type of that component. With this component type, it is assigned with any value shown in FIG. 5 . Depending on this content, it is possible to determine on whether that component is the component of the 3D video program or not.
  • component_tag is a component tag value to be unique within that program, and can be used by corresponding it to the component tag value of the stream descriptor of PMT.
  • “ISO_639_language_code” identifies the language of that component and the language of the description of letters included in this component. For example, if the value is “jpn (“0x6A706E”)”, then the letter codes disposed following thereto are treated as “jpn” (Japanese language).
  • With the component descriptor, it is possible to determine the type of the video component, which builds up the event (e.g., the program), and the component description can be used when selecting the video component in the receiver.
  • The “component_type” of the component descriptor describes only the representative component type of that component, and this value is hardly changed in real time responding to a mode change during the program.
  • component_type described by the component descriptor is referred to when determining a default “maximum_bit_rate” in case where a digital copy control descriptor, being the information for controlling a copy generation and the description of the maximum transmission rate within digital recording equipment, is omitted therefrom, for that event (e.g., the program).
  • The receiving apparatus 4 can observe the “stream_content” and the “component_type”, and therefore there can be obtained an effect of enabling recognition that the program, which is received at present or will be received in future, is the 3D program.
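The validity rule and field extraction above can be sketched as follows; the byte-aligned layout (tag, length, stream_content, component_type, component_tag, ISO_639_language_code) and the helper name are assumptions for illustration.

```python
def process_component_descriptor(desc):
    # Returns the extracted fields, or None when the descriptor is to be
    # treated as invalid under the processing rule above.
    if len(desc) < 8 or desc[0] != 0x50:
        return None                       # not a component descriptor
    stream_content = desc[2]
    if stream_content not in (0x01, 0x05, 0x06, 0x07):
        return None                       # determined invalid
    return {
        "component_type": desc[3],        # checked against FIG. 5 for 3D
        "component_tag": desc[4],         # matched to the PMT stream descriptor
        "language": desc[5:8].decode("ascii"),  # e.g. "jpn"
    }
```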
  • FIG. 21 shows an example of a process upon each field of the component group descriptor, in the receiving apparatus 4 .
  • If “descriptor_tag” is “0xD9”, it is determined that the said descriptor is the component group descriptor.
  • By “descriptor_length”, it is determined to be the descriptor length of the component group descriptor.
  • If “component_group_type” is “000”, it is determined to be the multi-view TV service; on the other hand, if “001”, it is determined to be the 3D TV service.
  • If “total_bit_rate_flag” is “0”, it is determined that the total bit rate within the group of an event (e.g., the program) is not described in that descriptor. On the other hand, if “1”, it is determined that the total bit rate within the group of an event (e.g., the program) is described in that descriptor.
  • “num_of_group” is determined to be a number of the component group(s) within an event (e.g., the program). There is a maximum number, and if the number exceeds that maximum number, there is a possibility of treating it as the maximum value. If “component_group_id” is “0x0”, it is determined to be a main group. If it is other than “0x0”, it is determined to be a sub-group.
  • number_of_CA_unit is determined to be a number of charging/non-charging units within the component group. If exceeding the maximum value, there is a possibility of treating it to be “2”.
  • If “CA_uni_id” is “0x0”, it is determined to be the non-charging unit group. If “0x1”, it is determined to be the charging unit including a default ES group therein. If other than “0x0” and “0x1”, it is determined to be a charging unit identification other than those mentioned above.
  • “number_of_component” belongs to that component group, and is determined to be a number of the component(s) belonging to the charging/non-charging unit, which is indicated by the “CA_uni_id” just before. If exceeding the maximum value, there is a possibility of treating it to be “15”.
  • component_tag is determined to be a component tag value belonging to the component group, and this can be used by corresponding it to the component tag value of the stream descriptor of PMT.
  • “total_bit_rate” is determined to be the total bit rate within the component group. However, when it is “0x00”, it is determined to be the default value.
  • text_length is equal to or less than 16 (or, 8 full-size characters), it is determined to be the component group description length, and if being larger than 16 (or, 8 full-size characters), the explanation exceeding 16 (or, 8 full-size characters) may be neglected.
  • By observing “component_group_type”, the receiving apparatus 4 can recognize that a program being received at present, or to be received in the future, is a 3D program.
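  The field rules above can be sketched as follows. This is an illustrative Python fragment, not code from the patent; the function names are invented, and the clamping values (“2”, “15”) and the 16-byte text limit follow the rules described above.

```python
# Illustrative sketch of how a receiver might apply some of the
# component group descriptor rules described above.

def classify_component_group_type(component_group_type: int) -> str:
    """Map the component_group_type bit field to a service category."""
    if component_group_type == 0b000:
        return "multi-view TV service"
    if component_group_type == 0b001:
        return "3D TV service"
    return "undefined"

def clamp_num_of_ca_units(n: int) -> int:
    """If the number of charging/non-charging units exceeds the
    maximum, it may be treated as 2."""
    return n if n <= 2 else 2

def truncate_group_text(text: bytes) -> bytes:
    """A description longer than 16 bytes (8 full-size characters)
    may be ignored beyond that limit."""
    return text[:16]
```

  A receiver observing `component_group_type == 0b001` through such a check is what lets it recognize the program as a 3D TV service.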
  • FIG. 22 shows an example of processing for each field of the 3D program details descriptor within the receiving apparatus 4.
  • If “descriptor_tag” is “0xE1”, the descriptor is determined to be the 3D program details descriptor.
  • “descriptor_length” gives the descriptor length of the 3D program details descriptor.
  • “3d_2d_type” gives the 3D/2D identification of the 3D program. This is designated from among those shown in FIG. 10B.
  • “3d_method_type” gives the identification of the 3D method of the 3D program. This is designated from among those shown in FIG. 11.
  • “stream_type” gives the format of the ES of the 3D program. This is designated from among those shown in FIG. 3.
  • “component_tag” gives a component tag value that is unique within the 3D program. It can be used by matching it to the component tag value of the stream identifier descriptor of the PMT.
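  A minimal sketch of parsing such a descriptor is shown below. The one-byte-per-field layout after the tag and length is an assumption made for illustration — the actual bit layout is defined by the figures referenced above (FIGS. 10B, 11 and 3), not by this fragment.

```python
# Hypothetical parser for a 3D program details descriptor.
# Assumed layout: descriptor_tag (0xE1), descriptor_length, then
# 3d_2d_type, 3d_method_type, stream_type, component_tag, one byte each.

def parse_3d_program_details(payload: bytes) -> dict:
    if payload[0] != 0xE1:
        raise ValueError("not a 3D program details descriptor")
    length = payload[1]
    body = payload[2:2 + length]
    return {
        "3d_2d_type": body[0],
        "3d_method_type": body[1],
        "stream_type": body[2],
        "component_tag": body[3],
    }
```

  The `component_tag` obtained this way is what a receiver would match against the stream identifier descriptor of the PMT.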
  • FIG. 23 shows an example of processing for each field of the service descriptor within the receiving apparatus 4. If “descriptor_tag” is “0x48”, the descriptor is determined to be the service descriptor. “descriptor_length” gives the description length of the service descriptor. If “service_type” is other than those shown in FIG. 13, the descriptor is determined to be invalid.
  • When receiving the BS/CS digital TV broadcast, “service_provider_name_length” gives the name length of the provider if it is 20 or less; if greater than 20, the provider name is determined to be invalid. When receiving the terrestrial digital TV broadcast, any value other than “0x00” is determined to be invalid.
  • When receiving the BS/CS digital TV broadcast, “char” gives the provider name; when receiving the terrestrial digital TV broadcast, its content is ignored. If “service_name_length” is 20 or less, it gives the name length of the programmed channel; if greater than 20, the programmed channel name is determined to be invalid.
  • “char” gives the programmed channel name. However, if the SDT, in which the descriptors are arranged in accordance with the transmission management regulation explained in FIG. 18 mentioned above, cannot be received, the basic information of the target service is determined to be invalid.
  • By observing “service_type”, the receiving apparatus 4 can recognize that the programmed channel is a 3D program channel.
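  The provider-name validity rule above differs by broadcast path; a sketch of the check, with an invented function name, follows:

```python
# Illustrative check of service_provider_name_length validity,
# per the rule described above.

def provider_name_length_valid(length: int, terrestrial: bool) -> bool:
    """Terrestrial digital broadcast: only 0x00 (no provider name)
    is valid. BS/CS digital broadcast: lengths up to 20 are valid."""
    if terrestrial:
        return length == 0x00
    return length <= 20
```
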
  • FIG. 24 shows an example of processing for each field of the service list descriptor within the receiving apparatus 4. If “descriptor_tag” is “0x41”, the descriptor is determined to be the service list descriptor. “descriptor_length” gives the description length of the service list descriptor.
  • The loop describes one entry per service included in the target transport stream.
  • “service_id” gives the service_id within that transport stream.
  • “service_type” indicates the service type of the target service. Any type other than those defined in FIG. 13 is determined to be invalid.
  • As described above, the service list descriptor can be determined to be information about the transport streams included in the target network.
  • By observing “service_type”, the receiving apparatus 4 can recognize that the programmed channel is a 3D program channel.
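  The loop above can be parsed as a flat byte sequence. In the common SI layout for a service list descriptor, each loop entry is a 16-bit service_id followed by an 8-bit service_type (3 bytes per service); the sketch below assumes that layout.

```python
# Illustrative parser for the service list descriptor loop body,
# assuming 3 bytes per entry: 16-bit service_id, 8-bit service_type.

def parse_service_list(body: bytes):
    services = []
    for i in range(0, len(body) - 2, 3):
        service_id = (body[i] << 8) | body[i + 1]
        service_type = body[i + 2]
        services.append((service_id, service_type))
    return services
```

  A receiver could then scan the resulting `service_type` values to find channels of the 3D service type.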
  • A type indicating 3D video is assigned to “component_type” of the component descriptor (for example, in FIGS. 5C to 5E), and if any component has a “component_type” indicating 3D, that program can be determined to be a 3D program. (For example, with the assignment shown in FIGS. 5C to 5E, etc., it is confirmed that such a value exists in the program information of the target program.)
  • A description indicating the 3D service is assigned to the value of “component_group_type”, and if the value of “component_group_type” indicates the 3D service, the program can be determined to be a 3D program. (For example, with “001” of the bit field assigned to the 3D TV service, etc., it is confirmed that such a value exists in the program information of the target program.)
  • When the 3D program details descriptor arranged in the PMT and/or the EIT, as explained in FIGS. 10 and 11 mentioned above, is used to determine whether the target program is a 3D program, the determination can be made depending on the content of “3d_2d_type” (the 3D/2D type) within the 3D program details descriptor. Also, if the 3D program details descriptor is not transmitted for the received program, it is determined to be a 2D program.
  • It is premised that the receiving apparatus can deal with the 3D method type (i.e., the “3d_method_type” mentioned above) included in the descriptor mentioned above.
  • As for the program information, there is also a method of obtaining it through a communication path for exclusive use thereof (e.g., the broadcast signal, or the Internet).
  • In the above, an explanation was given of various kinds of information (i.e., information included in the tables and/or descriptors) for determining whether the video is 3D or not, by the unit of the service (CH) or the program; however, according to the present invention, not all of them necessarily need to be transmitted. It is enough to transmit the information necessary for the configuration of the broadcasting. Among that information, it is enough to determine whether the video is 3D or not, by the unit of the service (CH) or the program, either by confirming a single piece of information, or by combining plural pieces of information.
  • When making the determination by combining plural pieces of information, it is also possible to determine, for example, that the service is a 3D video broadcasting service but a part of the programs is 2D video, etc.
  • In that case, the receiving apparatus can clearly indicate, for example, that the service is a “3D video broadcasting service” on the EPG, and also, if a 2D video program is mixed into the service in addition to the 3D video programs, it can switch the display control between the 3D video program and the 2D video program when receiving each program.
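  A combined determination along these lines can be sketched as follows. The numeric constants below are placeholders standing in for the assignments in FIGS. 5C to 5E; only the “001” bit field for the 3D TV service is taken from the text above.

```python
# Hypothetical combination of the individual 3D-program signals
# described above. THREE_D_COMPONENT_TYPES uses placeholder values,
# not the actual assignments of FIGS. 5C-5E.

THREE_D_COMPONENT_TYPES = {0x91, 0x92}   # placeholders
THREE_D_GROUP_TYPE = 0b001               # 3D TV service bit field

def is_3d_program(component_types, component_group_types,
                  details_3d_2d_type):
    # Any one signal marking the program as 3D is sufficient.
    if any(t in THREE_D_COMPONENT_TYPES for t in component_types):
        return True
    if THREE_D_GROUP_TYPE in component_group_types:
        return True
    # Absence of the 3D program details descriptor means 2D; a
    # non-zero 3d_2d_type is taken here to indicate 3D.
    if details_3d_2d_type is not None and details_3d_2d_type != 0:
        return True
    return False
```
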
  • The 3D components designated in FIGS. 5C to 5E, for example, are processed (e.g., reproduced, displayed and/or outputted) in 3D if the receiving apparatus 4 can process them appropriately; however, if the receiving apparatus 4 cannot process them appropriately (for example, where there is no function of reproducing 3D video corresponding to the designated 3D transmission method), they may be processed (e.g., reproduced, displayed and/or outputted) in 2D. In that case, together with the display and output of the 2D video, there may be displayed an indication that the 3D video program cannot be appropriately displayed or outputted in 3D on the receiving apparatus.
  • An example of the message display in this case is shown in FIG. 50.
  • In the figure, a reference numeral 701 depicts the entire screen displayed or outputted by the apparatus, and 5001 depicts an example of a message to the user indicating that the program is of a 3D method type that the receiving apparatus 4 cannot deal with.
  • The message 5001 may display an error code presenting the type of error, the 3D method type (for example, the value of “3d_method_type”), or a value combining those. This has the merit of enabling the user to determine the condition inside the receiving apparatus, and so on.
  • First, the system control portion 51 obtains the program information of the present program from the program information analyzer portion 54 (S 201), and determines whether the present program is a 3D program or not, in accordance with the 3D program determining method mentioned above. If the present program is not a 3D program (“no” in S 202), no particular process is conducted. If the present program is a 3D program (“yes” in S 202), it is next confirmed whether the receiving apparatus supports the 3D method type of the present program (S 802).
  • FIG. 48 shows an example of a display of an electronic program table including a 3D program.
  • The electronic program table is constructed mainly on the basis of the program information included in the EIT, which is transmitted multiplexed on the broadcast signal; however, other than that, the program information may be sent as data with a unique multiplexing method, or sent via the Internet, etc.
  • The information applied to the electronic program table includes the program name, the broadcast starting time, the broadcasting period, and other detailed information of the programs (e.g., the cast, the director, information relating to decoding of video and/or audio, the series name, etc.) in relation to an event (the program), and the display of such an electronic program table is conducted as shown in FIG. 48.
  • The EIT is transmitted in relation to not only the program broadcast at present but also programs that will be broadcast in the future.
  • In the receiving apparatus, it is possible to execute the displaying process of the electronic program table, which will be mentioned hereinafter, and so on, with the information included in the EIT, in relation to the program broadcast at present as well as programs to be broadcast in the future.
  • A reference numeral 701 shown in the figure depicts the entire screen that the apparatus displays or outputs, and 4801 depicts the entire electronic program table presented on that screen, wherein the vertical axis presents the service (CH: channel) while the horizontal axis presents the time; this example shows an electronic program table of the services 1CH, 3CH, 4CH and 6CH from 7:00 until 12:00. Further, when displaying the electronic program table, only the electronic program table may be displayed, without reproducing the program received at present. These processes may be executed, in the receiving apparatus shown in FIG. 25, under the control of the CPU 21 (e.g., the system control portion 51 and the OSD produce portion 60), within the video conversion processor portion 32.
  • For a program determined to be a 3D program, a mark (hereinafter called the “3D program mark”), such as 4802, for example, from which it can be seen that the program is a 3D program, is displayed in such a manner that the mark can be seen attached to that program (for example, within the rectangle indicating the program, or in a predetermined range surrounding that rectangle).
  • Further, the 3D method type of the program may be obtained and determined from the information of the “3d_method_type” mentioned above, and characters or a mark indicating the method of the 3D broadcast may be displayed.
  • For example, there may be displayed the mark “MVC” for indicating the multi-viewpoint encoding method.
  • With this, the user can see that the program is a 3D program, as well as with which 3D method type it is broadcast.
  • In case the 3D method type obtained from the “3d_method_type” mentioned above is not supported by the receiving apparatus, there can be considered a method of changing the displayed content depending on whether the receiving apparatus supports the 3D method type of the program: for example, displaying a mark indicating “not supported” (for example, “X” in the figure), or changing the display color (for example, conducting a display like shading in the figure, or changing the color of the region of the electronic program display); or, in case the program is of a 3D method type that the receiving apparatus supports, a method of displaying a mark indicating “supported” (for example, “○” in place of “X” at that displaying position in the figure). By doing this, it is possible to let the user easily recognize whether the program is of a 3D method type that the receiving apparatus supports.
  • Combining those displays, it is also applicable to indicate that the program is of a 3D method type that the apparatus itself does not support, by changing the display color while displaying the 3D method type of the program. In that case, the user can easily see whether the 3D method type is supported by the receiving apparatus while confirming the 3D method type of the program.
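  The mark decision above amounts to a small lookup against the set of 3D methods the apparatus supports. The sketch below is illustrative: the function name, the supported set, and the “O”/“X” mark texts are assumptions standing in for the marks in the figure.

```python
# Hypothetical EPG mark decision: show the 3D method type plus a
# supported ("O") or non-supported ("X") indicator.

SUPPORTED_3D_METHODS = {"Side-by-Side", "MVC"}   # assumed capability

def epg_3d_mark(is_3d: bool, method: str) -> str:
    if not is_3d:
        return ""                       # 2D programs get no mark
    flag = "O" if method in SUPPORTED_3D_METHODS else "X"
    return f"3D:{method} {flag}"
```
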
  • Also, the 3D program mark may be displayed in a region separate from the selected region.
  • For example, it may be displayed, as shown by a reference numeral 4902, together with the detailed information of the program (for example, a CH number, a broadcasting time and a program name, shown by a reference numeral 4901), outside the rectangular region indicating the selected program mentioned above.
  • The example shown in FIG. 49 provides the regions for displaying the 3D program mark 4902 and the detailed information 4901 of the program outside a region 4903 for displaying the program list of the electronic program table.
  • As another method for displaying the electronic program table, when the user does a specific operation through the remote controller (for example, pushing down a button, or a setup on a menu), or when she/he opens an electronic program table for exclusive use of 3D programs, or if the apparatus is 3D-enabled, only the 3D programs may be displayed on the electronic program table. By doing this, it is easy for the user to find a 3D program.
  • Also, the 3D program mark may be displayed on a program display (for example, a CH banner), which is displayed when tuning to a program, when the program information changes, or when the user pushes down a specific button (for example, “screen display”).
  • In that case too, the 3D program mark may be displayed together with the detailed information of the program(s), such as the CH number, the broadcasting time and/or the program name, etc. Also, the display shown in FIG. 53 may be made when displaying the 3D program in 3D.
  • As for the display of the 3D program mark, it may be letters, such as “3D”, included at a specific position (for example, at the head portion) in the character data of the electronic program table (for example, the “text_char” portion of the short-format event descriptor included in the EIT).
  • When the user executes an instruction for switching to 3D output/display (for example, pushing down the “3D” key on the remote controller), etc., the user instruction receive portion 52, receiving the key code mentioned above, instructs the system control portion 51 to switch to the 3D video (however, in the processing given hereinafter, a similar process will be done for content of the 3D 2-viewpoints separate ES transmission method, even when switching to 3D output/display under a condition other than the user's instruction to switch to 3D display/output of the 3D content).
  • Next, the system control portion 51 determines whether the present program is a 3D program or not, in accordance with the method mentioned above.
  • If the present program is a 3D program, the system control portion 51 first instructs the tuning control portion 59 to output the 3D video.
  • The tuning control portion 59, upon receipt of the instruction mentioned above, first obtains the PID (packet ID) and the encoding method (for example, H.264/MVC, MPEG2, H.264/AVC, etc.) for each of the main viewpoint video ES and the sub-viewpoint video ES mentioned above, from the program information analyze portion 54, and next controls the multiplex/demultiplex portion 29 to demultiplex the main viewpoint video ES and the sub-viewpoint video ES, thereby outputting them to the video decoder portion 30.
  • Herein, the multiplex/demultiplex portion 29 is controlled so that, for example, the main viewpoint video ES mentioned above is inputted into a first input of the video decode portion while the sub-viewpoint video ES mentioned above is inputted into a second input thereof.
  • Thereafter, the tuning control portion 59 transmits to the decode control portion 57 information indicating that the first input of the video decode portion 30 is the main viewpoint video ES and the second input thereof is the sub-viewpoint video ES, together with the respective encoding methods thereof, and also instructs it to decode those ES.
  • The video decode portion 30 may be constructed to have plural decoding functions corresponding to the respective encoding methods.
  • Alternatively, the video decode portion 30 may be constructed to have only a decoding function corresponding to a single encoding method. In this case, the video decode portion 30 can be constructed cheaply.
  • The decode control portion 57, receiving the instruction mentioned above, executes decoding of the main viewpoint video ES and the sub-viewpoint video ES, respectively, and outputs the video signals for the left eye and for the right eye to the video conversion processor portion 32.
  • Herein, the system control portion 51 instructs the video conversion control portion 61 to execute the 3D outputting process.
  • The video conversion control portion 61, receiving the instruction mentioned above, controls the video conversion processor portion 32 to output the 3D video from the video output 41, or to display it on the display 47 equipped in the receiving apparatus 4.
  • An explanation of that 3D reproducing/outputting/displaying method will be given by referring to FIGS. 37A and 37B.
  • FIG. 37A is a view for explaining the reproducing/outputting/displaying method dealing with the output and display of the frame sequential method, which alternately displays and outputs the videos of the left and right viewpoints of 3D content of the 3D 2-viewpoints separate ES transmission method.
  • Frame lines (M 1, M 2, M 3 . . . ) in the upper left portion of the figure present the plural frames included in the main viewpoint (left-eye) ES of the content of the 3D 2-viewpoints separate ES transmission method, while frame lines (S 1, S 2, S 3 . . . ) in the lower left portion present the plural frames included in the sub-viewpoint (right-eye) ES thereof.
  • The video conversion processor portion 32 outputs/displays each frame of the inputted main viewpoint (left-eye)/sub-viewpoint (right-eye) video signals alternately as the video signals, as shown by the frame line (M 1, S 1, M 2, S 2, M 3, S 3 . . . ) on the right side in the figure.
  • At the same time, sync signals are outputted from the control signal 43, enabling each video signal to be identified as that for the main viewpoint (left eye) or that for the sub-viewpoint (right eye), respectively.
  • An external video outputting device receiving the video signals and sync signals mentioned above outputs the videos of the main viewpoint (left eye) and the sub-viewpoint (right eye) in accordance with the sync signals, and also transmits the sync signals to the 3D view/listen assisting device, thereby enabling the 3D display.
  • Alternatively, the sync signals to be outputted from the external video outputting device may be produced within that external video outputting device.
  • Further, the sync signals mentioned above are outputted from the equipment control signal transmit terminal 44, passing through the equipment control signal transmitter portion 53 and the control signal transmitter portion 33, so as to control the external 3D view/listen assisting device (for example, switching the light shutoff of the active shutter); thereby the 3D display is conducted.
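  The frame sequential output described above — alternating main and sub frames with a per-frame viewpoint sync flag — can be sketched as follows. The function and the “L”/“R” flag values are illustrative, not taken from the patent.

```python
# Illustrative frame sequential interleaving: emit M1, S1, M2, S2, ...
# with a sync flag identifying each frame's viewpoint, as carried on
# the control signal for the 3D view/listen assisting device.

def frame_sequential(main_frames, sub_frames):
    out = []
    for m, s in zip(main_frames, sub_frames):
        out.append((m, "L"))   # main viewpoint (left eye)
        out.append((s, "R"))   # sub-viewpoint (right eye)
    return out
```

  The sync flags are what let active-shutter glasses open the correct eye for each displayed frame.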
  • FIG. 37B is a view for explaining the reproducing/outputting/displaying method dealing with the output and display of a method that displays the videos of the left and right viewpoints of 3D content of the 3D 2-viewpoints separate ES transmission method in different areas or regions of the display.
  • Similarly to the above, the streams of the 3D 2-viewpoints separate ES transmission method are decoded in the video decode portion 30, and the video conversion process is executed in the video conversion processor portion 32.
  • The display areas or regions need not be in units of lines; for example, in the case of a display having different pixels for each viewpoint, a combination of plural pixels for the main viewpoint (left eye) and a combination of plural pixels for the sub-viewpoint (right eye) may be used as the display areas or regions, respectively.
  • In the display device of the polarization method mentioned above, it is enough to output, from the different areas or regions mentioned above, videos having polarization conditions different from each other, corresponding to the respective polarization conditions of the left eye and the right eye of the 3D view/listen assisting device.
  • With this method, the resolution at which each viewpoint can be displayed on the display becomes less than that of the method shown in FIG. 37A; however, the video for the main viewpoint (left eye) and the video for the sub-viewpoint (right eye) can be outputted/displayed at the same time, and there is no necessity of displaying them alternately. With this, it is possible to obtain a 3D display having less flicker than the method shown in FIG. 37A.
  • When applying the method shown in FIG. 37B, the 3D view/listen assisting device may be polarization separation glasses, and there is no particular necessity of executing an electronic control. In this case, the 3D view/listen assisting device can be supplied at a much cheaper price.
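  A common instance of the different-areas method — assigning alternate display lines to each viewpoint for a line-polarized panel — can be sketched as follows; the function name and even/odd assignment are illustrative.

```python
# Illustrative line interleaving for a polarization-type display:
# even-index display lines take the main (left-eye) view, odd-index
# lines the sub (right-eye) view. Per-eye vertical resolution is
# halved, but both views appear simultaneously (no alternate-frame
# flicker).

def line_interleave(left_lines, right_lines):
    return [l if i % 2 == 0 else r
            for i, (l, r) in enumerate(zip(left_lines, right_lines))]
```
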
  • When the user gives an instruction to switch to 2D video (for example, pushing down the “2D” button on the remote controller), the user instruction receive portion 52, receiving the key code mentioned above, instructs the system control portion 51 to switch the signal to 2D video (however, in the processing given hereinafter, a similar process will be done even when switching to 2D output/display under a condition other than the user's instruction to switch to 2D display/output of content of the 3D 2-viewpoints separate ES transmission method).
  • The system control portion 51 first gives an instruction to the tuning control portion 59 to output the 2D video.
  • The tuning control portion 59, receiving the instruction mentioned above, first obtains the PID of the ES for 2D video (i.e., the main viewpoint ES mentioned above, or an ES having the default tag) from the program information analyze portion 54, and controls the multiplex/demultiplex portion 29 to output the ES mentioned above to the video decoder portion 30. Thereafter, the tuning control portion 59 instructs the decode control portion 57 to decode the ES mentioned above.
  • The decode control portion 57, receiving the instruction mentioned above, controls the video decoder portion 30 to decode the ES mentioned above, and outputs the video signal to the video conversion processor portion 32.
  • Herein, the system control portion 51 controls the video conversion control portion 61 to make the 2D output of the video.
  • The video conversion control portion 61, receiving the instruction mentioned above from the system control portion 51, controls the video conversion processor portion 32 to output the 2D video signal from the video output terminal 41, or to display the 2D video on the display 47.
  • In the above, a description was made of a method of not decoding the ES for the right eye as the method for outputting/displaying in 2D; however, the 2D display may also be executed by decoding both the ES for the left eye and the ES for the right eye, and executing a culling or thinning-out process in the video conversion processor portion 32.
  • In that case, since no process for switching the decoding and/or the demultiplexing process is needed, there can be expected effects such as a reduction of the switching time and/or simplification of the software processing, etc.
  • The system control portion 51 first instructs the tuning control portion 59 to output the 3D video.
  • The tuning control portion 59, receiving the instruction mentioned above, obtains the PID (packet ID) of the 3D video ES, which includes the 3D video therein, and the encoding method (for example, MPEG2, H.264/AVC, etc.) from the program information analyze portion 54; next it controls the multiplex/demultiplex portion 29 to demultiplex the 3D video ES mentioned above, thereby outputting it to the video decoder portion 30, and also controls the video decoder portion 30 to execute the decoding process corresponding to the encoding method, thereby outputting the decoded video signal to the video conversion processor portion 32.
  • Herein, the system control portion 51 instructs the video conversion control portion 61 to execute the 3D outputting process.
  • The video conversion control portion 61, receiving the instruction mentioned above from the system control portion 51, instructs the video conversion processor portion 32 to divide the inputted video signal into the video for the left eye and the video for the right eye, and to apply a process such as scaling, etc. (details will be mentioned later).
  • The video conversion processor portion 32 outputs the converted video signal from the video output portion 41, or displays the video on the display equipped in the receiving apparatus 4.
  • FIG. 39A is a view for explaining the reproducing/outputting/displaying method dealing with the output and display of the frame sequential method, which alternately displays and outputs the videos of the left and right viewpoints of 3D content of the Side-by-Side method or the Top-and-Bottom method.
  • Each frame of the decoded video signals of the Side-by-Side method mentioned above is divided into the video for the left eye and the video for the right eye, and is further treated with scaling (i.e., extension/interpolation, or compression/thinning, etc., so as to fit the horizontal size of the output video). Then the frames are outputted alternately as the video signals.
  • FIG. 39B is a view for explaining the reproducing/outputting/displaying method dealing with the output and/or display of a method of displaying the videos of the left/right viewpoints of 3D content of the Side-by-Side method or the Top-and-Bottom method in different areas or regions on the display. Similarly to FIG. 39A, the frame line shown on the left-hand side in the figure presents the video signal of the Side-by-Side method, in which the videos for the left eye and for the right eye are arranged on the left side and the right side of one (1) frame.
  • Each frame of the decoded video signals of the Side-by-Side method mentioned above is divided into the video for the left eye and the video for the right eye, and is further treated with scaling (i.e., extension/interpolation, or compression/thinning, etc., so as to fit the horizontal size of the output video).
  • The video for the left eye and the video for the right eye, on which the scaling has been applied, are outputted or displayed in the different areas or regions. Similar to the explanation given in FIG. 37B, for the purpose of displaying the videos in the different areas or regions, there is, for example, a method of displaying them using the odd-number lines and the even-number lines of the display as the display areas for the main viewpoint (left eye) and for the sub-viewpoint (right eye), respectively, and so on.
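  The split-and-scale step above can be sketched on a frame modeled as rows of pixels. Pixel doubling below is a crude stand-in for the extension/interpolation scaling the text describes; the function name is illustrative.

```python
# Illustrative Side-by-Side split: the left half of each row is the
# left-eye view, the right half the right-eye view. Each half is then
# scaled back to full width by pixel doubling (a crude stand-in for
# real extension/interpolation).

def split_and_scale_side_by_side(frame):
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    def double(rows):
        return [[p for p in row for _ in (0, 1)] for row in rows]
    return double(left), double(right)
```
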
  • the system control portion 51 receiving the instruction mentioned above instructs the video conversion control portion 61 to output the 2D video therefrom.
  • the video conversion control portion 61 receiving the instruction mentioned above from the system control portion 51 , controls the video conversion processor portion 32 to output the 2D video responding to the inputted video signal mentioned above.
  • FIG. 40A illustrates the explanation of the Side-by-Side method
  • FIG. 40B the Top-and-Bottom method; however, in either of them, the difference lies only in the arrangements of the video for use of the left-side eye and the video for use of the right-side eye within the video, and therefore the explanation will be made by referring to the Side-by-Side method shown in FIG. 40A .
  • the frame line (L 1 /R 1 , L 2 /R 2 , L 3 /R 3 . . .
  • FIG. 1 shown on the left-hand side in the figure presents the video signal of the Side-by-Side method, in which the video signals for use of the left-side eye and for use of the right-side eye are disposed on the left side/the right side of one (1) frame.
  • the video conversion processor portion 32 divides each frame of the above-mentioned video signal of the Side-by-Side method, which is inputted, into each frame of the video for use of the left-side eye or the video for use of the right-side eye, and thereafter treats the scaling only upon a portion of the main viewpoint video (e.g., the video for use of the left-side eye); thereby outputting only the main viewpoint video (e.g., the video for use of the left-side eye) as the video signal, as is shown by the frame line (L 1 , L 2 , L 3 . . . ) on the right-hand side in the figure.
  • the video conversion processor portion 32 outputs the video signal processed as mentioned above as the 2D video from the video output portion 41, and also outputs the control signal from the control signal output portion 43. In this manner, the 2D output/display is conducted.
  • it is also possible to conduct the 2D output/display while keeping the 3D content of the Side-by-Side method or the Top-and-Bottom method as it is, i.e., with the 2 viewpoints stored in one (1) screen; such a case is shown in FIGS. 40C and 40D.
  • in that case, the video may be output from the receiving apparatus while keeping the videos of the 2 viewpoints of the Side-by-Side method or the Top-and-Bottom method stored in one (1) screen, and the conversion for the 3D display may be conducted in the viewing/listening device.
  • FIG. 41 shows an example of a flow of processes of the system control portion 51, which is executed on an occasion such as a change of the present program and/or of the program information at the time when the program changes.
  • the example shown in FIG. 41 is a flow which first executes the 2D display of one viewpoint (for example, the main viewpoint), whether the program is a 2D program or a 3D program.
  • the system control portion 51 obtains the program information of the present program from the program information analyze portion 54, determines whether the present program is a 3D program or not with the method for determining the 3D program mentioned above, and further obtains the 3D method type of the present program (for example, determined from the 3D method type described in the 3D program details descriptor, such as the 2-viewpoints separate ES transmission method, the Side-by-Side method, etc.) from the program information analyze portion 54 (S401).
  • the program information of the present program may also be obtained periodically, not only at the time when the program changes.
  • the system control portion 51 executes such a control that one viewpoint (for example, the main viewpoint) of the 3D video signal is displayed in 2D, in the format corresponding to the respective 3D method type, with the methods explained in FIGS. 38, 40A and 40B (S404).
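As a rough illustration of step S404, the choice of 2D conversion can be keyed to the 3D method type obtained from the program information. The constant names and the returned description strings below are hypothetical placeholders, not code values defined in the disclosure.

```python
# Illustrative method-type constants; the actual code values are carried
# in the 3D program details descriptor of the broadcast stream.
SIDE_BY_SIDE = "side_by_side"
TOP_AND_BOTTOM = "top_and_bottom"
TWO_VIEW_ES = "2_viewpoints_separate_es"

def select_2d_conversion(method_type):
    """Pick the 2D-display conversion for one viewpoint (the main
    viewpoint), depending on the 3D transmission method."""
    if method_type == SIDE_BY_SIDE:
        return "crop left half and scale horizontally"
    if method_type == TOP_AND_BOTTOM:
        return "crop top half and scale vertically"
    if method_type == TWO_VIEW_ES:
        return "decode and output the main-viewpoint ES only"
    return "output video unchanged (2D program)"

assert select_2d_conversion(SIDE_BY_SIDE).startswith("crop left")
```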
  • a display indicating that the program is a 3D program may be superimposed on the 2D display video of the program. In this manner, when the present program is a 3D program, the video of one viewpoint (for example, the main viewpoint) is displayed in 2D.
  • with this, the user can view/listen to it in the same manner as a 2D program when the user is not ready for 3D viewing/listening, for example, when the user does not wear the 3D view/listen assisting device, etc.
  • it is also possible not to output the 3D content of the Side-by-Side method or the Top-and-Bottom method as it is, i.e., with the 2 views stored in one (1) screen, as is shown in FIGS.
  • FIG. 42 shows an example of a message which is displayed by the system control portion 51 through the OSD produce portion 60 when the video is displayed in 2D in step S404.
  • a message is displayed for informing the user that the 3D program has started, and further an object 1602 (hereinafter called a “user response receiving object”; for example, a button in the OSD) for the user to make a response is displayed, so as to let her/him select the operation thereafter.
  • the user instruction receive portion 52 informs the system control portion 51 that “OK” has been pushed down.
  • the user selection is determined as “exchange to 3D”.
  • the user selection is determined as “other than exchange to 3D”.
  • for example, when the user makes an operation which brings the condition indicating whether the user's preparation for 3D viewing/listening is completed or not (i.e., the 3D view/listen preparation condition) into “OK” (for example, wearing the 3D glasses), the user selection becomes “exchange to 3D”.
  • a flow of processes in the system control portion 51, to be executed after the user makes the selection, is shown in FIG. 43.
  • the system control portion 51 obtains the result of the user selection from the user instruction receive portion 52 (S501). If the user selection is not “exchange to 3D” (“no” in S502), the process ends with the video remaining displayed in 2D, and no particular process is executed.
  • the video is displayed in 3D in accordance with the 3D displaying method mentioned above.
  • herein, the object is displayed for the user to respond to; however, it may be only a letter, a logo, a mark, etc., simply indicating that said program is enabled for 3D viewing/listening, such as “3D program”, etc.
  • when the user recognizes that the program is enabled for 3D viewing/listening, it is enough to push down the “3D” key on the remote controller, so that the display is exchanged from 2D into 3D on the occasion of a notice from the user instruction receive portion 52, which receives the signal from that remote controller, to the system control portion 51.
  • as the message display to be displayed in step S404, there may be considered a method of not only simply displaying “OK”, but also clearly indicating or asking whether the method for displaying the program should be 2D video or 3D video. Examples of the message and the user response receiving object in that case are shown in FIG. 44.
  • the user response receiving object shown in FIG. 42 or 44 may be displayed on the screen. In that case, the user can make an operation for exchanging the video between 2D/3D while confirming the message mentioned above.
  • if the message is displayed at the timing when the video is exchanged into the 3D video (for example, when the user pushes down the 3D button), since the possibility that the user is operating the apparatus is high, there is a merit that the possibility that the user views the message is also high.
  • the sound effects mentioned above may be transmitted after being multiplexed onto the audio ES or the data broadcast ES on the side of the broadcasting station, and may be reproduced and/or output by the receiving apparatus, for example. Alternatively, sound effects which the receiving apparatus has therein may be reproduced and/or output (for example, the data thereof is read out within the audio decoder portion 31, or from a ROM or the recording medium 26, to be output after being decoded).
  • a processing flow executed in the system control portion 51 when the 3D program starts is shown in FIG. 45.
  • an aspect differing from the processing flow shown in FIG. 41 lies in that a step for outputting a specific video/audio (S405) is added in place of the processing of S404.
  • as the specific video mentioned herein there can be listed, for example, a message calling attention to the preparation for 3D, a black screen, a still picture of the program, etc., while as the audio there can be listed silence, music of a fixed pattern (e.g., ambient music), etc.
  • displaying the video of a fixed pattern (e.g., a message, an ambient picture, or the 3D video, etc.) can be achieved by reading out the data thereof from inside the video decoder portion 30, from a ROM not shown in the figure, or from the recording medium 26, to be output after being decoded.
  • outputting the black screen can be achieved by, for example, the video decoder portion 30 outputting a video signal indicating only a black color, or the video conversion processor portion 32 outputting mute or black video as the output signal.
  • outputting the audio of a fixed pattern (e.g., the silence or the ambient music) can be achieved by reading out the data, for example, within the audio decoder portion 31 or from the ROM or the recording medium 26, to be output after being decoded, or by muting the output signal, etc.
  • the message display to be displayed in step S405 is as shown in FIG. 46.
  • an aspect differing from that shown in FIG. 42 lies only in the video and the audio which are displayed; the configurations of the message and/or the user response receiving object to be displayed, and the operation of the user response receiving object, are the same.
  • FIG. 27 shows an example of a flow executed in the system control portion 51 when the time-period until the start of the next program changes due to the tuning process or the like, or when it is determined that the starting time of the next program has changed, based on the starting time of the next program included in the EIT of the program information transmitted from the broadcasting station, on the information of the ending time of the present program, etc.
  • first, the system control portion 51 obtains the program information of the next coming program from the program information analyze portion 54 (S101), and determines whether the next program is a 3D program or not, in accordance with the method for determining the 3D program mentioned above.
  • when the next coming program is not a 3D program (“no” in S102), the process is ended without any particular processing.
  • when the next coming program is a 3D program (“yes” in S102), the time-period until the next program starts is calculated.
  • in more detail, the starting time of the next program or the ending time of the present program is obtained from the EIT of the program information obtained above, while the present time is obtained from the time management portion 55, and the difference between them is calculated.
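The calculation above amounts to a simple time difference. A minimal sketch follows; the function name and the sample timestamps are hypothetical.

```python
from datetime import datetime

def seconds_until_next_program(eit_start_time, now):
    """Difference between the next program's start time taken from the
    EIT and the present time kept by the time management portion 55."""
    return (eit_start_time - now).total_seconds()

remaining = seconds_until_next_program(
    datetime(2011, 12, 15, 21, 0, 0),    # hypothetical EIT start time
    datetime(2011, 12, 15, 20, 58, 30),  # hypothetical present time
)
assert remaining == 90.0  # message can be shown if remaining <= X
```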
  • FIG. 28 shows an example of the message to be shown in that instance.
  • a reference numeral 701 depicts the entire screen that the apparatus displays, and 702 the message itself, respectively. In this manner, it is possible to call the user's attention to the preparation of the 3D view/listen assisting device before the 3D program starts.
  • regarding the determination time “X” before the program starts: if it is made small, there is a possibility that the user cannot complete the preparation for 3D viewing/listening before the program starts; if it is made large, there are demerits such as the message display persisting for a long time and a pause being generated after completion of the preparation; therefore, it must be adjusted to an appropriate time-period.
  • the starting time of the next coming program may be displayed in detail when displaying the message to the user.
  • An example of the screen display in that case is shown in FIG. 29 .
  • a reference numeral 802 is the message displaying the time until the 3D program starts.
  • herein the time is described in units of minutes; however, it may be described in units of seconds.
  • in that case, the user is able to know the starting time of the next program in more detail, but there is also a demerit of increasing the processing load.
  • a reference numeral 902 depicts the message for alerting the start of the 3D program, and 903 the mark, which can be seen three-dimensionally when the user uses the 3D view/listen assisting device.
  • a countermeasure, such as mending or replacing, etc.
  • a reference numeral 1001 depicts an entire of the message, and 1002 a button for the user to make the response, respectively.
  • the user instruction receive portion 52 notifies the system control portion 51 that “OK” has been pushed down.
  • the system control portion 51, receiving the notice mentioned above, stores the fact that the 3D view/listen preparation condition is “OK” as a condition.
  • next, explanation will be given of a processing flow in the system control portion when the present program changes to a 3D program after an elapse of time, by referring to FIG. 32.
  • the system control portion 51 obtains the program information of the present program from the program information analyze portion 54 (S201), and determines whether the present program is a 3D program or not, in accordance with the method mentioned above for determining the 3D program.
  • when the present program is not a 3D program (“no” in S202), such a control is executed that the video is displayed in 2D in accordance with the method mentioned above (S203).
  • otherwise, the control is made so as to display the video in 3D in accordance with the method mentioned above (S206). In this manner, the 3D display of the video is executed when it can be confirmed that the present program is a 3D program and that the 3D view/listen preparation is completed.
  • as the message display to be displayed in step S104, there can be considered not only simply displaying “OK”, as shown in FIG. 41, but also clearly indicating or asking whether the display method of the next coming program should be 2D video or 3D video. Examples of the message and the user response receiving object in that case are shown in FIGS. 33 and 34.
  • herein, the determination of the user's 3D view/listen preparation condition is made upon the user's operation on the menu through the remote controller; however, other methods may be applied, such as determining the 3D view/listen preparation condition mentioned above on the basis of a wearing completion signal generated by the 3D view/listen assisting device, or determining that she/he wears the 3D view/listen assisting device by photographing the viewing/listening condition of the user with an image pickup or photographing device and performing image recognition or face recognition of the user on the photographed result.
  • there may also be a method of determining the 3D view/listen preparation condition to be “OK” when the user pushes down the <3D> button of the remote controller, or a method of determining the 3D view/listen preparation condition to be “OK” when the user pushes down a <2D> button, a <return> button or a <cancel> button of the remote controller.
  • in those cases there is a demerit, such as transmission of the condition caused by an error or misunderstanding, etc.
  • it can also be considered to execute the processing while making the determination only from the program information of the next coming program, which was obtained previously, without obtaining the information of the present program.
  • in step S201 shown in FIG. 32, there can be considered a method of using the program information obtained previously (for example, in step S101 shown in FIG. 27), without newly making the determination of whether the present program is a 3D program or not.
  • in that case, there is a merit that the processing configuration becomes simple, etc.; however, there is a demerit, such as a possibility that the 3D video exchange process is executed even in the case where the next coming program is not a 3D program, due to a sudden change of the program configuration.
  • for the user, it is possible to view/listen to the 3D program under much better conditions, in particular in the starting part of the 3D program; i.e., the user can complete the 3D view/listen preparation in advance, or, when she/he is not in time for the start of the 3D program, can display the video again after completing the preparation for viewing/listening to the 3D program, with use of the recording/reproducing function.
  • this improves the convenience for the user; i.e., the video display is automatically exchanged into a display method which can be considered desirable or preferable for the user (e.g., the 3D video display when she/he wishes to view/listen to the 3D video, or the contrary thereto).
  • when the program is changed into a 3D program through tuning, or when reproduction of a recorded 3D program starts, etc.
  • the “3d_2d_type” (type of 2D/3D) information, which is explained in FIG. 10B
  • the “3d_method_type” (type of the 3D method) information, which is explained in FIG. 11.
  • the “3d_2d_type” (type of 2D/3D) information and the “3d_method_type” (type of the 3D method) information may be separate, but it is also possible to build up information which identifies whether the video is 3D or 2D and which 3D method the 3D program uses, together.
  • encoding may be executed with the 2D/3D type information and the 3D method type information mentioned above included in the user data area following the “Picture Header” and the “Picture Coding Extension”.
  • alternatively, encoding may be executed with the 2D/3D type information and the 3D method type information mentioned above included in the additional information (e.g., supplemental enhancement information) area included in an access unit.
  • since the identification mentioned above can be made in a unit shorter than when storing it in the PMT (Program Map Table), it is possible to improve the speed at which the receiver responds to switching between the 3D video/2D video in the transmitted video, and also to further suppress noises which can be generated when switching between the 3D video/2D video.
  • on the broadcasting station side, for example, only the encode portion 12 in the transmitting apparatus 1 shown in FIG. 2 needs to be renewed to enable a 2D/3D mixed broadcasting; there is no necessity of changing the structure of the PMT (Program Map Table) added in the management information supply portion 16, and therefore it is possible to start the 2D/3D mixed broadcasting at a lower cost.
  • when the 3D related information (in particular, the information for identifying 3D/2D, such as the “3d_2d_type” (type of 3D/2D) information and/or the “3d_method_type” (type of the 3D method) information) is not stored in the predetermined area or region of the video, the receiver may be constructed in such a manner that it determines said video to be 2D video.
  • in that case, the broadcasting station can omit storing that information when executing the encoding, thereby enabling a reduction of the number of processes in broadcasting.
  • FIG. 54 shows an example of the structure of the content descriptor, as one of the program information.
  • This descriptor is disposed in the EIT.
  • in the content descriptor, information indicative of program characteristics can be described, in addition to the genre information of the event (the program).
  • “descriptor_tag” is an 8-bit field for identifying the descriptor itself, in which the value “0x54” is described so that this descriptor can be identified as the content descriptor.
  • “descriptor_length” is an 8-bit field, in which the size of this descriptor is described.
  • “content_nibble_level_1” (genre 1) is a 4-bit field, which presents the first-stage grouping or classification of the content identification. In more detail, the large group of the program genre is described therein. When indicating the program characteristics, “0xE” is designated.
  • “content_nibble_level_2” (genre 2) is a 4-bit field, which presents the second-stage grouping or classification of the content identification, in more detail than “content_nibble_level_1” (genre 1). In more detail, the middle group of the program genre is described therein.
  • when “content_nibble_level_1” is “0xE”, the “user_nibble” designates a sort or type of a program characteristic code table.
  • two (2) pieces of the 4-bit “user_nibble” field can be disposed, and upon the combination of the values of the two (2) “user_nibble” fields (hereinafter, the bits disposed in front being called the “first user_nibble” bits, and the bits disposed in the rear the “second user_nibble” bits), it is possible to define the program characteristics.
  • the receiver receiving that content descriptor determines that said descriptor is the content descriptor when the “descriptor_tag” is “0x54”. Also, from the “descriptor_length” it can determine the end of the data described within this descriptor. Further, it determines the description equal to or shorter than the length presented by the “descriptor_length” to be valid, neglects the portion exceeding that, and executes the process accordingly.
  • the receiver determines whether the value of “content_nibble_level_1” is “0xE” or not, and determines it to be the large group of the program genre when it is not “0xE”. When it is “0xE”, it determines not that it is the genre, but that one of the program characteristics is designated by the “user_nibble” following thereafter.
  • the receiver determines “content_nibble_level_2” to be the middle group of the program genre when the value of “content_nibble_level_1” mentioned above is not “0xE”, and uses it in searching, displaying, etc., together with the large group of the program genre.
  • when the value of “content_nibble_level_1” mentioned above is “0xE”, the receiver determines that the sort of the program characteristic code table, defined by the combination of the “first user_nibble” bits and the “second user_nibble” bits, is indicated.
  • the receiver determines the bits to indicate the program characteristics on the basis of the “first user_nibble” bits and the “second user_nibble” bits when “content_nibble_level_1” mentioned above is “0xE”. In the case where the value of “content_nibble_level_1” is not “0xE”, they are neglected even if any values are inserted in the “first user_nibble” bits and the “second user_nibble” bits.
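The parsing rules above (tag check, length bound, nibble split) can be sketched as a minimal parser. The function name and the dictionary layout are illustrative assumptions; interpretation of the nibble values is left to the caller.

```python
def parse_content_descriptor(data):
    """Sketch of the receiver-side rules: verify descriptor_tag (0x54),
    bound the payload by descriptor_length (anything beyond is
    neglected), then split each 2-byte entry into the two
    content_nibble_level fields and the two user_nibble fields."""
    if data[0] != 0x54:              # not a content descriptor
        return None
    length = data[1]                 # descriptor_length
    payload = data[2 : 2 + length]   # description beyond length is invalid
    entries = []
    for i in range(0, len(payload) - 1, 2):
        b1, b2 = payload[i], payload[i + 1]
        entries.append({
            "content_nibble_level_1": b1 >> 4,
            "content_nibble_level_2": b1 & 0x0F,
            "first_user_nibble": b2 >> 4,
            "second_user_nibble": b2 & 0x0F,
        })
    return entries

# tag 0x54, length 2, genre-1 = 0xE (program characteristics), user nibbles 0x3/0x1
entries = parse_content_descriptor(bytes([0x54, 0x02, 0xE0, 0x31]))
assert entries[0]["content_nibble_level_1"] == 0xE
assert entries[0]["first_user_nibble"] == 0x3
assert entries[0]["second_user_nibble"] == 0x1
```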
  • with this structure, the broadcasting station can transmit the genre information of a target event (the program) to the receiver by using the combination of the value of “content_nibble_level_1” and the value of “content_nibble_level_2”, in the case where it does not set “content_nibble_level_1” to “0xE”.
  • for example, the large group of the program genre is defined as “news/reporting” when the value of “content_nibble_level_1” is “0x0”; further defined as “weather” when the value of “content_nibble_level_1” is “0x0” and the value of “content_nibble_level_2” is “0x1”, and as “special program/document” when the value of “content_nibble_level_1” is “0x0” and the value of “content_nibble_level_2” is “0x2”; the large group of the program genre is defined as “sports” when the value of “content_nibble_level_1” is “0x1”, and further defined as “baseball” when the value of “content_nibble_level_1” is “0x1” and the value of “content_nibble_level_2” is “0x1”,
  • for the receiver, it is possible to determine the large group of the program genre, e.g., “news/reporting” or “sports”, depending on the value of “content_nibble_level_1”, and, upon the basis of the combination of the value of “content_nibble_level_1” and the value of “content_nibble_level_2”, to determine the middle group of the program genre, i.e., program genres at a level lower than the large group such as “news/reporting” or “sports”, etc.
  • in the memory portion equipped in the receiver, genre code table information showing the corresponding relationship between the combinations of the values of “content_nibble_level_1” and “content_nibble_level_2” and the program genres may be memorized in advance.
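Such a memorized genre code table can be sketched as a simple in-memory mapping; only the example code values named above are filled in, and the table/function names are hypothetical.

```python
# Fragment of a genre code table keyed by
# (content_nibble_level_1, content_nibble_level_2).
LARGE_GENRE = {0x0: "news/reporting", 0x1: "sports"}
MIDDLE_GENRE = {
    (0x0, 0x1): "weather",
    (0x0, 0x2): "special program/document",
    (0x1, 0x1): "baseball",
}

def lookup_genre(level_1, level_2):
    """Resolve (large group, middle group) from the two nibble values;
    unknown combinations yield None for the middle group."""
    large = LARGE_GENRE.get(level_1, "unknown")
    middle = MIDDLE_GENRE.get((level_1, level_2))
    return large, middle

assert lookup_genre(0x0, 0x1) == ("news/reporting", "weather")
assert lookup_genre(0x1, 0x1) == ("sports", "baseball")
```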
  • when transmitting the program characteristic information of the target event (the program), the broadcasting station transmits the content descriptor with the value of “content_nibble_level_1” set to “0xE”.
  • with this, the receiver can determine that the information transmitted by that descriptor is not the genre information of the target event (the program) but the program characteristic information of the target event (the program). It is also thereby possible to determine that the “first user_nibble” bits and the “second user_nibble” bits described in the content descriptor indicate the program characteristic information by the combination thereof.
  • for example, the program characteristic information of the target event is defined as “program characteristic information relating to 3D program” when the value of the “first user_nibble” bits is “0x3”;
  • the program characteristics are defined as “no 3D video is included in target event (program)” when the value of the “first user_nibble” bits is “0x3” and the value of the “second user_nibble” bits is “0x0”;
  • the program characteristics are defined as “video of target event (program) is 3D video” when the value of the “first user_nibble” bits is “0x3” and the value of the “second user_nibble” bits is “0x1”;
  • the program characteristics are defined as “3D video and 2D video are included in target event (program)” when the value of the “first user_nibble” bits is “0x3” and the value of the “second user_nibble” bits is
  • for the receiver, it is possible to determine the program characteristics relating to the 3D program of the target event (the program) upon the basis of the combination of the value of the “first user_nibble” bits and the value of the “second user_nibble” bits; therefore, the receiver receiving the EIT including that content descriptor can display an explanation on the electronic program table (EPG) display, in relation to program(s) which will be received in the future or are received at present, that “no 3D video is included” therein, or that said program is a “3D video program”, or that “3D video and 2D video are included” in said program, or alternatively display a diagram indicating that fact.
  • also, the receiver receiving the EIT including that content descriptor is able to make a search for program(s) including no 3D video, program(s) including the 3D video, and program(s) including both the 3D video and the 2D video, etc., and thereby display a list of said program(s), etc.
  • in the memory portion equipped in the receiver, program characteristic code table information showing the corresponding relationship between the combinations of the value of the “first user_nibble” bits and the value of the “second user_nibble” bits and the program characteristics may be memorized in advance.
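A minimal sketch of such a program characteristic code table for the first example definitions above. Note that assigning “0x2” to the “3D video and 2D video are included” case completes a definition that is truncated in the text, so that entry is an assumption; the names are hypothetical.

```python
# Program characteristic code table for first_user_nibble == 0x3
# ("program characteristic information relating to 3D program").
CHARACTERISTICS_3D = {
    0x0: "no 3D video is included in target event (program)",
    0x1: "video of target event (program) is 3D video",
    0x2: "3D video and 2D video are included in target event (program)",  # assumed code
}

def program_characteristic(first_nibble, second_nibble):
    """Resolve the program characteristic; returns None when the first
    nibble does not select the 3D-related characteristic table."""
    if first_nibble != 0x3:
        return None
    return CHARACTERISTICS_3D.get(second_nibble, "reserved")

assert program_characteristic(0x3, 0x1) == "video of target event (program) is 3D video"
assert program_characteristic(0x2, 0x1) is None
```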
  • as another example, the program characteristic information of the target event is determined as “program characteristic information relating to 3D program” when the value of the “first user_nibble” bits is “0x3”; further, the program characteristics are defined as “no 3D video is included in target event (program)” when the value of the “first user_nibble” bits is “0x3” and the value of the “second user_nibble” bits is “0x0”; the program characteristics are defined as “3D video is included in target event (program), and 3D transmission method is Side-by-Side method” when the value of the “first user_nibble” bits is “0x3” and the value of the “second user_nibble” bits is “0x1”; the program characteristics are defined as “3D video is included in target event (program
  • for the receiver, it is possible to determine the program characteristics relating to the 3D program of the target event (the program) upon the basis of the combination of the value of the “first user_nibble” bits and the value of the “second user_nibble” bits, i.e., not only whether the 3D video is included in the target event (the program) or not, but also the 3D transmission method when the 3D video is included therein.
  • with this, the receiver can display an explanation on the electronic program table (EPG) display, in relation to program(s) which will be received in the future or are received at present, that “no 3D video is included”, or that “3D video is included, and can be reproduced in 3D on this receiver”, or that “3D video is included, but cannot be reproduced in 3D on this receiver”, or alternatively display a diagram indicating that fact.
  • herein, the program characteristics when the value of the “first user_nibble” bits is “0x3” and the value of the “second user_nibble” bits is “0x3” are defined as “3D video is included in target event (program), and 3D transmission method is 3D 2-viewpoints separate ES transmission method”; however, values of the “second user_nibble” bits may be prepared for each detailed combination of the streams of the “3D 2-viewpoints separate ES transmission method”, as shown in FIG. 47. By doing so, the receiver can make a further detailed identification.
  • also, the information of the 3D transmission method of the target event may be displayed.
  • also, the receiver receiving the EIT including that content descriptor is able to make a search for program(s) including no 3D video, program(s) including the 3D video and reproducible on the present receiver, and program(s) including the 3D video but not reproducible on the present receiver, etc., and thereby display a list of said program(s), etc.
  • the program search for programs including the 3D video but unable to be reproduced on the present receiver, and/or the program search for each 3D transmission method, are effective, for example, when the program can be reproduced on other 3D video program reproducing equipment which the user has, even if it cannot be reproduced in 3D on the present receiver.
  • in the memory portion equipped in the receiver, program characteristic code table information showing the corresponding relationship between the combinations of the value of the “first user_nibble” bits and the value of the “second user_nibble” bits and the information of the 3D transmission methods with which the receiver is enabled (reproducible in 3D) may be memorized in advance.
  • next, explanation will be given of an example of the operations of the receiving apparatus for additionally recording information of whether the video data to be recorded is 3D or not, when recording the broadcast data into the recording medium 26 by the record/reproduce portion 27, by referring to FIGS. 58A and 58B and the drawings following thereafter.
  • FIGS. 58A and 58B show an internal block diagram of the recording/reproducing portion 27 .
  • in the multiplex/demultiplex portion 29, a partial TS (or a full TS) to be recorded is selected, and the stream to be recorded is inputted into the record/reproduce portion 27.
  • the multiplex/demultiplex portion 29 divides or de-multiplexes the program information, such as the EIT included in the broadcast stream, etc., thereby obtaining it.
  • recorded program information, which may be needed for display of a reproduction list, etc., is collected, and this data is also inputted therein.
  • the stream divided in the multiplex/demultiplex portion is inputted into a demultiplexer 5801 and an information obtain portion 5802 .
  • in the demultiplexer 5801, the information is analyzed down to the ES level, thereby obtaining the 3D/2D identifier written in the User Data area thereof, for example.
  • the information obtained is transmitted to a program information manage portion 5806 through the CPU 21 , to be recorded together with other program information.
  • in the information obtain portion 5802, the information which can be obtained on the TS level is obtained. For example, position information for reading out the data, which is necessary when executing reproduction of the recorded program, in particular the special reproduction, is produced (this may be called “clip information”, and the details thereof will be mentioned later).
  • in an encode portion 5803, the stream data is encoded from a viewpoint of security thereof.
  • An encoding method may be a unique or original one, or may be a method determined in accordance with any regulation.
  • the data completed in encoding is delivered to a file system 5804.
  • in the file system 5804, recording into the recording medium 26 is executed in a unique or original format or a predetermined one.
  • in practice, processing is conducted on the data through a recording medium control portion 5805.
  • the program information manage portion 5806 receives it, and adjusts it into data of a predetermined format.
  • the completed program information is transmitted to the file system 5804, to be written into the recording medium 26.
  • a reference numeral depicts a common bus, which is connected with, continuing from the common but 22 , and it transmits the control signals between the CPU 21 and each block.
  • Timing to execute recording of the program information may be sufficient to be the timing, which the information seems to be fixed or decided, and there can be listed up the examples; for example, 10 seconds after starting the recording, or after all of the recording is completed, etc. Also, the demultiplexer 5801 may be stopped the operation thereof at the timing when the 2D/3D information seems to be decided.
  • FIG. 58B shows a variation of that shown in FIG. 58A, wherein the stream data selected in the multiplex/demultiplex portion 29 is inputted into the demultiplexer 5801.
  • In the information obtain portion 5802, it is possible to obtain the 2D/3D information at the ES level from the demuxed data.
  • A remux portion reconstructs it into the partial TS, and the recording thereafter is executed in a manner similar to that shown in FIG. 58A.
  • A format conversion (also called "trans-code") of the video signal/audio signal or a bit rate conversion (e.g., trans-rate) may be made between the demux and the remux.
  • FIG. 59 shows an example of the structures of a folder and/or a file when it is recorded.
  • information such as, how many numbers of files are recorded, and recording positions and data sizes of management data of those (being called, also “program information file”), for example.
  • program information file Ina menu thumbnail file are recoded, such as, a number of pieces of all menu thumbnail pictures recorded, picture data to be used in the menu screen of displaying a list of the programs recorded and/or recording position information for reading out those picture data, etc.
  • a chapter thumbnail file is recorded a number of pieces of thumbnail pictures recorded for use in a chapter, picture data to be displayed when displaying a list of chapter information, which will be mentioned later, for each recorded program, and/or recording position information for reading out those picture data, etc.
  • FIG. 59 examples of the folder and file configurations, each of which records a corresponding program information file (e.g., program information N), recording position of the video/audio data and a clip information file (e.g., clip information N) for binding a reproduction time, for each recorded program file (e.g., a program N (“N” is a natural number equal to or greater than “1”)) including the video/audio data therein, within a one folder separated respectively.
  • A program 3 file is produced in a directory at or below a stream directory, for example.
  • A program information 3 file is produced in a directory at or below a playlist directory.
  • A clip information 3 file is produced in a directory at or below a clip information directory, for example.
  • In an edition of data, it is enough to record the information of plural recorded program files into one (1) program information file or one (1) clip information file, or to record only a certain part of the recorded program files therein.
  • If the program information 1 presents the combination of information corresponding to the entire program 1 and a part of the program 2, it is possible to show those to the user as one (1) reproduced program.
  • When deleting, the program information file corresponding to that program is deleted, and for all of the recorded program files (e.g., the programs) associated with it, a search is made as to whether each is referred to only by that program information file; a recorded program file is then deleted if it is referred to only by the corresponding program information file, but is not deleted if it is referred to by any other program information file(s).
  • Doing so maintains consistency of the references or associations between the program information files and the recorded program files.
  • When editing, the references therebetween may be confirmed in the similar manner, so as to maintain the consistent condition.
  • The entire information is revised, appropriately, every time a data alteration is made within the same directory (e.g., within the AV directory shown in FIG. 59), so as to maintain the consistent condition.
  • Examples of the data structure or configuration inside the program information file, and of adding the 3D/2D determination information thereto, are shown by referring to FIG. 60.
  • In program-common information is recorded, for example, the information to be held regardless of which program is referred to.
  • "character set" designates, for example, a language code for recording the following information therein.
  • "reproduction protect flag" means, when its value is "1", for example, that there is a restriction such that reproduction cannot be made unless the user inputs a PIN code. When the value is "0", it is assumed that there is no restriction and reproduction can be made.
  • When the "write-in protect flag" is "1" in value, for example, it means that the user cannot make an edition and/or deletion.
  • When the value is "0", the user can make operations of edition/deletion freely.
  • The write-in protect flag may be settable by the user, with provision of an OSD (interface), such as a setup menu, etc.
  • "non-reproduced flag" indicates, if its value is "1", for example, that the program has never yet been reproduced by the user after its recording. If the value is "0", it indicates that reproduction has been made at least once. "edition flag" means, if its value is "1", that no edition process, such as partial deletion and/or division/combining, has ever been made after recording. If the value is "0", it means that some kind of edition was made by the user.
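The 1-bit flags above can be represented as a packed flag byte. The following is an illustrative sketch only; the bit positions, constant names and field names are assumptions for illustration, not a layout taken from the embodiment.

```python
# Hypothetical bit assignments for the 1-bit flags described above; the
# actual layout of the program-common information is not specified here.
REPRODUCTION_PROTECT = 0x01  # "1": reproduction requires a PIN code
WRITE_PROTECT        = 0x02  # "1": the user cannot edit or delete
NON_REPRODUCED       = 0x04  # "1": never reproduced since recording
NO_EDITION           = 0x08  # "1": never edited since recording

def describe_flags(flags: int) -> dict:
    """Decode a packed flag byte into the individual boolean flags."""
    return {
        "reproduction_protected": bool(flags & REPRODUCTION_PROTECT),
        "write_protected":        bool(flags & WRITE_PROTECT),
        "never_reproduced":       bool(flags & NON_REPRODUCED),
        "never_edited":           bool(flags & NO_EDITION),
    }
```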
  • "time zone" indicates the time zone of the land, for example, where this program information is recorded.
  • "recording date/time" indicates the date and time, for example, when this program information is recorded.
  • "reproduction time" indicates, for example, a total value, in units of time, of the recording time of each program to which this program information refers.
  • In "maker ID" is registered, for example, a numerical value presenting the equipment manufacturing company, determined uniquely and separately, when this program information is produced.
  • In "maker model code" is registered, for example, a model number of the equipment, uniquely determined by the equipment manufacturing company, separately, when this program information is produced.
  • In "broadcasting station number" is recorded, for example, the broadcasting station number when the recorded program is data received on the air.
  • In "broadcasting station name" is recorded, for example, the broadcasting station name when the recorded program is data received on the air.
  • In "program name" is reserved, for example, the program title name of the recorded program. The program name may be editable by the user, freely, even after being recorded.
  • In "program details" is reserved, for example, detailed information presenting the content of the recorded program. For example, the cast of the program, which is included in the SI/PSI, or the program content in which a brief explanation of the story is described, may be recorded as it is.
  • Into a reserve area or region of the program-common information is added the 3D/2D identification information. As one (1) bit of information, for example, the value "1" means that there is at least one (1) program including the 3D video signal among the programs relating to the present playlist, and the value "0" means that no 3D video signal is included in the programs relating thereto. In this case, no inquiry is made into the 3D method of the 3D video signal.
  • By recording the information here, it is possible to inform the user whether the 3D video is included or not within a certain program, for example by referring to this information when she/he displays the list of the recorded program(s).
  • “playlist information” is recorded, for example, “play item number”.
  • This “play item” means, for example, one (1) program, to which this program information is related or associated.
  • the play item and the program are related or associated with, one by one (1:1).
  • the “play item number” means a number of program(s), with which this program information is related or associated.
  • “play item information” is held by the number of pieces same to that of the number of the “play item”, and in an inside thereof is recorded the information unique to each program corresponding thereto.
  • “clip information fine name” is held a name of the clip information file corresponding thereto.
  • codec identification information indicates an encoding method of the video/audio data of the program corresponding thereto. It is, for example, MPEG 2 or H.264, if being the video, or if being the audio, the audio encoding method, such as, MPEG 2-AAC or MP3, or MPEG 1-Layer 2, etc.
  • start time indicates the starting position of the program corresponding, to which the position indicated by this “play item information” corresponds.
  • a value to be designated may be recorded by time information, such as, PTS/DTS, etc.
  • end time indicates the ending portion of the program, to which the position indicated by this “play item information” corresponds.
  • a value to be designated thereto may be recorded by time information, such as PTS/DTS, etc.
  • the 3D/2D identification information may be added therein, or may be recorded in both. For example, if being the information of 1 bit and having “1” in the value thereof, it indicates that the 3D video signal is included in that program, but if the value is “0”, it is indicated that no 3D video signal is included in that program. Recording it herein enables a management by a detailed unit much more. However, for the purpose of adding the 3D/2D identification information, it may be added into a vacant area or region for registering the information, within an area or region where the information for indicating the characteristics of the video signals, such as, codec, etc., are integrated.
  • In "mark information" is recorded the number of pieces of "mark information" indicating how many chapters (may be called "chapter marks" or simply "marks") are set up in the playlist corresponding to this program information.
  • In each piece of "mark information" is recorded one item of time information for uniquely identifying where that mark exists within the program files, for example combining the PTS and/or the number of times of renewal of the STC, etc., therewith. For example, when a chapter is added to the corresponding program through the user's operation or through the function of registering chapters automatically, the "mark information" is increased by one (1).
  • "mark invalid flag" means, when its value is "1", that said chapter is invalid, for example, it is not displayed even to the user. If the value is "0", it means that said chapter is valid, and that the display thereof is also made.
  • "mark type" means the type of the mark. It may be a mark indicating a resume position or a mark indicating a chapter position, etc.
  • Since a manufacturer of the receiving apparatus 4 is also able to set the type arbitrarily, it is possible to make an arbitrary classification, such as a mark indicating a chapter position at the separation between the main program and a CM, or a mark indicating a chapter position which is set up through the user operation, for example.
  • In "maker ID" is registered a numerical value presenting the equipment manufacturing company, determined uniquely and separately, when producing that mark.
  • In "play item ID" is held the ID of the play item information corresponding thereto.
  • "mark time" indicates a position on the program corresponding to this "mark information".
  • "entity ES PID" indicates to which ES, among those of the corresponding program, the mark information is directed.
  • "thumbnail reference", though not shown in the present figure, stores therein an ID for identifying the file, when a file of thumbnail still picture(s) located at the position of each mark is held.
  • "mark name" holds therein information when a name is given to the corresponding mark.
  • Plural pieces of mark information may be recorded therein, in the similar manner, in the case where there are plural marks, which are produced through the user operation, or produced automatically by a program within the recording/reproducing device.
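One entry of the "mark information" described above can be sketched roughly as follows; the class and field names mirror the description but are assumptions, not a normative record layout.

```python
class MarkInformation:
    """Illustrative sketch of one "mark information" entry; field names
    follow the description above but are assumptions."""
    def __init__(self, mark_type, mark_time, play_item_id,
                 invalid=False, name=""):
        self.mark_type = mark_type        # e.g. "resume", "chapter", maker-defined
        self.mark_time = mark_time        # position on the program (e.g. a PTS)
        self.play_item_id = play_item_id  # ID of the corresponding play item
        self.mark_invalid = invalid       # True ("1"): hidden from the user
        self.mark_name = name             # optional name given to the mark

def visible_marks(marks):
    """Only marks whose mark invalid flag is "0" are shown to the user."""
    return [m for m in marks if not m.mark_invalid]
```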
  • The demultiplexer 5801 makes it possible to grasp the switching between 2D/3D by the unit of a frame. Using this, if the receiving apparatus stores the position of the 2D/3D switch as the mark information, it is possible to manage the switch position more strictly; therefore an advantage can be obtained in that, for example, the timing of displaying a message suggesting that the user wear the glasses when reproducing, as was mentioned above, can be controlled more accurately.
  • In that case, the mark type may be held as a type (class) indicating the switch between 2D/3D.
  • The clip information file to be recorded in the clip information directory is a file corresponding to the program file, for example, one by one (1:1), and the data position information of the corresponding program file is recorded therein.
  • Concretely, the information obtain portion 5802 detects the packet in which the top data of each "I" picture of the stream to be recorded is included, together with its time information (for example, the PTS), and the file system 5804 records them in the clip information file, as data in a format enabling the top data position of each "I" picture to be determined, for example combining the number of the packet from the top of the program with the time information.
  • Recording this information enables control such as reading out and displaying only the data of the necessary "I" pictures, rather than all of the data, when starting reproduction from the position of an arbitrary "I" picture, or during fast-forward/rewind reproduction, etc.
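The clip information just described, pairing the packet number of each "I" picture top with its time information, behaves like a seek index. A minimal sketch, assuming PTS values as plain integers; the class and method names are our own invention:

```python
import bisect

class ClipInformation:
    """Seek-index sketch: each entry pairs the packet number of an "I"
    picture top (counted from the top of the program) with its PTS."""
    def __init__(self):
        self._pts = []      # sorted PTS values of "I" picture tops
        self._packet = []   # packet number of each "I" picture top

    def add_i_picture(self, packet_no: int, pts: int) -> None:
        # Entries are appended in stream order, so PTS stays sorted.
        self._pts.append(pts)
        self._packet.append(packet_no)

    def packet_for_time(self, pts: int) -> int:
        """Packet number of the last "I" picture at or before `pts`, so
        reproduction can start by reading only from that position."""
        i = bisect.bisect_right(self._pts, pts) - 1
        return self._packet[max(i, 0)]
```

Fast-forward/rewind reproduction would then repeatedly call `packet_for_time` at increasing (or decreasing) times and read out only the "I" pictures found there.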
  • In the present embodiment, the recorded program file including the video/audio data therein, the program information file, and the clip information file are provided within separate folders, but as a variation thereof, they may be recorded in a same folder, or the program information file may be attached to the recorded program file in the form of a stream directory.
  • Also, the data configuration within the program information file is not limited to the example shown in FIG. 60; it is enough that all of the files can be recorded in a certain format and the information can be read out therefrom regularly when reproducing, whereby a similar effect can be obtained.
  • FIG. 61 shows an example of a sequence for determining whether or not the program to be recorded is a program including a 3D display video of the SBS method, for example.
  • The multiplex/demultiplex portion 29 obtains, from the stream to be recorded, an identifier for determining 3D/2D on such PSI or SI information as shown in FIG. 7, 10A or 56, transmits it as data to the CPU 21, and then advances the process to S6102.
  • The CPU 21 determines whether or not there is information indicating 3D/2D in the information obtained. As a result of the determination, if the identifier is present (i.e., "Yes"), the process is shifted to S6103; on the other hand, if not (i.e., "No"), to S6104.
  • The CPU 21 determines whether or not the information obtained indicates a 3D program; if it is a 3D program (i.e., "Yes"), the process advances to S6104, while if it indicates a 2D program (i.e., "No"), to S6105.
  • FIG. 62 shows an example of another sequence for determining whether or not the program is a 3D program.
  • The demultiplexer 5801 obtains the 3D identification information (for example, information in the User Data of MPEG2 video, or the Frame Packing Arrangement SEI information of H.264 video) existing within the video ES of the stream to be recorded, transmits it to the CPU 21, and advances the process to S6202.
  • The CPU 21 determines whether or not the information obtained indicates a 3D program; if it indicates a 3D program (i.e., "Yes"), the process advances to S6104, or if it indicates a 2D program (i.e., "No"), to S6105.
  • An instruction is given from the CPU 21 to the program information management portion 5806, so that "3D" is described in the program information in accordance with such a regulated format as shown in FIG. 60, for example, and the process is completed.
  • An instruction is given from the CPU 21 to the program information management portion 5806, so that "2D" is described into the program information in accordance with such a regulated format as shown in FIG. 60, for example, and the process is completed.
  • FIG. 63 shows an example of a sequence for determining whether or not the program is a 3D program with use of both pieces of that information.
  • The multiplex/demultiplex portion 29 obtains, from the stream to be recorded, an identifier for determining 3D/2D on such PSI or SI information as shown in FIG. 7 or 10A, transmits it as data to the CPU 21, and then advances the process to S6201.
  • The demultiplexer 5801 obtains such information as is described in the "user_nibble" shown in FIG. 56, transmits it to the CPU 21, and advances the process to S6301.
  • The CPU 21 determines whether or not the stream to be recorded is 3D from the two (2) pieces of information obtained; if both pieces of information indicate 3D, the process advances to S6104, while on the other hand if either one indicates 2D, to S6105.
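The combined determination of FIG. 63 reduces to a simple conjunction: "3D" is recorded only when both the PSI/SI identifier and the ES-level identifier indicate 3D. A sketch, assuming the two boolean inputs have already been extracted from the stream (function and parameter names are ours):

```python
def determine_record_label(psi_si_indicates_3d: bool,
                           es_indicates_3d: bool) -> str:
    """FIG. 63 sequence sketched: "3D" is described in the program
    information only when both pieces of information indicate 3D;
    if either one indicates 2D, "2D" is described instead."""
    return "3D" if (psi_si_indicates_3d and es_indicates_3d) else "2D"
```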
  • The timings for executing the processes shown in FIGS. 61 to 63 are arbitrary; for example, in a case where recording is programmed from the top of a certain program, the receiving apparatus 4 may record the program information attached thereto at the timing when the establishment of recording (i.e., the data being reserved into the recording medium 26) is settled, or may record the value which is obtained after a certain time period has elapsed since the recording started. If it is a system recording each of the broadcast programs into one (1) recording file, the program information may be recorded after a certain time period has elapsed (for example, 10 seconds later) from detection of the change of the broadcast program, etc.
  • Also, the time period from when it is determined that the program to be recorded is 3D up to when the CPU 21 actually records the program information into the recording medium 26 may have an arbitrary time difference; for example, while obtaining and analyzing the information after a certain time period has elapsed from when the recording starts, the program information may be recorded into the recording medium 26 at the timing of completion of the recording.
  • FIG. 64A shows an example of adding method identification information of the 3D broadcasting, further to the example of the data configuration inside the program information file shown in FIG. 60.
  • "3D/2D identification information" and "3D method identification information" are added where there is a sufficient "reserve area" within the program-common information.
  • The bit-strings of the "3D method identification information" and the meanings presented by them are shown in FIG. 64B. If the bit value is "0x00", the corresponding program includes the 3D video signal of the "Side-by-Side" method. If the bit value is "0x01", the corresponding program includes the 3D video signal of the "Top-and-Bottom" method. If the bit value is "0x02", the corresponding program includes the 3D video signal of the frame sequential method. "0x03" is reserved for future extension.
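The FIG. 64B bit values can be represented as a simple lookup; the dictionary below restates the table, while the names of the mapping and function are our own.

```python
# Bit values of the "3D method identification information" per FIG. 64B.
METHOD_3D = {
    0x00: "Side-by-Side",
    0x01: "Top-and-Bottom",
    0x02: "frame sequential",
}

def method_name(bits: int) -> str:
    """Values from 0x03 upward are reserved for future extension."""
    return METHOD_3D.get(bits, "reserved")
```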
  • With this, the receiving apparatus 4 is able to display the information when recording plural 3D broadcasts differing in their methods upon receipt thereof, and is also able to reproduce them through an appropriate selection of the 3D display method when reproducing.
  • The sequence for obtaining the information when recording is similar to that of the example shown in FIGS. 61 to 63; i.e., it is enough to record the 3D method identification information as well when recording the program information into the recording medium 26.
  • Also, the "3D/2D identification information" and the "3D method identification information" may be recorded in the "reserve area" within the play item information. With this, it is also possible to provide the user the 3D method by a more detailed unit thereof.
  • Or, the "3D/2D identification information" may be recorded within the program-common information, while the "3D method identification information" is recorded within the play item information. By doing so, it is possible to inform the user whether or not the 3D video signal is contained, only by her/him referring to the program-common information.
  • Alternatively, the bit values may be combined and defined as follows: if the bit value of this information is "0x00", the corresponding program does not contain the 3D video signal therein; if it is "0x01", the corresponding program contains the 3D video signal of the "Side-by-Side" method; if the bit value is "0x02", of the "Top-and-Bottom" method; and if the bit value is "0x03", of the frame sequential method. Even in the case of using 3D display methods other than the method(s) mentioned in the present embodiment, if they are defined one by one, with sufficient bits assigned thereto, it is possible to obtain a similar effect to that of the embodiment mentioned above.
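The alternative combined encoding above folds the 3D presence and the 3D method into one field; a sketch of that variant (the mapping and function names are assumptions):

```python
# Combined 3D presence/method encoding described above.
COMBINED_3D = {
    0x00: None,               # no 3D video signal contained
    0x01: "Side-by-Side",
    0x02: "Top-and-Bottom",
    0x03: "frame sequential",
}

def contains_3d(bits: int) -> bool:
    """True when the field indicates any 3D method at all."""
    return COMBINED_3D.get(bits) is not None
```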
  • FIG. 65 shows an example of a screen display when displaying a list of the program data, which is recorded in the recording medium 26 .
  • A reference numeral 6501 depicts the display screen.
  • 6502 depicts an area or region where a tab is displayed, indicating the recorded data of which recording medium is displayed at present, for example when the receiving apparatus is connectable with plural recording media.
  • 6503 depicts an area or region where a classification tab is displayed, which is selected by the user when she/he wishes to display the programs classified.
  • In the display area 6504 are then displayed only the corresponding program(s), e.g., those which have never yet been reproduced after being recorded.
  • The tabs displayed on this hierarchy may be displayed spanning plural hierarchies. For example, when selecting "genre", it is possible to align genre tabs, such as "drama", "news" and "animation", by providing one more tab display area in the area 6503.
  • The area 6504 is an area for displaying the list of data of the recorded program(s) fitting the classification selected in 6503.
  • The user makes an operation, such as selecting a program which she/he wishes to reproduce, from among the program(s) displayed here.
  • a scroll bar may be displayed in the horizontal direction.
  • The information to be displayed within one item of recorded program data is arbitrary, but there are displayed, for example, a thumbnail picture, the date and time when the recording was made, the program name, the recording mode when recording, etc.
  • the thumbnail picture may be a still picture or a moving picture.
  • A reference numeral 6505 depicts a guide display area for displaying to the user a brief explanation of operations and/or the contents of operations available on the remote controller.
  • FIG. 65A shows an example of displaying a mark (which may also be called an "icon") on a 3D program, for indicating that fact, by referring to the 3D determination information of the program information, when displaying the list of all the programs recorded in the recording medium 26.
  • In this example, two of the programs, including a program D, are the 3D programs, while the others are 2D programs, and therefore no mark is displayed on them.
  • FIG. 65B shows an example when "3D" is selected by a classification or type tab and a list of the 3D programs recorded in the recording medium 26 is displayed.
  • Such a display applying the classification can also be made by recording the 3D determination information as the program information when recording. This can be achieved by means of the CPU 21 referring to the 3D determination information of the program information for all the recorded programs, and extracting only the 3D programs for display.
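The classified display just described is, in effect, a filter over the recorded program list using the stored 3D determination information. A sketch, with each record represented as a dictionary and the field name `is_3d` assumed for illustration:

```python
def programs_for_3d_tab(recorded_programs):
    """Return only the programs whose 3D determination information
    indicates 3D, as the CPU 21 would for the "3D" classification tab."""
    return [p for p in recorded_programs if p.get("is_3d")]
```

For example, given `[{"name": "program 1", "is_3d": True}, {"name": "program 2", "is_3d": False}]`, only "program 1" remains for display in the list area 6504.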
  • With this, the user can easily find the 3D program(s), even from among a large number of recorded programs.
  • In FIG. 65B, the mark indicating a 3D program is displayed for each program in the list display area 6504; however, since it is apparent from the type tab area 6503 that the 3D program(s) is/are displayed, this mark may be omitted.
  • the information shown in the display area 6504 should not be limited to the content as shown in FIG. 65 .
  • The order of displaying each piece of information, such as the program names and/or the program recording times, etc., may be different from this example; also, as the information to be displayed, information derived from the program information other than in this example may be displayed, for example a mark indicating the not-yet-viewed/listened condition and/or a mark indicating that a chapter is registered, etc.
  • FIG. 66 shows an example of a list display differing in the method of displaying the programs in the list display area 6504 for each program.
  • Since the display content for each program is simplified, a large number of the programs can be displayed at one time. In this display example, too, the user can distinguish the 3D program(s) easily if the mark indicating a 3D program is displayed together therewith.
  • Further, the 3D method may be displayed when displaying the list.
  • With marks capturing the characteristics, such as "Side-by-Side", "Top-and-Bottom" and "frame sequential", there can be obtained an effect that the user can easily understand which 3D method it is.
  • The position and/or the content of the indication does not matter, as far as it can be seen that it is the 3D program; it may be a character string, such as "[3D]" or "(3D)", at the top of the program name, or may be a character string, such as "being broadcasted as 3D program", at the end of the program name, for example.
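Tagging the program name as described can be sketched as a one-line transformation; the "[3D]" prefix is one of the example strings above, while the function name is ours.

```python
def tag_program_name(name: str, is_3d: bool) -> str:
    """Prepend "[3D]" to the program name of a 3D program so that the 3D
    programs can be told apart from the name alone, even on equipment
    that cannot read the 3D determination information."""
    return "[3D]" + name if is_3d else name
```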
  • FIGS. 68A and 68B show examples of a list screen of the recorded programs, which are displayed when executing the control in such a manner. Because the character strings are added at the top of the program names of a program 1 and a program 4, which are the 3D programs, there can be obtained an effect that it can be determined at a glance, only from the program names, which of the programs are the 3D broadcasts. In this instance, such a mark indicating the 3D program as shown in FIG. 65 may be displayed together therewith. Even in a case where the display is made in a list display format, the similar effect can be obtained by making the similar display.
  • If the receiving apparatus has a detachable recording medium, and the detached recording medium can be used connected with other equipment, then with this control there can be obtained an effect that the user can determine which of the programs is 3D, when the information of the recorded programs is displayed, by reading the program names, even if the equipment connected with cannot read out the 3D determination information of the program information.
  • An example of the display of the recorded program list at this time is shown in FIG. 68B.
  • In a display area of the recording medium information 6502 and a display area of the program list 6504, similarly to FIG. 68A, "[3D]" is shown for the program 1 and the program 4.
  • With this, the user can determine the 3D program(s) on the list easily, even when she/he is using equipment which cannot read out the 3D determination information.
  • The thumbnail pictures of the list display format shown in FIGS. 65A and 65B are small in their sizes and are still pictures; therefore they may be produced/displayed in 2D even if they are the thumbnails of the 3D programs.
  • For example, the program having the video signal of the "Side-by-Side" data is read out from the recording medium 26, the video on the left-hand side is decoded in the video decode portion 30, and a picture expanded by 2 times in the horizontal direction is captured, so as to be converted into a still picture of JPEG format, etc., thereby to be reserved in the corresponding portion of the recording medium 26.
  • That fact is recorded in the data of the previous program information, thereby renewing it.
  • Reproduction content, which can be obtained by connecting or linking all or a part of plural recorded programs, is presented as a playlist, and the list display exemplarily shown in FIGS. 65A and 65B is exchanged for the list display of the playlists through the user pushing down a blue button of the remote controller device.
  • FIG. 69 shows an example for displaying a content list of the play lists.
  • A reference numeral 6504 is a list display area of the playlist(s) already produced. Confirming the program information of the respective programs included in each playlist, if at least one of the values of the 3D/2D identification information has the value meaning that 3D is included therein, the mark indicating 3D is displayed together with that playlist, at the display point thereof.
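The playlist-level mark decision described above is an any-over-items check; a sketch, with each play item carrying an assumed `contains_3d` field taken from its program's 3D/2D identification information:

```python
def playlist_shows_3d_mark(play_items) -> bool:
    """The 3D mark is displayed for a playlist when at least one of the
    programs it references has 3D/2D identification information
    indicating that 3D is included."""
    return any(item.get("contains_3d", False) for item in play_items)
```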
  • When the user reproduces or edits a playlist, a display similar to that shown in FIG. 65B or 66 is executed. Also, if the playlist is displayed upon the basis of the 3D/2D identification information when she/he edits or newly produces one, there can be brought about a merit for the user, in that she/he can confirm into which position the 3D program enters.
  • FIG. 70 shows an example of a screen display for producing the playlist, by extracting or cutting out a program or a scene from the recorded programs which are memorized in the recording medium 26.
  • A reference numeral 7001 depicts a display area for the program data which should be a source for producing the playlist, wherein a thumbnail picture is displayed by the unit of a program or the unit of a chapter of each program.
  • A reference numeral 7002 depicts an area in which the contents of the playlist under the condition of production are displayed as a line of thumbnail pictures of the respective scenes. For example, the user selects a desired scene in the area 7001, through the left and right keys of the remote controller, etc., and next, in the area 7002, she/he inserts that scene at the position of her/his desire, thereby producing the playlist.
  • Also here, the mark may be displayed by analyzing the video data of the program or the scene, or the program information. With this, the user can produce the playlist while confirming into which position the 3D program is inserted.
  • In this case, a video/audio output device 3601 (i.e., the receiving apparatus 4) holds the recording medium therein.
  • If the information indicating whether or not each recorded program includes the 3D video is held by adding only 1 bit of information within the program information, for example, it is possible to display on the display 3603, via the transmission path 3602, such a list screen as shown in FIG. 65, enabling the user to easily identify the 3D program thereon.
  • the video/audio output device 3601 i.e., the receiving apparatus 4 .
  • the display 3603 may receive the recorded program data including the 3D video therein, and may decode it into the original video signal so as to determine the 3D method type.
  • If the video/audio output device 3601 further holds therein the 3D method identification information as the program information, with plural bits assigned thereto as shown in FIG. 64 , it is possible to display on the display 3603 such a list screen that the user can easily identify even the 3D method thereof, as shown in FIG. 67 .
  • Regarding the 3D method type information when viewing/listening to the recorded program data including the 3D video data on the present system configuration, it is possible to notify the 3D method type information to the display 3603 , by converting the 3D method identification information held in the program information into a predetermined metadata format, if the 3D metadata can be transmitted on the transmission path 3602 (for example, the format regulated according to the HDMI regulation).
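As a hedged illustration of converting the internally held 3D method identification into metadata for the transmission path, the sketch below maps illustrative internal method names onto the 3D_Structure field values of an HDMI Vendor Specific InfoFrame; only the field values follow the HDMI 1.4a specification, while the internal names and the function itself are assumptions.

```python
# Illustrative mapping from internal 3D method identification values
# (the string names are our assumption) to the 3D_Structure field of
# an HDMI Vendor Specific InfoFrame; the field values follow HDMI 1.4a.

HDMI_3D_STRUCTURE = {
    "frame_packing": 0b0000,      # Frame Packing
    "top_and_bottom": 0b0110,     # Top-and-Bottom
    "side_by_side_half": 0b1000,  # Side-by-Side (Half)
}

def to_hdmi_3d_structure(method_id):
    """Return the field value to signal, or None when the method
    cannot be conveyed as metadata on this transmission path."""
    return HDMI_3D_STRUCTURE.get(method_id)
```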
  • FIG. 71 shows an example of connecting the receiving apparatus 4 with a reproducing apparatus 7102 and a recording/reproducing apparatus 7103 .
  • a connect control device 7101 is equipment having a routing function, such as a modem or the like, for example, and transmits the data inputted from the network 3 , including the public network, to an appropriate one from among the devices or apparatuses connected with it. Or, it controls the data transmission process between the equipments connected to the connect control device 7101 .
  • the reproducing apparatus 7102 is an apparatus, which can obtain/display the list information of the programs, which is recorded in the recording medium 26 within the receiving apparatus 4 , via the network, or receive the data of the desired program via the network, to be viewed/listened.
  • As for a detailed method for controlling the content through the network, it may be based on a standardized specification, such as DLNA (Digital Living Network Alliance), for example.
  • the recording/reproducing apparatus 7103 has a recording medium, and is able to obtain/display the list information of the programs within the receiving apparatus 4 , similar to the reproducing apparatus 7102 , to receive the data of the desired program via the network to be viewed/listened to, and further to copy the data of an arbitrary program from the receiving apparatus 4 into its recording medium so as to utilize it therein.
  • the CPU 21 converts the program information into a format appropriate for network transmission, and outputs it to the network 3 from the network I/F 25 .
  • the recording/reproducing portion 27 reads out the data of the program recorded in the recording medium 26 , the CPU 21 applies an encryption process appropriate for transmission on the network, and the data is outputted from the network I/F 25 .
  • For the receiving apparatus 4 , it is also possible to obtain/display the list information of the programs recorded in the recording medium within the recording/reproducing apparatus 7103 , to receive the data of the desired program over the network to be viewed/listened to, and to copy the data of an arbitrary program into the recording medium 26 so as to utilize it.
  • FIG. 72 shows an example of displaying the list of the programs, which are recorded in the recording medium 26 , on the reproducing apparatus 7102 .
  • a reference numeral 7201 depicts an area for displaying a list of the apparatus(es) connected through the network. The user can call up the program information from a desired apparatus by operating the up/down keys, etc., of the remote controller.
  • a reference numeral 7202 depicts a select button for shifting to the hierarchy higher by one (1).
  • 7203 depicts an area for displaying a list of the programs recorded within the folder selected in the selected apparatus.
  • In addition to the recording date/time and the recording time-length, in FIG. 72A there is displayed an icon indicating that the program is a 3D program.
  • FIG. 72B shows an example of displaying the list of programs, which are recorded in the recording medium 26 , on the reproducing apparatus 7102 , when the receiving apparatus 4 records the character string, such as, “[3D]” by revising the program name when recording, as was explained by referring to FIG. 68 .
  • Even if the reproducing apparatus 7102 does not hold the icon data for use of the 3D program therein, the user can identify which one is the 3D program.
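The name-revision fallback of FIG. 68 / FIG. 72B might be sketched as below; the function name and the exact tag handling (prefixing once and leaving already-tagged names alone) are our assumptions for illustration.

```python
# Sketch of the name-revision fallback: when the connected apparatus
# cannot render a 3D icon, the receiver revises the recorded program
# name itself so any plain list display conveys 3D-ness.
# The function name and tag handling are assumed.

def revised_program_name(name, is_3d, tag="[3D]"):
    """Prefix the tag once for 3D programs; leave others unchanged."""
    if is_3d and not name.startswith(tag):
        return tag + name
    return name
```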
  • a reference numeral 7204 depicts an area for displaying an operation guide for the user, similar to 6505 . This may differ from 6505 in the setup content thereof; for example, no function is assigned to the “red” and “blue” buttons, while to the “yellow” button is assigned a function of turning the display screen back to the previous condition by one (1) step, etc.
  • Regarding connection of the apparatuses as the configuration for accomplishing the embodiment mentioned above, it can also be achieved by connecting the reproducing apparatus 7102 and the receiving apparatus 4 directly. Or, there may be connected plural sets of the reproducing apparatuses and/or the recording/reproducing apparatuses, or the receiving apparatuses, more than those shown in FIG. 71 , or the connect control devices may be combined in multiple stages. Even with such a configuration different in the connection, it is possible to look at the recorded program(s) of the receiving apparatus 4 from the reproducing equipment(s) connected in a manner similar to that of the reproducing apparatus 7102 . And it is also possible to look at the recorded program(s) of other recording devices or apparatuses from the receiving apparatus 4 .
  • the receiving apparatus 4 may recognize the characteristics of the reproducing apparatus 7102 in advance, so as not to transmit the information of the 3D content thereto.
  • In FIG. 73 is shown a sequence of this process.
  • the reproducing apparatus 7102 inquires about the list of contents to the receiving apparatus 4 via the network.
  • the list of the recorded program(s) is transmitted, including the information of 3D program therein, and then the process is completed.
  • the list of the recorded program(s) is transmitted, but not including the information of 3D program therein, and then the process is completed.
  • the usability for the user may be improved by transmitting the recorded program information including the 3D program therein, so that the user can select the 3D program on the reproducing apparatus 7102 , the receiving apparatus 4 converts it from 3D into 2D, and the video data is outputted therefrom when reproducing.
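The FIG. 73 branching, namely including the 3D attribute only for clients that report 3D display capability while still listing the programs so a 2D-only client can have them down-converted on playback, could look roughly like this; the entry fields are hypothetical.

```python
# Hedged sketch of the FIG. 73 sequence: on a content-list request the
# receiver includes the 3D attribute only when the requesting device
# reports 3D display capability; 3D programs stay listed either way,
# since the receiver can down-convert to 2D on playback.
# The entry fields are hypothetical.

def build_content_list(recorded, client_supports_3d):
    """Build the program-list entries transmitted to the client."""
    entries = []
    for prog in recorded:
        entry = {"name": prog["name"]}
        if client_supports_3d and prog.get("is_3d"):
            entry["is_3d"] = True
        entries.append(entry)
    return entries
```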
  • the CPU 21 of the receiving apparatus 4 converts the program data and/or the recorded program data into the transmission signals of a format suited to the format regulated on the network (for example, the format regulated by DLNA), and then transmits the data in accordance with the method adapted to the sequence, which is regulated on the network transmission path.
  • an equipment authentication is executed, for confirming on whether copying/moving can be made or not, without any trouble, between the equipment at the origin of transmission and the equipment at the destination of transmission; when the authentication is established, the data at the origin of transmission is invalidated, so that it cannot be reproduced, if it is movement of the data;
  • the program information is transmitted in a predetermined format (for example, XML format);
  • the recorded program data is transmitted in a predetermined format (for example, MPEG-TS format, on which DTCP-IP encryption is treated); and if the transmission is completed up to the end, the session is closed, for example.
  • the recording/reproducing apparatus 7103 converts the data transmitted, appropriately, into a data format and/or an encryption method, which is/are regulated for recording it within the apparatus, and then records the data into the recording medium.
  • If the 3D/2D identification information is regulated both by a data format on the transmission path and by a data format applied when being recorded in the recording/reproducing apparatus 7103 , then as a result of executing the copying/moving of the data, equivalent information can be held as the program information of the recording/reproducing apparatus 7103 , and therefore the usability for the user is improved.
  • the 3D method identification information may be transmitted in the similar manner.
  • As a scene of using the 3D/2D determination information, there are cases of copying and moving the data between the recording media (called “dubbing”, collectively). This is the case where plural recording media can be connected to the receiving apparatus 4 through the recording/reproducing portion 27 , or where dubbing can be made from one (1) recording medium to another recording medium.
  • FIG. 74A shows a display screen when she/he selects the data of a dubbing target from among the recorded program(s) recorded in the recording medium 26 .
  • In a program list display area 6504 is displayed, as the information relating to the dubbing, whether each program can be copied or not, or an allowable number of times of copying. For example, the following indications are made: the program indicated as “copy inhibited” can be neither copied nor moved; the program indicated as “move” can be moved (but the original data is deleted when the data is moved to other equipment); and the program indicated as “copy N times” can be copied (N−1) times (i.e., the last time is “move”), for example.
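The copy-control states listed above can be summarized in a small sketch; the state strings and the return shape are illustrative, but the rule that "copy N times" permits (N−1) copies with the final operation being a move follows the text.

```python
# Illustrative summary of the copy-control states shown in the list:
# "copy inhibited" allows neither copy nor move, "move" allows only a
# move, and "copy N times" allows (N - 1) copies with the last
# permitted operation being a move. State strings are assumptions.

def dubbing_actions(state, n=0):
    """Return the operations allowed for a program's copy-control state."""
    if state == "copy_inhibited":
        return {"copies": 0, "move": False}
    if state == "move":
        return {"copies": 0, "move": True}
    if state == "copy_n_times":
        return {"copies": max(n - 1, 0), "move": True}
    raise ValueError("unknown copy-control state: " + state)
```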
  • Upon executing the dubbing, there are cases where the folder and/or folder configuration differ(s), depending on a kind thereof, between the recording medium at the origin of movement and the recording medium at the destination of movement. Accordingly, the CPU 21 grasps the data formats of the origin of movement and the destination of movement in advance, converts the information to be recorded (i.e., the program information, the thumbnail file, the recorded video/audio data) into the format of the destination of transmission, appropriately, and thereby executes the dubbing. However, it is not necessary to move all the information.
  • the CPU 21 cannot record the 3D/2D identification information to the destination of movement.
  • Where the data of the thumbnail picture is large, and where it is not defined as an essential matter to be recorded according to the regulation that the recording medium at the destination of movement follows, there can also be made such an implementation that no dubbing is made of the thumbnail pictures, taking into consideration the time-period necessary for completing a dubbing operation.
  • FIG. 74B shows an example of displaying a list of the program(s) recorded within the optical disc when executing the dubbing from the recording medium 26 to the optical disc after treating a process for changing the program name, in particular after completing the dubbing.
  • the icon display may be made in FIG. 74B . If the 3D method identification information can be taken over in the similar manner when making the dubbing, then that information may also be displayed in FIG. 74B .
  • reproduction of the selected program is started from among the program data reserved in the recording medium 26 .
  • a function for enabling the user to adjust a reproduction speed, a reproduction direction and/or a reproduction position arbitrarily (herein, reproduction in the positive direction and at 1× speed is called “normal reproduction”; on the contrary to that, methods of reproducing at other speeds and in the reverse direction may sometimes be called “special reproduction”, collectively), through the remote controller operations by the user, during the reproducing.
  • a high-speed search, such as in the positive direction and at 2× speed or 10× speed
  • a high-speed search in the reverse direction, or reproduction executing output/display at 1.3× speed accompanied by the audio output (which may also be called “quick reproduction”), etc.
  • In FIG. 75 is shown a diagram of an example of the process when executing the high-speed search at 2× speed.
  • An example of the configuration when extracting the video data from the stream is shown in an upper portion of FIG. 75 .
  • In FIG. 75 is shown an example of 3D video data which is stored in one (1) piece of the picture, being divided into the video for use of the left-side eye and the video for use of the right-side eye, like the “Side-by-Side” method or the “Top-and-Bottom” method (this may also be called the “2-viewpoints same ES method”), for example.
  • the video data is made up of a series of GOP structures, wherein one (1) GOP contains a line of still pictures for 0.5 second, for example. Also, it is assumed that one (1) piece of the “I” picture is contained in one (1) GOP.
  • the high-speed search at 2× speed is executed by displaying the “I” pictures of all GOPs at a time-interval two (2) times faster than the original one; for example, after displaying the “I” picture of GOP 1 , the “I” picture of GOP 2 is displayed after 0.25 second elapses, the “I” picture of GOP 3 after 0.5 second . . . .
  • In a display example 2 , after displaying the “I” picture of GOP 1 , the time-period for displaying one (1) piece of the “I” picture is elongated, while instead the displayed pictures are thinned out to every second one; for example, the “I” picture of GOP 3 is displayed after 0.5 second elapses, the “I” picture of GOP 5 after 1 second, and with this, the 2× speed display can be achieved.
  • the high-speed search can be achieved, arbitrarily, by adjusting the number of pieces of the still pictures to be displayed and/or the time-period for displaying per one (1) piece thereof.
  • the double-speed can be achieved at an arbitrary factor, in the similar manner, by adjusting the number of pieces of the still pictures to be displayed and/or the time-period for displaying one (1) piece thereof. This is also true for the high-speed search in the reverse direction. Also, there is no necessity for the time-period for displaying one (1) piece of the still picture to be constant.
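Assuming, as in FIG. 75, one "I" picture per 0.5-second GOP, both display examples reduce to one scheduling rule: either show every "I" picture at a shortened interval, or thin the pictures out and keep the interval. A sketch (the function and parameter names are ours):

```python
# Both 2x schedules of FIG. 75 reduce to one rule, assuming one "I"
# picture per 0.5-second GOP: display example 1 shows every "I"
# picture at a halved interval; display example 2 shows every second
# "I" picture at the original interval. Names are assumptions.

GOP_SECONDS = 0.5

def schedule(speed, gop_count, skip_gops=False):
    """Yield (wall_clock_seconds, gop_index) for a high-speed search."""
    step = int(speed) if skip_gops else 1   # GOPs advanced per shown picture
    interval = GOP_SECONDS * step / speed   # display period per picture
    t = 0.0
    for gop in range(0, gop_count, step):
        yield (t, gop)
        t += interval
```

With `speed=2`, `skip_gops=False` reproduces display example 1 (every GOP, 0.25 s apart), and `skip_gops=True` reproduces display example 2 (every second GOP, 0.5 s apart).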
  • a control, such as treating a mute process or the like, may be executed on the output of the decoder portion, in particular on the audio output so that it is not outputted, thereby preventing an unnatural output in which the video and the audio are not synchronized.
  • the stream reproduction time and the data readout position on the recording medium are determined uniquely, depending on the position information referred to when reading out the data, which will be explained next.
  • By referring to the position information when reproducing, the reproduction time-interval at the data readout position can be grasped; therefore it is possible to adjust the number of pieces of the still pictures and the time-period for displaying one (1) piece thereof, even if the time-interval of each GOP in the GOP configuration is not uniform at, e.g., 0.5 second as shown in FIG. 75 , or if plural “I” pictures (or pictures which can be displayed alone) are included in one (1) GOP, and therefore it is possible to achieve the high-speed reproduction at an arbitrary double-speed.
  • In FIG. 76 are shown examples of the clip information data and the data readout position information.
  • In FIG. 76A is shown an example of the data configuration of a clip information file.
  • stream format information indicates a format of the stream, such as, MPEG2-TS, for example.
  • version information indicates a version of the regulation at the time when this clip information file is recorded.
  • sequence information start address indicates a starting position of “sequence information” which will be mentioned later.
  • program information start address indicates a starting position of “program information” which will be mentioned later.
  • characteristic information start address indicates a starting position of “characteristic information” which will be mentioned later.
  • clip mark information start address indicates a starting position of “clip mark information” which will be mentioned later.
  • maker unique information start address indicates the starting positions of those information, respectively.
  • clip information indicates the characteristics of the program corresponding to this clip information, such as how it is encoded, whether conversion is made or not when receiving the broadcasting, what the bit rate is, etc., for example.
  • sequence information includes, if a portion continuous in the STC values of the stream is defined as one (1) sequence, the information of how many sequences are included in the program corresponding to this clip information, the PCR values of those sequences, the PTS values at the starting position and the ending position, etc.
  • program information records the number of program(s) included in the program corresponding to this clip information, and holds therein an ID of the stream at the time when each program is broadcasted, and further detailed information of all ESs included in that program (e.g., the video format, frame rate and aspect ratio if it is a video ES, or the sampling frequency if it is an audio ES).
  • In the clip mark information is recorded the information of the chapter(s) included in the program corresponding to this clip information, and the content thereof is analogous to the chapter information shown in FIG. 60 .
  • maker unique information is a data area, which can be used for a maker manufacturing the receiving apparatus to input unique information therein.
  • In the characteristic information are recorded the following: “stream number” indicating the number of video ESs; “stream PID” indicating the PID of each video ES; “video stream format information” for determining the data format at the top of a GOP (for example, data of the MPEG2 format, wherein a GOP header and a sequence header are provided before the “I” picture, etc.); the number of pieces of data of the position information (abstract), which will be mentioned later; and “position information start address” indicating the starting position of the position information; and thereafter is recorded the substance of the position information for each video ES.
  • FIG. 76B shows an example of the data configuration of the position information.
  • the position information for indicating the readout position is managed by two (2) sets of tables: the position information (details) and the position information (abstract).
  • In the position information (abstract) table are held the data position information on the recording medium and an abstract PTS value, which records therein the upper bits (for example, 14 bits) of the PTS value applicable in calculation of the reproduction time of that data.
  • each data holds a reference value showing a relationship with the data of the position information (details) table.
  • the data of the position information (details) table corresponds to an actual readout position (for example, a packet at the top of the “I” picture, etc.), and the table holds therein the data position information (for example, upper 17 bits) and a detailed PTS value applicable in calculation for the reproduction time of that data.
  • the detailed PTS value may not include everything down to the lower bit(s) completely, but may be enough in a format reserving the middle 11 bits thereof, for example, if sufficient time resolution can be maintained with it.
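Under the bit widths quoted above (upper 14 bits in the abstract table and middle 11 bits in the details table, out of a 33-bit PTS), an approximate PTS can be recombined as below; the exact bit positions are an assumption consistent with those widths.

```python
# Under the quoted widths (a 33-bit PTS, upper 14 bits in the abstract
# table, middle 11 bits in the details table), an approximate PTS can
# be recombined as below; the exact bit positions are an assumption
# consistent with those widths. Dropping the low 8 bits still leaves
# about 2.8 ms resolution at the 90 kHz PTS clock.

PTS_BITS, ABSTRACT_BITS, DETAIL_BITS = 33, 14, 11
LOW_BITS = PTS_BITS - ABSTRACT_BITS - DETAIL_BITS  # 8 discarded bits

def approx_pts(abstract14, detail11):
    """Recombine the two table fields into an approximate PTS value."""
    return (abstract14 << (DETAIL_BITS + LOW_BITS)) | (detail11 << LOW_BITS)
```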
  • the configuration of the data should not be limited to that of the present embodiment, and into the position information (details) may be recorded additional information, for example, PID or the data size of “I” picture of the corresponding video ES, the format of the video data (e.g., being MPEG 2 format or H.264 format, etc.) together with.
  • When the recorded program is 3D of the 2-viewpoints separate ES transmission method, the “stream number” of the “characteristic information” is “2”, and this means a condition that the position information of a main-viewpoint video and the position information of a sub-viewpoint video are recorded in series.
  • In FIG. 77 is shown an example of high-speed search processing of the 3D video, in a manner easy for the user to view/listen to, with respect to the configuration of the 2-viewpoints same ES video data shown in FIG. 75 .
  • In a display example 1 is shown a case where the high-speed search at 2× speed is achieved by conducting a skip (i.e., a change of the reproduction position, which may also be called a “jump”) after conducting the normal reproduction for one (1) second.
  • the skip can be made with using the clip information mentioned above.
  • the time-period of the normal reproduction is determined to be one (1) second in FIG. 77 , but it may be executed for a period of an arbitrary number of second(s).
  • the high-speed search of 2× speed can also be obtained, for example, by determining the time-period of the normal reproduction at two (2) seconds and skipping to GOP 9 , two (2) seconds later.
  • the longer the time-period of the normal reproduction, the easier the 3D recognition for the user; but since it also brings the time-interval of one (1) skip to be large, there is a probability of deteriorating the accuracy of finding out a desired scene.
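Display example 1 generalizes to: reproduce normally for a chosen period, then skip over (speed − 1) times that period, so that stream time advances at `speed` times wall-clock time. A sketch with hypothetical names; the actual skip targets come from the clip/position information.

```python
# Display example 1 generalized: reproduce normally for a chosen
# period, then skip over (speed - 1) times that period so stream time
# advances at `speed` times wall-clock time. Names are hypothetical;
# real skip targets come from the clip/position information.

def playback_segments(speed, normal_seconds, total_seconds):
    """Yield the (start, end) stream-time segments reproduced normally."""
    pos = 0.0
    while pos < total_seconds:
        end = min(pos + normal_seconds, total_seconds)
        yield (pos, end)
        pos = end + normal_seconds * (speed - 1)  # the skipped span
```

For a 2× search with 1-second normal reproduction, the stream segments 0-1 s, 2-3 s, 4-5 s, ... are played back-to-back.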
  • In a display example 2 is shown a case of achieving the high-speed search of 2× speed by executing the skip after displaying the still picture for one (1) second.
  • the skip can be made with using the clip information mentioned above.
  • the time-period for displaying one (1) piece of the still picture is determined at one (1) second, but it may be executed for a period of an arbitrary number of second(s).
  • processing is made, such as displaying the “I” picture contained in the GOP two (2) seconds later, . . . , and thereby the high-speed search of 2× speed can be achieved.
  • the time-interval for one (1) skip can be shortened, so as to increase the number of pieces of the pictures to be displayed; therefore, it is possible to improve the accuracy of finding out the desired scene, thereby increasing the usability thereof.
  • the plural kinds of processing shown above may be made selectable by the user from a setup menu, etc. This may be achieved by selecting the format of the display example 1 or the format of the display example 2 as the method for displaying on the scene of 3D video, or by designating the time, such as how many second(s) for one (1) time of the normal reproduction, when applying the format of the display example 1 .
  • the display may be made selectable, between the display of 2D picture or the display of 3D video, in the high-speed search of the video data including the 3D video therein.
  • the video output is controlled in such a manner that the decoded picture data accompanying the “B” picture or the “P” picture will not be outputted, appropriately, depending on the speed.
  • the audio data is also controlled so as not to be outputted partially, at an interval depending on the speed, for example, thereby outputting the audio fitting to the video.
  • An example is shown in FIG. 78 of the processing for executing the high-speed search on the content including an adequate 3D scene therein, by treating a different process thereon, within the apparatus, depending on whether the scene to be reproduced is 2D or 3D.
  • detection is made of the encoded 3D program details descriptor, which is stored in the user data area and/or the additional information area, and the process advances to S 7802 .
  • this detection may be executed within the multiplex/demultiplex portion 29 or the program information analyze portion 54 , and the data to be detected may be the program characteristics, which are included in the content descriptor shown in FIG. 54 .
  • In S 7802 , from the information obtained in S 7801 , it is determined within the system control portion 51 whether the scene on which the decoding is executed at present is 2D or 3D. If the result determined is 3D, the process advances to S 7803 ; on the contrary, if 2D, it advances to S 7804 . The determination of whether it is 2D or 3D may also be made by other method(s) than that.
  • the system control portion 51 instructs the processing block(s) relating thereto to set up the special reproduction depending on the 3D display, and each processing block executes the necessary processes, thereby completing the processing.
  • the system control portion 51 instructs the processing block(s) relating thereto to set up the special reproduction depending on the 2D display, and each processing block executes the necessary processes, thereby completing the processing.
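The S 7801 to S 7804 flow amounts to a per-scene dispatch; in this hedged sketch the descriptor lookup is reduced to a dictionary key, which is an illustration rather than the apparatus's actual descriptor parsing.

```python
# Hedged sketch of the S 7801 - S 7804 flow: classify the scene being
# decoded from the presence of the encoded 3D program details
# descriptor, then choose the matching special-reproduction setup.
# Reducing descriptor detection to a dict key is an illustration.

def setup_special_playback(scene_info):
    """Return which special-reproduction setup path is taken."""
    is_3d = scene_info.get("3d_program_details_descriptor") is not None
    return "3d_special_reproduction" if is_3d else "2d_special_reproduction"
```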
  • FIG. 79 is a sequence block diagram for showing an example of setting up the special reproduction, to be executed in S 7803 , when displaying moving pictures continuing for a time-period of several seconds, in the format thereof, as is shown by the display example 1 in FIG. 77 .
  • In S 7901 , the system control portion 51 keeps the decoding method in the video decode portion 30 at the setup for executing the same decoding process as that for the normal reproduction, as it is, and the process advances to S 7902 .
  • the system control portion 51 instructs the recording/reproducing control portion 58 to change the readout/transmission method of the recorded program data, and to output data including the continuous video data for a time-period of several seconds (for example, for 1 second) at one (1) time.
  • the recording/reproducing control portion 58 is able to determine the readout position of the desired data for a time-period of several seconds, by referring to the GOP information shown in FIG. 76 .
  • the process is completed with a provision that a setup is made to repeat the processing, such as determining the next readout position depending on the double-speed, after reading out the necessary data sequentially and outputting it to the video decode portion 30 .
  • Timing for transmitting each data may be achieved, for example, by the recording/reproducing control portion 58 managing it while referring to an operation clock not shown in the figure, with the video decode portion 30 executing the sequential decoding processes of the input data; or it may be a process in which the recording/reproducing control portion 58 transmits data at high speed when a vacancy is generated, depending on the remaining volume of data not yet decoded in the video decoder portion 30 (also called “flow control”), while the video decoder portion 30 manages the display timing.
  • the audio output may be executed together therewith.
  • FIG. 80 is a sequence block diagram for showing an example of setting up the special reproduction to be executed in S 7803 when displaying one (1) piece of still picture for a time-period of several seconds, as is shown by the display example 2 in FIG. 77 .
  • the system control portion 51 changes the decoding process method in the video decoder portion 30 to decode only the “I” picture, and advances the process to S 8002 .
  • the system control portion 51 instructs the recording/reproducing control portion 58 to change the readout/transmission method of the recorded program data, so that one (1) piece of the “I” picture data is outputted at one (1) time.
  • Since the video decoder portion is set to process only the “I” picture data in S 8001 , transmission of data other than that “I” picture data does not matter with the operation thereof.
  • the recording/reproducing control portion 58 can determine the starting position and the ending position for reading out the desired “I” picture by referring to the position information shown in FIG. 76 .
  • the process is completed with a provision that a setup is made to repeat the processing, such as determining the next readout position depending on the double-speed and the time-period of displaying one (1) piece of the “I” picture, after reading out the “I” picture and transmitting it to the video decoder portion 30 .
  • the time interval for displaying one (1) piece of the “I” picture is arbitrary.
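Reading out exactly one "I" picture per step relies on the position information giving each picture's start; a simplified sketch follows, assuming the end of one "I" picture can be approximated by the next entry's offset (the real tables can refine this with recorded sizes). Names are ours.

```python
# Simplified sketch of locating each "I" picture for one-per-step
# readout: position-info entries give byte offsets, and here the end
# of one "I" picture is approximated by the next entry's offset (the
# real tables can refine this with recorded sizes). Names assumed.

def i_picture_ranges(entries):
    """Yield the (start, end) byte range of each "I" picture."""
    for cur, nxt in zip(entries, entries[1:]):
        yield (cur["offset"], nxt["offset"])
```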
  • the sequence set up for the special reproduction to be executed in S 7803 differs in the content of the process depending on the video format recorded.
  • When the recorded video is a program of the 2-viewpoints separate ES transmission method, it is enough to execute the process similar to that shown in FIG. 80 ; however, in S 8002 , the recording/reproducing control portion 58 selects only the main ES to output, and completes the process after making a setup to provide the “I” picture data at a necessary time interval.
  • An example of a processing sequence when the recorded video is of the 2-viewpoints same ES transmission method is shown in FIG. 81 .
  • In S 8101 , a setup is made in the video conversion process portion 32 to convert the video into that for 1-viewpoint display if it is the video for 2-viewpoints, for example of the “Side-by-Side” method, etc., and the process advances to S 8102 .
  • In S 8102 , a setup similar to that of S 8002 is made, and the process is completed.
  • the processing may be executed in such a manner that the interval for displaying the “I” picture comes to a fine display interval similar to that when reproducing the 2D, such as, 100 milliseconds, etc., for example.
  • a control is executed; i.e., stopping the 3D viewing/listening function of the 3D view/listen assisting device 3502 , or showing the same picture for the right-side eye and the left-side eye (e.g., displaying “M” picture, again, at the timing for displaying “S” picture shown in FIG. 37A , and displaying “L” picture, again, at the timing for displaying “R” picture shown in FIG. 39A ) while keeping the 3D viewing/listening function of the 3D view/listen assisting device 3502 as it is.
  • the receiving apparatus 4 shown in FIG. 25 can cooperate with the recording/reproducing apparatus including the recording medium and having the recording function, and/or the reproducing apparatus having the reproduction function, via the high-speed digital I/F 46 , or a digital I/F not shown in the figure, for example.
  • a block diagram of an example of the structure thereof is shown in FIG. 82 .
  • a recording/reproducing apparatus 8201 is connectable with the receiving apparatus, and has the recording function by itself. Accordingly, there can be considered a case where the program recorded by the recording/reproducing apparatus 8201 is outputted in the digital form, and it is inputted to the receiving apparatus to be viewed/listened. (Hereinafter, although explanation will be made on an example of connection with the recording/reproducing apparatus 8201 , but even with the reproducing apparatus, it is possible to obtain an equivalent effect through the processing equivalent thereto.
  • the digital I/F may be, for example, an HDMI (a registered trademark; an abbreviation of High-Definition Multimedia Interface) connection.
  • the receiving apparatus 4 shown in FIG. 25 can cooperate with a different receiving apparatus, a recording/reproducing apparatus, a reproducing apparatus, etc., via the network I/F 25 , like the configuration example shown in FIG. 71 (for example, viewing/listening of content via the network, as regulated by the DLNA (a registered trademark; an abbreviation of Digital Living Network Alliance) guideline, and/or viewing/listening of content on VOD (an abbreviation of Video On Demand)).
  • control of the data output is executed within the recording/reproducing apparatus 7103 , while the decoding/video conversion processes are executed within the receiving apparatus 4 .
  • Fundamental steps of the processing follow the series of controls described in FIGS. 78 to 81 .
  • the process moves to S 7801 ; also, fitting to this, a communication command is transmitted to the recording/reproducing apparatus 7103 via the network.
  • the video decoder portion 30 detects the encoded 3D program details descriptor stored in the user data area or the additional information area, and the process advances to S 7802 . However, this detection may be executed in the multiplex/demultiplex portion 29 or the program information analyze portion 54 , and the data to be detected may be the program characteristics included within the content descriptor shown in FIG. 54 .
  • in S 7802 , it is determined within the system control portion 51 , from the information obtained in S 7801 , whether the scene currently being decoded is 2D or 3D. If the result of the determination is 3D, the process advances to S 7803 , while if it is 2D, the process advances to S 7804 . However, the determination of whether it is 2D or 3D may be conducted by other methods.
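The per-scene 2D/3D determination of S 7801/S 7802 can be sketched roughly as below; the descriptor is modeled as a plain dictionary, and the tag and field names are assumptions for illustration only.

```python
# Hedged sketch: deciding whether the scene currently being decoded is
# 2D or 3D from a (hypothetical) 3D program details descriptor carried
# in the video stream's user data area. Field names are assumptions.

def scene_is_3d(user_data_descriptors):
    """Return True if the most recent 3D program details descriptor
    marks the current scene as 3D; default to 2D when none is found."""
    for desc in reversed(user_data_descriptors):  # newest entry first
        if desc.get("tag") == "3d_program_details":
            return desc.get("3d_2d_type") == "3d"
    return False  # no descriptor present: treat the scene as 2D
```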
  • the system control portion 51 instructs the related processing blocks to set up the special reproduction depending on the 3D display, each processing block executes the necessary processing, and the sequence is completed.
  • the system control portion 51 instructs the related processing blocks to set up the special reproduction depending on the 2D display, each processing block executes the necessary processing, and the sequence is completed.
  • when displaying moving pictures continuing for a time-period of several seconds, as shown by the display example 1 in FIG. 77 , in S 7901 the video decoder portion 30 keeps the setup of executing a process equivalent to that for the normal reproduction, and the process advances to S 7902 .
  • the readout/transmission method for the recorded program data is changed within the recording/reproducing apparatus 7103 , so as to output the data including the video data for a continuous time-period of several seconds at one time.
  • the number of seconds of the continuous data to be read out is arbitrary.
  • in the recording/reproducing apparatus 7103 , if the position information is recorded in the format shown in FIGS. 75 and 76 , the readout position of the data continuing for the desired time-period of several seconds can also be determined by referring to that data; therefore the process is completed upon making a setup to repeat the processing of determining the next readout position depending on the double-speed value.
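The readout-position control just described, determining the next readout position from the recorded position information depending on the double-speed value, might look like the following sketch; the table layout, chunk duration, and parameter names are assumptions.

```python
# Illustrative sketch (names assumed): using recorded position information
# (presentation time -> byte offset, as in FIGS. 75/76) to pick the next
# readout position for N-times-speed search.
import bisect

def next_readout_offset(position_table, current_time_ms, speed,
                        chunk_duration_ms=2000):
    """position_table: sorted list of (time_ms, byte_offset) pairs.
    Advance stream time by speed * chunk_duration, then look up the
    nearest recorded entry at or before that target time."""
    target = current_time_ms + speed * chunk_duration_ms
    times = [t for t, _ in position_table]
    i = bisect.bisect_right(times, target) - 1
    i = max(i, 0)
    return position_table[i]  # (time_ms, byte_offset) to read from next
```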
  • a series of processes is executed, in a manner equivalent to that when receiving a broadcast: from division of the data received on the network I/F 25 into the video/audio, etc., sequentially, within the multiplex/demultiplex portion 29 , through the decoding process within the video decoder portion 30 and the video conversion process within the video conversion process portion 32 , up to displaying on the display 47 .
  • the system control portion 51 changes the decoding process method of the video decoder portion 30 to decode only the “I” picture, and the process advances to S 8002 .
  • the recording/reproducing apparatus 7103 changes the readout/transmission method for the recorded program data, thereby outputting the “I” picture data one piece at a time.
  • since the video decoder portion is set up in S 8001 to process only the “I” picture, its operation is not affected even if data other than the “I” picture is transmitted. Processing only the “I” picture data enables a reduction of the processing load. If the recording/reproducing apparatus 7103 records the position information at the time of recording, as shown in FIGS. 75 and 76 , then by referring to it, the starting position and the ending position for reading out the “I” picture can be determined.
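If the position information is modeled as per-picture records, determining the byte range of one “I” picture reduces to a lookup, as in this hypothetical sketch (the record field names are assumptions):

```python
# Sketch under assumptions: the position information of FIGS. 75/76 is
# modeled as records giving each "I" picture's start offset and size,
# so the apparatus can read exactly the bytes of one I picture.

def i_picture_byte_range(position_records, index):
    """Return the (start, end) byte offsets of the index-th I picture."""
    rec = position_records[index]
    start = rec["offset"]
    end = start + rec["size"]          # end of this picture's data
    return start, end
```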
  • the sequence for setting up the special reproduction to be executed in S 7803 differs in the content of its processing depending on the recorded video format. If the recorded video is a program of the 2-viewpoints separate ES transmission method, it is sufficient to execute processes similar to those shown in FIG. 80 ; however, in S 8002 , the recording/reproducing control portion 58 selects only the main ES so as to output it, and after making a setup to provide the “I” picture at the necessary time-interval, completes the process.
  • an example of the processing sequence when the recorded video is of the 2-viewpoints same ES transmission method is shown in FIG. 81 .
  • in S 8101 , a setup is made within the video conversion process portion 32 to convert the video, if it is for two (2) viewpoints, such as of the “Side-by-Side” method, etc., for example, into that for displaying one (1) viewpoint for use of the 2D view/listen, and the process advances to S 8102 .
  • in S 8102 , a setup similar to that in S 8002 is made, and thereafter the process is completed.
  • the processing may be done so that the interval for displaying the “I” picture becomes a fine display interval equivalent to that when reproducing the 2D, such as 100 milliseconds, for example.
  • a control is executed: either stopping the 3D viewing/listening function of the 3D view/listen assisting device 3502 , or showing the same picture to the right-side eye and the left-side eye (e.g., displaying the “M” picture again at the timing for displaying the “S” picture shown in FIG. 37A , and displaying the “L” picture again at the timing for displaying the “R” picture shown in FIG. 39A ) while keeping the 3D viewing/listening function of the 3D view/listen assisting device 3502 as it is.
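The 1-viewpoint conversion set up in S 8101 can be illustrated, under simplifying assumptions (a frame as a list of pixel rows, nearest-neighbor scaling), by the following sketch:

```python
# Minimal sketch (pure Python): converting a "Side-by-Side" 3D frame to a
# 2D frame by keeping the left-eye half and stretching it back to full
# width, in the spirit of what the video conversion process portion 32
# is described as doing. Real hardware would use proper filtering.

def side_by_side_to_2d(frame):
    """frame: list of rows; each row is a list of pixels whose left half
    is the left-eye view and right half the right-eye view."""
    out = []
    for row in frame:
        half = row[: len(row) // 2]     # keep the left-eye view
        stretched = []
        for px in half:                  # nearest-neighbor 2x horizontal scale
            stretched.extend([px, px])
        out.append(stretched)
    return out
```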
  • an example of the sequence of this processing is shown in FIG. 83 .
  • in S 8301 , if the 3D identification information of the corresponding reproduction content indicates that it contains the 3D video data, the process advances to S 8302 ; on the contrary, if it does not contain the 3D video data, the process advances to S 8303 .
  • as the method for referring to the identification information, rather than accessing the file every time, if the file is accessed only one (1) time before starting the reproduction so as to record the information as a local variable, thereafter the value of the 3D/2D identification information can be determined by referring to that variable; this has the advantage of lightening the processing load.
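The caching strategy just described, reading the 3D identification information once before reproduction and consulting a local variable thereafter, can be sketched as below; the accessor interface and the field name are assumptions.

```python
# Hedged sketch: the identification information is read from the content
# file only once, before reproduction starts, and cached locally; later
# 2D/3D determinations consult the cache instead of the file. The
# accessor interface and "has_3d_video" field name are assumptions.

def make_3d_checker(read_identification_info):
    """read_identification_info(): performs the (expensive) file access
    and returns the content's identification information as a dict."""
    cached = read_identification_info()        # single file access up front
    has_3d = bool(cached.get("has_3d_video"))  # keep only what is needed

    def content_has_3d():
        return has_3d                          # no further file access
    return content_has_3d
```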
  • in S 8302 , a setup is made to execute the processing for the 3D video data, which was explained in FIGS. 79 to 81 mentioned above, irrespective of whether the video at the reproduction display position is the 3D video data or the 2D video data, and then the process is completed.
  • in this case, a process is executed in the video conversion process portion 32 for converting the input video into the 2D video if it is the 3D video of, for example, the “Side-by-Side” method or the “Top-and-Bottom” method, as explained in FIG. 40 ; no conversion process is done if the input video is the 2D video.
  • in S 8303 , a process like that shown in FIG. 78 is conducted, so as to make a setup for executing the process depending on whether the video data at the reproduction position is the 3D video data or the 2D video data, and the process is completed.
  • also in this case, a process is executed in the video conversion process portion 32 for converting the input video into the 2D video if it is the 3D video of, for example, the “Side-by-Side” method or the “Top-and-Bottom” method, as explained in FIG. 40 ; no conversion process is done if the input video is the 2D video.
  • the determination may be made based on whether the program name (which may also be called a “program title”), included in the program-common information, is a title having a meaning that it includes the 3D video.
  • the determination of whether the 3D video scene is included or not may be made by methods other than that.
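A minimal sketch of the title-based determination, assuming an illustrative list of marker words (not specified in the patent):

```python
# Hypothetical sketch: scan the program name from the program-common
# information for markers implying 3D content. The marker list is an
# assumption for illustration, not from the patent.

def title_suggests_3d(program_name):
    """True if the program name implies the program includes 3D video."""
    markers = ("3d", "stereoscopic", "立体")   # assumed marker words
    name = program_name.lower()
    return any(m in name for m in markers)
```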
  • since the reproduction content of the 2-viewpoints same ES transmission method does not hold the 3D program details descriptor in the user data area or the additional information area thereof, for the receiving apparatus 4 it is sometimes difficult to determine 2D/3D of the reproduction scene. Then, a possibility can be considered that the 2D video is displayed in the 3D format, in particular when switching occurs between the 2D video and the 3D video during the high-speed search.
  • in such a case, the display format during the special reproduction may be treated as the 2D. By doing this, the possibility of showing an uncomfortable video to the user upon an erroneous display format is reduced, and there is obtained an effect of making the special reproduction easily viewable/listenable.
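The fallback policy above, treating the display format as 2D whenever the scene type cannot be determined during special reproduction, can be sketched as (function and value names assumed):

```python
# Sketch of the 2D-fallback policy: during special reproduction, if the
# scene's 2D/3D type is undeterminable (e.g., 2-viewpoints same ES
# content with no 3D program details descriptor), force 2D display so
# that 2D video is never shown in a 3D format.

def choose_display_format(scene_type, special_reproduction):
    """scene_type: '2d', '3d', or None when it cannot be determined."""
    if special_reproduction and scene_type is None:
        return "2d"                     # safe default during trick play
    return scene_type or "2d"           # otherwise follow the scene type
```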

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Library & Information Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Television Signal Processing For Recording (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
US13/327,224 2011-02-04 2011-12-15 Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method Abandoned US20120201515A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/023,319 US9094668B2 (en) 2011-02-04 2013-09-10 Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011022253A JP5595946B2 (ja) 2011-02-04 2011-02-04 デジタルコンテンツ受信装置、デジタルコンテンツ受信方法、およびデジタルコンテンツ送受信方法
JP2011-022253 2011-09-29

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/023,319 Continuation US9094668B2 (en) 2011-02-04 2013-09-10 Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method

Publications (1)

Publication Number Publication Date
US20120201515A1 true US20120201515A1 (en) 2012-08-09

Family

ID=46588168

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/327,224 Abandoned US20120201515A1 (en) 2011-02-04 2011-12-15 Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method
US14/023,319 Active 2031-12-31 US9094668B2 (en) 2011-02-04 2013-09-10 Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/023,319 Active 2031-12-31 US9094668B2 (en) 2011-02-04 2013-09-10 Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method

Country Status (3)

Country Link
US (2) US20120201515A1 (zh)
JP (1) JP5595946B2 (zh)
CN (2) CN105611210B (zh)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6518410B2 (ja) * 2014-07-03 2019-05-22 株式会社ソニー・インタラクティブエンタテインメント コンテンツ管理装置およびコンテンツ管理方法
CN104486757A (zh) * 2014-12-12 2015-04-01 上海斐讯数据通信技术有限公司 移动终端的语音加解密系统和语音加解密方法
CN107451185B (zh) * 2017-06-22 2022-03-04 重庆缘溪行文化传媒有限公司 录音方法、朗读系统、计算机可读存储介质和计算机装置
JP6513854B2 (ja) * 2018-04-10 2019-05-15 三菱電機株式会社 映像再生装置および映像再生方法
US11593967B2 (en) * 2020-01-08 2023-02-28 Samsung Electronics Co., Ltd. Attribute transfer in V-PCC

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100150523A1 (en) * 2008-04-16 2010-06-17 Panasonic Corporation Playback apparatus, integrated circuit, and playback method considering trickplay
US20110116771A1 (en) * 2009-11-16 2011-05-19 Sony Corporation Information processing apparatus, information processing method, display control apparatus, display control method, and program

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3594569B2 (ja) 2001-06-27 2004-12-02 三洋電機株式会社 ディジタル放送受信装置
EP2262273A3 (en) 2002-04-25 2013-12-04 Sharp Kabushiki Kaisha Image data creation device, image data reproduction device, and image data recording medium
US8290603B1 (en) * 2004-06-05 2012-10-16 Sonos, Inc. User interfaces for controlling and manipulating groupings in a multi-zone media system
JP4393151B2 (ja) * 2003-10-01 2010-01-06 シャープ株式会社 画像データ表示装置
KR100700814B1 (ko) * 2005-07-07 2007-03-27 엘지전자 주식회사 디지털 비디오 기기에서의 텍스트 파일 재생장치 및 방법
KR100813978B1 (ko) * 2006-02-22 2008-03-17 삼성전자주식회사 멀티미디어 데이터를 기록 및 재생하는 방법 및 장치
JP2008270939A (ja) * 2007-04-17 2008-11-06 Victor Co Of Japan Ltd 動画像配信システム並びにサーバー装置及びクライアント装置
KR101480186B1 (ko) * 2007-12-10 2015-01-07 삼성전자주식회사 2d 영상과 3d 입체영상을 포함하는 영상파일을 생성 및재생하기 위한 시스템 및 방법
EP2326101B1 (en) * 2008-09-18 2015-02-25 Panasonic Corporation Stereoscopic video reproduction device and stereoscopic video display device
KR20110063615A (ko) 2008-09-30 2011-06-13 파나소닉 주식회사 3d 영상이 기록된 기록매체, 3d 영상을 재생하는 재생장치 및 시스템 lsi
WO2010095381A1 (ja) * 2009-02-20 2010-08-26 パナソニック株式会社 記録媒体、再生装置、集積回路
WO2010134665A1 (ko) * 2009-05-18 2010-11-25 (주)엘지전자 입체영상에 대한 3d 모드 선택이 가능한 입체영상 재생 장치 및 방법
JP4587237B1 (ja) * 2009-06-17 2010-11-24 Necカシオモバイルコミュニケーションズ株式会社 端末装置及びプログラム
JP5250491B2 (ja) * 2009-06-30 2013-07-31 株式会社日立製作所 記録再生装置
JP2011097451A (ja) * 2009-10-30 2011-05-12 Fujifilm Corp 3次元画像表示装置及び方法
CN101841692B (zh) * 2010-04-23 2011-11-23 深圳市茁壮网络股份有限公司 视频流快进快退的方法
US20130215240A1 (en) * 2010-05-28 2013-08-22 Sadao Tsuruga Receiver apparatus and output method
JP5473842B2 (ja) * 2010-09-09 2014-04-16 三菱電機株式会社 映像再生方法及び装置、並びに映像表示方法及び装置、並びにプログラム及び記録媒体
KR101675119B1 (ko) * 2010-09-28 2016-11-22 삼성전자 주식회사 3차원 사용자 인지 정보를 표시하기 위한 데이터스트림 생성 방법 및 장치, 데이터스트림 재생 방법 및 장치
US20120210229A1 (en) * 2011-02-16 2012-08-16 Andrew Bryant Color workflow
KR101164379B1 (ko) * 2011-08-01 2012-08-07 민병철 사용자 맞춤형 컨텐츠 제작이 가능한 학습 장치 및 이를 이용한 학습 방법


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110099285A1 (en) * 2009-10-28 2011-04-28 Sony Corporation Stream receiving device, stream receiving method, stream transmission device, stream transmission method and computer program
US8704873B2 (en) * 2009-10-28 2014-04-22 Sony Corporation Receiving stream data which may be used to implement both two-dimensional display and three-dimensional display
US20120050508A1 (en) * 2010-08-30 2012-03-01 Samsung Electronics Co., Ltd. Three-dimensional image display apparatus and driving method thereof
US20130332880A1 (en) * 2012-06-07 2013-12-12 Samsung Electronics Co., Ltd. Apparatus and method for displaying
EP2717589A3 (en) * 2012-08-31 2014-09-03 Samsung Electronics Co., Ltd Display device, set-top box and method of determining stereoscopic video content
US20140112636A1 (en) * 2012-10-19 2014-04-24 Arcsoft Hangzhou Co., Ltd. Video Playback System and Related Method of Sharing Video from a Source Device on a Wireless Display
US20140143733A1 (en) * 2012-11-16 2014-05-22 Lg Electronics Inc. Image display apparatus and method for operating the same
US20150100986A1 (en) * 2013-10-08 2015-04-09 Wistron Corp. Controlling method for recording digital television programs
US20150316984A1 (en) * 2014-03-21 2015-11-05 Samsung Electronics Co., Ltd. Wearable device and method of operating the same
US20190392865A1 (en) * 2017-03-17 2019-12-26 Yamaha Corporation Content reproduction device, content reproduction method, and content reproduction system
US11004470B2 (en) * 2017-03-17 2021-05-11 Yamaha Corporation Content reproduction device, content reproduction method, and content reproduction system
TWI809477B (zh) * 2020-08-26 2023-07-21 新加坡商聯發科技(新加坡)私人有限公司 用於視訊靜音模式的多媒體設備和相關方法

Also Published As

Publication number Publication date
JP2012165086A (ja) 2012-08-30
JP5595946B2 (ja) 2014-09-24
CN102630023A (zh) 2012-08-08
CN105611210B (zh) 2018-12-04
CN105611210A (zh) 2016-05-25
US20140016909A1 (en) 2014-01-16
US9094668B2 (en) 2015-07-28

Similar Documents

Publication Publication Date Title
US9094668B2 (en) Digital content receiving apparatus, digital content receiving method and digital content receiving/transmitting method
US20130169762A1 (en) Receiving apparatus, receiving method and transmitting apparatus
US20120033034A1 (en) Receiving apparatus and receiving method
JP2013090020A (ja) 映像出力装置および映像出力方法
JP5481597B2 (ja) デジタルコンテンツ受信装置および受信方法
JP5952451B2 (ja) 受信装置および受信方法
JP5602539B2 (ja) 受信装置
JP2012049932A (ja) 受信装置
JP2012100181A (ja) 映像出力装置、映像出力方法、受信装置および受信方法
JP6185891B2 (ja) 受信装置および受信方法
WO2011151960A1 (ja) 受信装置および出力方法
JP5684415B2 (ja) デジタル放送信号受信装置およびデジタル放送信号受信方法
JP5933063B2 (ja) 受信装置および受信方法
JP5933062B2 (ja) 送受信システムおよび送受信方法
JP5980858B2 (ja) 受信装置および受信方法
JP5961717B2 (ja) 受信装置、受信方法、および送受信方法
JP2012015570A (ja) 受信装置、受信方法、および送受信方法
JP2013090019A (ja) 映像出力装置および映像出力方法
JP2016226008A (ja) 受信装置および受信方法
JP5156795B2 (ja) 表示装置及び表示方法
JP2011254276A (ja) 受信装置、受信方法、および送受信方法

Legal Events

Date Code Title Description
AS Assignment

Owner name: HITACHI CONSUMER ELECTRONICS CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KANEMARU, TAKASHI;TSURUGA, SADAO;OTSUKA, SATOSHI;SIGNING DATES FROM 20111213 TO 20111216;REEL/FRAME:027808/0022

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION