WO2014045614A1 - Video signal transmitting method, video signal receiving apparatus, and video signal receiving method - Google Patents

Video signal transmitting method, video signal receiving apparatus, and video signal receiving method

Info

Publication number: WO2014045614A1
Authority: WIPO (PCT)
Application number: PCT/JP2013/057571
Other languages: French (fr)
Inventor: Atsushi Hirota
Original Assignee: Kabushiki Kaisha Toshiba

Classifications

    • H04N21/2362 Generation or processing of Service Information [SI]
    • H04H20/33 Arrangements for simultaneous broadcast of plural pieces of information by plural channels
    • H04N21/234327 Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into layers, e.g. base layer and one or more enhancement layers
    • H04N21/816 Monomedia components thereof involving special video data, e.g. 3D video

Definitions

  • If it is determined in step S19 that the entry is completed (YES), the controller 59 determines in step S21 whether there is a target content. If it is determined that there is no target content (NO), the process advances to step S22, and the controller 59 causes the video display module 56 to display a message indicating that there is no target content.
  • If it is determined in step S21 that there is a target content (YES), the controller 59 causes the video display module 56 to display a target content selection screen in step S24, and determines in step S25 whether a content is selected.
  • When a content is selected, the controller 59 refers to "start_time_offset" and "start_time_offset_polarity" in the EIT information in step S26, performs time measurement in step S27, and determines in step S28 whether the offset time has elapsed.
  • The controller 59 then refers to the URI of the target content described in the EIT information in step S29, refers to "delivery_type" in the EIT information in step S30, and accesses, acquires, and reproduces the selected target content in step S31.
  • The original video signal is not limited to the 4K2K video format; the embodiment is also applicable to a 6K3K or 8K4K video signal.
  • The video encoding method to be applied is not limited to MPEG2 or HEVC.
  • The embodiment is applicable not only to differential transmission but also to general divisional transmission.
  • The above embodiment is also applicable to a method of displaying a 3D (three-dimensional) image by using two video signals, i.e., a right-eye video signal and a left-eye video signal.
  • For example, a first video signal is used as a left-eye video signal and a second video signal is used as a right-eye video signal; one video signal is displayed when displaying a 2D image, and both video signals are displayed when displaying a 3D image.
  • Alternatively, a first video signal is used as a left-eye video signal and a second video signal is used as a difference signal for generating a right-eye video signal from the first video signal; the first video signal is displayed when displaying a 2D image, and both video signals are displayed when displaying a 3D image.
  • The various modules of the systems described herein can be implemented as software applications, hardware and/or software modules, or components on one or more computers, such as servers. While the various modules are illustrated separately, they may share some or all of the same underlying logic or code.

Abstract

According to one embodiment, a video signal transmitting method includes causing a control information addition module to add, to each of first and second video signals, control information for associating the first and second video signals with each other, and causing a transmitter to transmit the first and second video signals to which the control information is added.

Description

DESCRIPTION
VIDEO SIGNAL TRANSMITTING METHOD, VIDEO SIGNAL
RECEIVING APPARATUS, AND VIDEO SIGNAL RECEIVING METHOD
Cross-Reference to Related Applications
This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2012-205016, filed September 18, 2012, the entire contents of which are incorporated herein by reference.
Field
Embodiments described herein relate generally to a video signal transmitting method, video signal
receiving apparatus, and video signal receiving method capable of transmitting or receiving a high-definition video signal.
Background
As is well known, high-definition television (HDTV), which corresponds to the display of a high-definition image called 2K1K consisting of 1,920 pixels in the horizontal direction and 1,080 pixels in the vertical direction, is currently the format most widely used in consumer high-definition digital television broadcast receivers.
On the other hand, ultra-HDTV (UHDTV) is also under extensive development. UHDTV
corresponds to the display of ultra-high-definition images such as an image called 4K2K consisting of 3,840 pixels in the horizontal direction and
2,160 pixels in the vertical direction and having a resolution four times as high as that of HDTV, and an image called 8K4K consisting of 7,680 pixels in the horizontal direction and 4,320 pixels in the vertical direction and having a resolution 16 times as high as that of HDTV.
Techniques for displaying ultra-high-definition images as described above are still under development, and there is strong demand for working out the details that will make them practical while taking user convenience into consideration. In particular, when distributing an ultra-high-definition video signal having a large information amount through a limited transmission path band, it is important not only to increase the transmission efficiency but also to improve the reliability of transmission.
Brief Description of the Drawings
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
FIG. 1 is an exemplary block diagram for
explaining an example of a content distribution system as an embodiment;
FIG. 2 is a block diagram for explaining an example of a video signal processor in a broadcasting station configuring the content distribution system according to the embodiment;
FIG. 3 is a block diagram for explaining an example of a content processor in the broadcasting station according to the embodiment;
FIG. 4 is a view for explaining an example of control information newly added in the broadcasting station according to the embodiment;
FIG. 5 is a schematic block diagram for explaining an example of a receiving terminal configuring the content distribution system according to the
embodiment;
FIG. 6 is a block diagram for explaining an example of a content restoration processor forming the receiving terminal according to the embodiment;
FIG. 7 is a block diagram for explaining an example of a video restoration processor forming the receiving terminal according to the embodiment;
FIG. 8 is a view for explaining another example of the control information newly added in the broadcasting station according to the embodiment;
FIG. 9 is a view for explaining details of the other example of the control information newly added in the broadcasting station according to the embodiment;
FIG. 10 is a view for explaining details of the other example of the control information newly added in the broadcasting station according to the embodiment;
FIG. 11 is a view for explaining details of the other example of the control information newly added in the broadcasting station according to the embodiment;
FIG. 12 is a flowchart for explaining a part of an example of a main processing operation performed by the receiving terminal according to the embodiment; and
FIG. 13 is a flowchart for explaining the rest of the example of the main processing operation performed by the receiving terminal according to the embodiment.
Detailed Description
Various embodiments will be described hereinafter with reference to the accompanying drawings.
In general, according to one embodiment, a video signal transmitting method includes causing a control information addition module to add, to each of first and second video signals, control information for associating the first and second video signals with each other, and causing a transmitter to transmit the first and second video signals to which the control information is added.
FIG. 1 is a schematic view showing an example of a content distribution system 11 to be explained in this embodiment. That is, in the content distribution system 11, a program content distributed by means of a radio broadcast signal as a medium from a broadcasting station 12 is received by a receiving terminal 13 and used to, for example, display an image and reproduce a sound.
Also, in the content distribution system 11, program contents are supplied by wired or wireless communication from the broadcasting station 12 to a server 14 and accumulated in it. The receiving terminal 13 can access the server 14 via a local area network (LAN) router 15 capable of wired or wireless communication, a network 16 such as a fixed Internet Protocol (IP) network, and a gateway 17.
Accordingly, the receiving terminal 13 can implement, for example, a so-called IP broadcasting function of acquiring, from the server 14, a program content distributed based on a preset program distribution schedule and, for example, displaying an image and reproducing a sound, and a so-called video on demand (VOD) function of acquiring a program content requested from the server 14 and, for example, displaying an image and reproducing a sound.
FIGS. 2 and 3 show transmission signal processing systems by which the broadcasting station 12 performs processing for transmission on a program content to be broadcast or communicated. FIG. 2 shows an example of a video signal processor 18 for performing signal processing for transmission on a video signal forming a program content.
That is, the video signal processor 18 includes an input terminal 19 to which an original video signal, for example, a video signal corresponding to 4K2K is supplied. This 4K2K video signal supplied to the input terminal 19 is supplied to a 4K/2K converter 20 and down-converted into a video signal corresponding to 2K1K, and this 2K1K video signal is extracted from an output terminal 21.
Also, the 2K1K video signal output from the 4K/2K converter 20 is supplied to a 2K/4K converter 22 and up-converted into a 4K2K video signal, and a subtracter 23 generates a difference signal between this 4K2K video signal and the 4K2K video signal supplied to the input terminal 19. This difference signal is extracted from an output terminal 24.
That is, the video signal processor 18 separates the 4K2K original video signal into the down-converted 2K1K video signal and the difference signal between the original video signal and the 4K2K video signal
obtained by up-converting the 2K1K video signal, and outputs the 2K1K video signal and difference signal.
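As a concrete illustration of this split, the following minimal Python sketch treats one 4K2K luma plane as a NumPy array. The actual down-conversion and up-conversion filters of the 4K/2K converter 20 and the 2K/4K converter 22 are not specified in this description, so simple 2x2 averaging and pixel repetition stand in for them.

import numpy as np

def down_convert_4k_to_2k(frame_4k: np.ndarray) -> np.ndarray:
    """4K/2K converter 20: average each 2x2 block into one 2K1K pixel."""
    h, w = frame_4k.shape
    return frame_4k.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def up_convert_2k_to_4k(frame_2k: np.ndarray) -> np.ndarray:
    """2K/4K converter 22 (and 90 in the receiver): repeat each pixel into a 2x2 block."""
    return frame_2k.repeat(2, axis=0).repeat(2, axis=1)

def split_video(frame_4k: np.ndarray):
    """Video signal processor 18: output the 2K1K base and the difference signal."""
    base_2k = down_convert_4k_to_2k(frame_4k)
    difference = frame_4k - up_convert_2k_to_4k(base_2k)  # subtracter 23
    return base_2k, difference

if __name__ == "__main__":
    original = np.random.rand(2160, 3840)        # one 4K2K plane
    base, diff = split_video(original)
    restored = up_convert_2k_to_4k(base) + diff  # what synthesizer 91 later does
    assert np.allclose(restored, original)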
FIG. 3 shows an example of a content processor 25 for performing signal processing for transmission on a program content containing the 2K1K video signal and difference signal separated by the video signal
processor 18. That is, the content processor 25 includes input terminals 26 and 27 to which the 2K1K video signal and difference signal output from the video signal processor 18 are respectively supplied.
The 2K1K video signal supplied to the input terminal 26 is supplied to and encoded by a Moving Picture Experts Group (MPEG) 2 encoder 28, packetized by PES by a packetized elementary stream (PES) processor 29, and supplied to a multiplexer 30.
Also, the content processor 25 includes an input terminal 31 to which an audio signal forming the program content is supplied. This audio signal
supplied to the input terminal 31 is encoded by an audio encoder 32, packetized by PES by a PES processor 33, and supplied to the multiplexer 30.
In addition, the content processor 25 includes an input terminal 34 to which subtitle/superimposed text information forming the program content is supplied. This subtitle/superimposed text information supplied to the input terminal 34 is packetized by PES by a PES processor 35, and supplied to the multiplexer 30.
Furthermore, the content processor 25 includes an input terminal 36 to which program related information forming the program content is supplied. This program related information supplied to the input terminal 36 is supplied to a program specific information
(PSI)/service information (SI) generator 37, and used to generate PSI/SI. That is, the PSI/SI generator 37 mainly generates PSI, which serves as multiplexed transmission control information for each program and associates the video signal, audio signal, and subtitle/superimposed text information supplied to the input terminals 26, 31, and 34 with each other so that, at the time of reception and reproduction, the constituent signals of a selected program can be demultiplexed and synchronously reproduced at proper timings, and SI, which is information related to the transmission channel and program.
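As a rough picture of what the PSI/SI generator 37 produces for the first channel, the following sketch models a PMT as a list of components, each carrying its own descriptor loop. The stream_type and PID values are illustrative assumptions, not values taken from this description.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Descriptor:
    tag: int        # descriptor_tag
    payload: bytes  # descriptor body

@dataclass
class PmtComponent:
    stream_type: int                 # e.g. 0x02 for MPEG-2 video (assumed)
    elementary_pid: int              # PID carrying this elementary stream (assumed)
    descriptors: List[Descriptor] = field(default_factory=list)

@dataclass
class Pmt:
    program_number: int
    pcr_pid: int
    components: List[PmtComponent] = field(default_factory=list)

# Program carried on the first channel: 2K1K video, audio, and subtitle/superimposed text.
pmt_first_channel = Pmt(
    program_number=0x0001,
    pcr_pid=0x0100,
    components=[
        PmtComponent(stream_type=0x02, elementary_pid=0x0100),  # 2K1K video
        PmtComponent(stream_type=0x0F, elementary_pid=0x0110),  # audio
        PmtComponent(stream_type=0x06, elementary_pid=0x0120),  # subtitle/superimposed text
    ],
)
# The control information newly defined in the PMT (see FIG. 4 below) is appended
# to the video component's descriptor loop.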
Although details will be described later, the PSI/SI generator 37 newly defines and describes control information in program map table (PMT) information described in the PSI. This control information is used to associate the 2K1K video signal supplied to the input terminal 26 with the difference signal supplied to the input terminal 27.
The PSI/SI generated by the PSI/SI generator 37 is supplied to and sectioned by a sectioning processor 38, and supplied to the multiplexer 30.
The multiplexer 30 performs TS packet multiplexing on the video signal, audio signal, and
subtitle/superimposed text information output from the PES processors 29, 33, and 35, and on the PSI/SI output from the sectioning processor 38, thereby generating a transport stream (TS).
This TS generated by the multiplexer 30 is supplied to a transmission path encoder 39 and
subjected to modulation corresponding to the
transmission path and error correction coding. After that, the TS is extracted from an output terminal 40 and supplied to be broadcast over a predetermined first channel .
Also, the difference signal supplied to the input terminal 27 is supplied to and encoded by a high-efficiency video coding (HEVC) encoder 41, packetized by PES by a PES processor 42, and supplied to a
multiplexer 43.
Furthermore, the program related information supplied to the input terminal 36 is supplied to a
PSI/SI generator 44. The PSI/SI generator 44 newly defines and describes control information in the PMT information in addition to the multiplexed transmission control information of the difference signal. This control information is used to associate the difference signal supplied to the input terminal 27 with the 2K1K video signal supplied to the input terminal 26.
This PSI/SI generated by the PSI/SI generator 44 is supplied to and sectioned by a sectioning processor 45, and supplied to the multiplexer 43.
The multiplexer 43 generates a TS by multiplexing the difference signal output from the PES processor 42 and the PSI/SI output from the sectioning processor 45. This TS generated by the multiplexer 43 is supplied to a transmission path encoder 46 and subjected to an encoding process corresponding to the transmission path. After that, the TS is extracted from an output terminal 47 and supplied to be broadcast over a second channel.
FIG. 4 shows an example of the data structure of the control information newly defined and described in the PMT information in the PSI/SI generator 37. That is, this control information is newly defined by a name "extended_component_descriptor ()", and transmitted as it is described in a second descriptor area "descriptor ()" of the PMT, which corresponds to a video signal to be supplied to the input terminal 26.
Practical examples of the contents are as follows.
First, "complementary_component_flag" is a 1-bit flag, and is attribute information indicating whether a video signal having this component is singly
reproducible. For example, "0" indicates that the signal is meaningfully reproducible as a single
service, and "1" indicates that the signal is
meaningfully reproducible as a service by using another component.
Also, "ref_transport_stream_id" is a 16-bit field, and is reference information indicating an identifier "transport_stream_id" of a TS transmitting a component (difference signal) as a reference target of a 2K1K video signal forming this component.
Furthermore, "ref_program_number" is a 16-bit field, and is reference information indicating
"program_number" of a TS transmitting a component
(difference signal) as a reference target of a 2K1K video signal forming this component.
In addition, "ref_component__tag" is an 8-bit field, and is reference information indicating the tag value of a component (difference signal) as a reference target of a 2K1K video signal forming this component.
That is, when this descriptor is placed in the PMT of the 2K1K video signal generated by the video signal processor 18, the descriptor indicates that a related video signal is separately transmitted. When a
component is transmitted by setting
complementary_component_flag = 0, reference information or the like indicating that this component is singly reproducible (2K1K) and indicating the acquisition location of a component (difference signal) as a reference target of this component when it is
reproduced as 4K2K is newly added and broadcast.
Also, when this descriptor is placed in the PMT of the difference signal generated by the video signal processor 18, the descriptor indicates that related video information is separately transmitted. By transmitting a component by setting complementary_component_flag = 1, reference information or the like indicating that this component is not singly reproducible and indicating the acquisition location of a component (2K1K video signal) as a reference target of this component when it is
reproduced as 4K2K is newly added and broadcast.
Note that the reference information may contain information indicating, for example, a distribution type indicating the form of distribution of a component as a reference target, i.e., indicating a distribution form such as broadcasting, network, storage medium, or file form, a distribution schedule, a uniform resource identifier (URI), and a charge type.
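A byte-level sketch of this descriptor is given below. The descriptor_tag value and the exact bit packing are not specified above, so the sketch assumes a private tag, places the 1-bit "complementary_component_flag" in the most significant bit of one byte with seven reserved bits, and packs the three reference fields big-endian in the order listed.

import struct

EXTENDED_COMPONENT_DESCRIPTOR_TAG = 0x80  # hypothetical private descriptor tag

def build_extended_component_descriptor(
    complementary_component_flag: int,   # 0: singly reproducible, 1: needs another component
    ref_transport_stream_id: int,        # transport_stream_id of the TS carrying the referenced component
    ref_program_number: int,             # program_number within that TS
    ref_component_tag: int,              # component_tag of the referenced component
) -> bytes:
    flags = ((complementary_component_flag & 0x01) << 7) | 0x7F  # 7 reserved bits set to 1
    body = struct.pack(">BHHB", flags,
                       ref_transport_stream_id,
                       ref_program_number,
                       ref_component_tag)
    return bytes([EXTENDED_COMPONENT_DESCRIPTOR_TAG, len(body)]) + body

# Example: the 2K1K base component (flag = 0) pointing at the TS that carries the
# difference signal on the second channel (all values are placeholders).
descriptor = build_extended_component_descriptor(0, 0x7FE1, 0x0002, 0x10)
assert len(descriptor) == 2 + 6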
FIG. 5 is a schematic view showing an example of the signal processing system of the receiving terminal 13. That is, broadcast signals received by an antenna 48 are supplied to a tuner module 50 via an input terminal 49, and a broadcast signal of a desired channel is tuned. This broadcast signal tuned by the tuner module 50 is supplied to a signal processor 51.
Although details will be described later, the signal processor 51 performs a decoding process on the input broadcast signal, thereby restoring a video signal, an audio signal, subtitle/superimposed text information, and program related information. If the video signal is separated into a 2K1K video signal and difference signal, the signal processor 51 also performs a process of restoring an original 4K2K video signal from these signals. Then, the signal processor
51 outputs the restored video signal to a synthesizer 52, and the restored audio signal to an audio processor 53.
The synthesizer 52 superposes an on screen display (OSD) signal on the video signal supplied from the signal processor 51, and outputs the synthetic video signal. This video signal output from the synthesizer
52 is supplied to a video processor 54, and converted into a format displayable on a flat panel video display module 56 having a liquid crystal display panel or the like in the output stage. After that, the video signal is supplied to the video display module 56 via an output terminal 55 and used to display an image.
The audio processor 53 converts the input audio signal into an audio signal having a format
reproducible by a loudspeaker 58 in the output stage. This audio signal output from the audio processor 53 is supplied to the loudspeaker 58 via an output terminal 57 and used to reproduce a sound.
A controller 59 comprehensively controls various operations including the above-mentioned various receiving operations of the receiving terminal 13. The controller 59 incorporates a central processing unit (CPU) 59a. When receiving operation information from an operation module 60 formed in the main body of the receiving terminal 13 or receiving operation
information transmitted from a remote controller 61 and received by a receiver 62, the controller 59 controls each module so as to reflect the operation contents.
In this case, the controller 59 uses a memory module 59b. The memory module 59b mainly includes a read only memory (ROM) storing a control program to be executed by the CPU 59a, a random access memory (RAM) for providing a work area for the CPU 59a, and a nonvolatile memory storing various kinds of setting information, control information, and the like.
A hard disk drive (HDD) 63 is also connected to the controller 59. Based on an operation performed on the operation module 60 or remote controller 61 by the user, the controller 59 can perform control so as to supply the video signal and audio signal obtained from the signal processor 51 to the HDD 63, and cause the HDD 63 to record these signals on a hard disk 63a.
In addition, based on an operation performed on the operation module 60 or remote controller 61 by the user, the controller 59 can perform control so as to cause the HDD 63 to read out a video signal and audio signal from the hard disk 63a and supply the readout signals to the signal processor 51, thereby using these signals in the above-mentioned video display and audio reproduction.
Furthermore, a network interface 64 is connected to the controller 59. The network interface 64 is connected to the LAN router 15 so as to be able to transmit information. Therefore, based on an operation performed on the operation module 60 or remote
controller 61 by the user, the controller 59 can access the server 14 via the LAN router 15, network 16, and gateway 17, and acquire a program content provided by the server 14.
Note that a video signal, audio signal, and the like forming a program content acquired from the server 14 are, of course, used in the above-mentioned video display and audio reproduction, and recorded on and reproduced from the hard disk 63a by the HDD 63.
FIGS. 6 and 7 show examples of signal processing modules in the signal processor 51. FIG. 6 shows an example of a content restoration processor 65 for performing a decoding process on a broadcast signal tuned by the tuner module 50, thereby restoring a video signal, an audio signal, subtitle/superimposed text information, and program related information.
That is, the content restoration processor 65 includes an input terminal 66 for receiving a broadcast signal broadcast by the above-mentioned first channel, i.e., a signal obtained by performing a transmission path encoding process on a TS formed by multiplexing a 2K1K video signal, an audio signal,
subtitle/superimposed text information, and PSI/SI. The content restoration processor 65 also includes an input terminal 67 for receiving a broadcast signal broadcast by the above-mentioned second channel, i.e., a signal obtained by performing a transmission path encoding process on a TS formed by multiplexing a difference signal and PSI/SI. In this case, the tuner module 50 tunes in one of the first and second channels by time-divisionally switching them.
The broadcast signal supplied to the input
terminal 66 is supplied to a transmission path decoder 68 and subjected to demodulation and a transmission path decoding process including error correction decoding. Consequently, the TS decoded from the encoding process corresponding to the transmission path is supplied to a demultiplexer 69. The demultiplexer 69 demultiplexes the input TS into the 2K1K video signal, audio signal, subtitle/superimposed text information, and PSI/SI.
The 2K1K video signal is supplied to and
depacketized by a depacketizing processor 70, decoded by an MPEG2 decoder 71, and extracted from an output terminal 72. The audio signal is supplied to and depacketized by a depacketizing processor 73, decoded by an audio decoder 74, and extracted from an output terminal 75. The subtitle/superimposed text
information is supplied to and depacketized by a depacketizing processor 76, and extracted from an output terminal 77. The PSI/SI is supplied to and desectioned by a section restoration module 78, and extracted from an output terminal 79.
On the other hand, the broadcast signal supplied to the input terminal 67 is supplied to a transmission path decoder 80 and subjected to a transmission path decoding process. Consequently, the TS decoded from the encoding process corresponding to the transmission path is supplied to a demultiplexer 81. The
demultiplexer 81 demultiplexes the input TS into the difference signal and PSI/SI.
The PSI/SI is supplied to and desectioned by a section restoration module 82, and extracted from an output terminal 83. The difference signal is supplied to and depacketized by a depacketizing processor 84, decoded by an HEVC decoder 85, and extracted from an output terminal 86.
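Conceptually, the demultiplexers 69 and 81 route fixed-size 188-byte TS packets to per-stream handlers according to their PID, as in the sketch below. In a real receiver the PID-to-stream mapping is obtained from the PSI; the PID values wired up here are placeholders.

from typing import Callable, Dict, Iterable

def packet_pid(packet: bytes) -> int:
    """Extract the 13-bit PID from a 188-byte transport stream packet header."""
    assert packet[0] == 0x47, "lost sync byte"
    return ((packet[1] & 0x1F) << 8) | packet[2]

def demultiplex(packets: Iterable[bytes], handlers: Dict[int, Callable[[bytes], None]]) -> None:
    for packet in packets:
        handler = handlers.get(packet_pid(packet))
        if handler is not None:
            handler(packet)  # e.g. feed the corresponding depacketizing processor

# Placeholder wiring for the first channel (input terminal 66):
handlers_first_channel = {
    0x0100: lambda p: None,  # 2K1K video  -> depacketizer 70 -> MPEG-2 decoder 71
    0x0110: lambda p: None,  # audio       -> depacketizer 73 -> audio decoder 74
    0x0120: lambda p: None,  # subtitles   -> depacketizer 76
    0x0030: lambda p: None,  # PSI/SI      -> section restoration module 78
}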
FIG. 7 shows an example of a video restoration processor 87 for restoring the original 4K2K video signal from the 2K1K video signal and difference signal restored by the content restoration processor 65. That is, the video restoration processor 87 includes input terminals 88 and 89 to which the 2K1K video signal and difference signal output from the content restoration processor 65 are respectively supplied.
The 2K1K video signal supplied to the input terminal 88 is supplied to a 2K/4K converter 90 and up-converted into a 4K2K video signal. A synthesizer 91 synthesizes this 4K2K video signal output from the 2K/4K converter 90 with the difference signal supplied to the input terminal 89, thereby restoring the original 4K2K video signal. This 4K2K video signal is extracted from an output terminal 92.
Note that the synthesizer 91 must synthesize the 4K2K video signal up-converted by the 2K/4K converter 90 and the difference signal output from the HEVC decoder 85 in a synchronized state by using, for example, a presentation time stamp (PTS).
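One way to realize that synchronization is sketched below: decoded frames arriving from the two paths are buffered by PTS, and the synthesizer output is produced only when a 2K1K base frame and a difference frame carrying the same PTS are both available. This is a minimal model rather than the patent's implementation; frames are NumPy arrays and the up-conversion is the same pixel-repetition stand-in used earlier.

import numpy as np

def up_convert_2k_to_4k(frame_2k: np.ndarray) -> np.ndarray:
    # Stand-in for the 2K/4K converter 90 (pixel repetition).
    return frame_2k.repeat(2, axis=0).repeat(2, axis=1)

class VideoRestorationProcessor:
    """Sketch of the video restoration processor 87 with PTS matching."""

    def __init__(self):
        self._base = {}  # PTS -> 2K1K frame from the MPEG-2 decoder 71
        self._diff = {}  # PTS -> difference frame from the HEVC decoder 85

    def push_base(self, pts: int, frame_2k: np.ndarray):
        self._base[pts] = frame_2k
        return self._try_synthesize(pts)

    def push_difference(self, pts: int, diff_4k: np.ndarray):
        self._diff[pts] = diff_4k
        return self._try_synthesize(pts)

    def _try_synthesize(self, pts: int):
        if pts in self._base and pts in self._diff:
            base = self._base.pop(pts)
            diff = self._diff.pop(pts)
            # Synthesizer 91: up-converted base + difference = original 4K2K frame.
            return up_convert_2k_to_4k(base) + diff
        return None  # wait until the matching frame arrives on the other path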
In the above-mentioned embodiment, the
broadcasting station 12 does not broadcast a 4K2K video signal by simply encoding it, but encodes and
broadcasts a 2K1K video signal obtained by down-converting an original 4K2K video signal, and also encodes and broadcasts a difference signal between the original video signal and a 4K2K video signal obtained by up-converting the 2K1K video signal. This makes it possible to increase the transmission efficiency when distributing 4K2K video signals, i.e., ultra-high-definition video signals.
Also, for the 2K1K video signal and difference signal to be broadcast, attribute information
indicating whether the component is singly reproducible or is reproducible only by using another component, reference information indicating the acquisition location of a component as a reference target of the component when it is reproduced, and the like are newly defined and described in the PMT information contained in the PSI/SI.
Accordingly, the receiving terminal 13 can determine whether it is necessary to use another component in order to reproduce a received broadcast signal, by checking the attribute information of the broadcast signal. This makes it possible to avoid the inconvenience that a reproduction process is performed on only the difference signal, and transmit the video signal with higher reliability.
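That receiver-side check can be pictured as follows. The descriptor fields mirror FIG. 4; how the referenced component is actually acquired (tuning to the other channel, IP access, or file download) is left abstract in this sketch.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtendedComponentDescriptor:
    complementary_component_flag: int   # 0: singly reproducible, 1: not singly reproducible
    ref_transport_stream_id: int
    ref_program_number: int
    ref_component_tag: int

def plan_reproduction(desc: Optional[ExtendedComponentDescriptor]) -> str:
    if desc is None:
        return "reproduce this component alone"  # no extension signaled
    if desc.complementary_component_flag == 0:
        # Singly reproducible 2K1K base; the reference tells where the
        # difference signal can be fetched for 4K2K reproduction.
        return (f"reproduce now; optionally fetch difference from "
                f"TS {desc.ref_transport_stream_id:#06x}, "
                f"program {desc.ref_program_number}, "
                f"component_tag {desc.ref_component_tag:#04x}")
    # A difference signal alone is meaningless: acquire the referenced base component first.
    return (f"do not reproduce alone; acquire base component from "
            f"TS {desc.ref_transport_stream_id:#06x} first")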
Furthermore, the receiving terminal 13 can
appropriately perform a process of acquiring another component as a reference target, by checking the reference information of a received broadcast signal. For example, if it is found by the reference
information that one of a 2K1K video signal and
difference signal is distributed in the form of a file, a file acquisition process can be given priority. In addition, when another signal is distributed as a stream before the acquisition of the file is completed, this signal can be stored in the HDD 63.
In the aforementioned embodiment, the control information such as the attribute information and reference information concerning the 2K1K video signal and difference signal is newly defined and described in the PMT information described in the PSI of the PSI/SI. However, transmission information like this can also be newly defined and described in event information table (EIT) information described in the SI of the PSI/SI.
FIG. 8 shows an example of the data structure of control information newly defined and described in the
EIT information contained in the SI of the above-mentioned PSI/SI. That is, this control information can newly be defined by a name
"extended_service_descriptor ()", and transmitted as it is described in a descriptor area "descriptor ()" of the EIT.
Practical examples of the content are as follows. First, "extended_service_type" is an 8-bit field, and indicates the type of extended service as shown in an example of FIG. 9. Also, "return_to_brodcast_flag" is a 1-bit flag, and indicates the necessity of
transition to broadcasting after communication content reproduction is completed. For example, "0" indicates that the return is unnecessary, and "1" indicates that the return to a broadcasting service before the
transition to the communication content is necessary.
Furthermore, "delivery_type" is a 1-bit flag, and indicates the distribution method of a distribution content. For example "0" . indicates streaming, and "1" indicates download. "contact-flag" is a 1-bit flag, and indicates the necessity of a contract when
receiving and reproducing a distribution content. For example, "0" indicates that the contract is
unnecessary, and "1" indicates that the contract is necessary. "start_time_offset_polarity" indicates the polarity of the offset time from the start of
broadcasting of the next program to the start of reproduction of the communication content. For
example, "0" is to advance the start of reproduction by the offset time, and "1" is to delay the start of reproduction by the offset time.
Also, "start_time_offset" is a 16-bit field, and designates, within the range of -12 hrs to +12 hrs, the offset time from the start of broadcasting of the program content to the start of reproduction of the communication content. The place of 10 hrs, the place of 1 hr, the place of 10 min, and the place of 1 min of the offset time are encoded by a 4-bit binary coded decimal (BCD) .
In addition, "expiration_date" is a 40-bit field, and indicates the period of reproduction of the
communication content. In this field, the lower 16 bits of the MJD (Modified Julian Date) are encoded in 16 bits, and the
subsequent 24 bits are encoded as six 4-bit binary coded decimal (BCD) digits. If no period of reproduction is defined, all the bits in this field are set to "1".
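A corresponding sketch for unpacking the 40-bit expiration_date field is shown below; the interpretation of the six BCD digits as hours, minutes, and seconds is an assumption, since the specification only states their encoding, and the all-ones check follows the rule stated above.

    def decode_expiration_date(field40):
        """Unpack the 40-bit expiration_date into (mjd_low16, hh, mm, ss).

        Returns None if the field is all '1' bits, meaning that no period
        of reproduction is defined.
        """
        if field40 == (1 << 40) - 1:
            return None
        mjd_low16 = (field40 >> 24) & 0xFFFF
        bcd = field40 & 0xFFFFFF
        digits = [(bcd >> shift) & 0xF for shift in (20, 16, 12, 8, 4, 0)]
        hh = digits[0] * 10 + digits[1]          # assumed: hours
        mm = digits[2] * 10 + digits[3]          # assumed: minutes
        ss = digits[4] * 10 + digits[5]          # assumed: seconds
        return mjd_low16, hh, mm, ss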
Next, practical examples of the format information are as follows.
First, "extended video resolution format flag" is a 1-bit flag, and indicates information indicating whether the distribution content has a video resolution extended from the present broadcasting specifications. For example, "0" indicates no extension (compliance with the present broadcasting specifications), and "1" indicates extension.
"extended_video_resolution_format" is an 8-bit field, and indicates information of a video resolution
extended from the present broadcasting specifications as shown in an example of FIG. 10.
Also, "3D_video_format_flag" is a 1-bit flag, and indicates whether the distribution content contains 3D video. For example, "0" indicates that no 3D
(dimension) video is contained, and "1" indicates that 3D video is contained. "3D_video_format" is an 8-bit field, and indicates 3D video format identification information as shown in an example of FIG. 11.
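As an illustration of how a receiver might evaluate these format fields, the following sketch checks them against sets of supported formats; the numeric codes and the supported sets are placeholders, since the actual code assignments are those shown in FIGS. 10 and 11, which are not reproduced here.

    # Hypothetical capability check against the format information fields.
    SUPPORTED_RESOLUTIONS = {0x01}    # assumed code for 4K2K, for illustration only
    SUPPORTED_3D_FORMATS = {0x01}     # assumed code for one 3D format, for illustration only

    def content_displayable(info):
        """Return True if the receiver supports the signaled resolution and 3D format."""
        if info.get("extended_video_resolution_format_flag") == 1:
            if info.get("extended_video_resolution_format") not in SUPPORTED_RESOLUTIONS:
                return False
        if info.get("3D_video_format_flag") == 1:
            if info.get("3D_video_format") not in SUPPORTED_3D_FORMATS:
                return False
        return True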
FIGS. 12 and 13 are flowcharts showing an example of a main processing operation performed by the
controller 59 of the receiving terminal 13 when newly defined control information having a data structure as shown in FIG. 8 is described in the EIT information contained in the SI of the aforementioned PSI/SI.
That is, when the process is started (step S11), the controller 59 refers to "expiration_date" in the
EIT information in step S12, and determines in step S13 whether the content is within the period of reproduction. If it is determined that the content is within the period of reproduction (YES), the controller 59 refers to "3D_video_format" in the EIT information in step S14, and determines in step S15 whether the content corresponds to 3D video.
If it is determined that the content corresponds to 3D video (YES), the controller 59 refers to
"extended_video_resolution_format" in the EIT
information in step S16, and determines in step S17 whether the content corresponds to 4K2K. If it is determined that the content corresponds to 4K2K (YES), the controller 59 refers to "contract_flag" in the EIT information in step S18.
After step S18, or if it is determined in step S13 that the content is not within the period of
reproduction (NO), or if it is determined in step S15 that the content does not correspond to 3D video (NO), or if it is determined in step S17 that the content does not correspond to 4K2K (NO), the controller 59 determines in step S19 whether the processing of the entries is completed. If it is determined that it is not completed (NO), the process advances to the next entry in step S20, and returns to step S12.
If it is determined in step S19 that the processing of the entries is completed (YES), the controller 59 determines in step S21 whether there is a target content. If it is determined that there is no target content (NO), the process advances to step S22, and the controller 59 causes the video display module 56 to display a message indicating that there is no target content
distribution, and terminates the process (step S23).
On the other hand, if it is determined in step S21 that there is a target content (YES), the controller 59 causes the video display module 56 to display a target content selection screen in step S24, and determines in step S25 whether content is selected.
If it is determined that content is selected
(YES), the controller 59 refers to "start_time_offset" and "start_time_offset_polarity" in the EIT information in step S26, performs time measurement in step S27, and determines in step S28 whether the offset time has elapsed.
If it is determined that the offset time has elapsed (YES), the controller 59 refers to the URI of the target content described in the EIT information in step S29, refers to "delivery_type" in the EIT
information in step S30, and accesses, acquires, and reproduces the selected target content in step S31.
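The overall flow of steps S12 to S31 can be summarized by the following Python sketch; the per-entry fields are assumed to have been extracted from the EIT descriptors beforehand, and the display, choose, and play callbacks are hypothetical stand-ins for the video display module 56 and the content acquisition path of the receiving terminal 13.

    import time

    # Condensed sketch of FIGS. 12 and 13 (steps S12-S31).  Each EIT entry is
    # assumed to have already been reduced to the simple fields used below.
    def select_and_play(eit_entries, display, choose, play):
        targets = []
        for entry in eit_entries:                     # S12-S20: scan every entry
            if not entry["within_period"]:            # S13: expiration_date check
                continue
            if not entry["is_3d"]:                    # S15: 3D_video_format check
                continue
            if not entry["is_4k2k"]:                  # S17: resolution format check
                continue
            entry["needs_contract"] = entry["contract_flag"] == 1   # S18
            targets.append(entry)

        if not targets:                               # S21-S23: nothing to offer
            display("There is no target content distribution.")
            return

        chosen = choose(targets)                      # S24-S25: selection screen
        time.sleep(chosen["offset_seconds"])          # S26-S28: wait for the offset time
        play(chosen["uri"], chosen["delivery_type"])  # S29-S31: acquire and reproduce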
In the above-mentioned embodiment, the
differential transmission of a 4K2K video signal has been described. However, the original video signal is not limited to the 4K2K video format. For example, the embodiment is applicable to a 6K3K or 8K4K video signal. Also, the video encoding method to be applied is not limited to MPEG2 or HEVC.
Furthermore, the embodiment is applicable not only to differential transmission but also to general divisional transmission. For example, the above embodiment is applicable to a method of displaying a 3D (dimension) image by using two video signals, i.e., a right-eye video signal and left-eye video signal.
As a first 3D system of this method, a first video signal is used as a left-eye video signal, a second video signal is used as a right-eye video signal, one video signal is displayed when displaying a 2D image, and both the video signals are displayed when
displaying a 3D image. As a second 3D system of the method, a first video signal is used as a left-eye video signal, a second video signal is used as a difference signal for generating a right-eye video signal from the first video signal, the first video signal is displayed when displaying a 2D image, and both the video signals are displayed when displaying a 3D image.
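The difference between the two systems can be illustrated with the following sketch, in which frames are represented as lists of pixel rows; the additive reconstruction used for the second system is an assumption that mirrors the differential scheme described earlier.

    # Sketch of the two 3D systems built from two transmitted video signals.

    def reconstruct_first_system(first_signal, second_signal, want_3d):
        """System 1: the first signal is the left eye, the second the right eye."""
        if not want_3d:
            return first_signal, None            # 2D: display one signal alone
        return first_signal, second_signal       # 3D: display both signals

    def reconstruct_second_system(first_signal, difference_signal, want_3d):
        """System 2: the right-eye view is rebuilt as left + difference."""
        if not want_3d:
            return first_signal, None            # 2D: display the first signal
        right = [[l + d for l, d in zip(lrow, drow)]
                 for lrow, drow in zip(first_signal, difference_signal)]
        return first_signal, right               # 3D: display both views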
The various modules of the systems described herein can be implemented as software applications, hardware and/or software modules, or components on one or more computers, such as servers. While the various modules are illustrated separately, they may share some or all of the same underlying logic or code.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their
equivalents are intended to cover such forms or
modifications as would fall within the scope and spirit of the inventions.

Claims

1. A video signal transmitting method comprising:
adding, by a control information addition module, to a first video signal and a second video signal, control information for associating the first video signal and the second video signal with each other; and
transmitting, by a transmitter, the first video signal and the second video signal to which the control information is added.
2. The method of Claim 1, wherein
the first video signal is a signal obtained by down-converting an original video signal, and
the second video signal is a difference signal between the original video signal and a signal obtained by up-converting the first video signal.
3. The method of Claim 1, wherein the control information addition module is configured to add, to the first video signal and the second video signal, attribute information indicating whether each video signal is singly reproducible.
4. The method of Claim 1, wherein the control information addition module is configured to add, to the first video signal and the second video signal, reference information indicating information as a reference target of each video signal when it is reproduced.
5. The method of Claim 1, wherein the control information addition module is configured to describe the control information in one of PMT information in PSI multiplexed in each of the first video signal and the second video signal, and EIT information in PSI/SI multiplexed in each of the first video signal and the second video signal.
6. A video signal receiving apparatus comprising:
an input module configured to receive a first video signal and a second video signal; and
a controller configured to associate the first video signal and the second video signal with each other, based on control information added to the first video signal and the second video signal input to said input module, and used to associate the first video signal and the second video signal with each other.
7. The apparatus of Claim 6, wherein
the first video signal is a signal obtained by down-converting an original video signal, and
the second video signal is a difference signal between the original video signal and a signal obtained by up-converting the first video signal.
8. The apparatus of Claim 6, wherein the control information is attribute information indicating whether each of the first video signal and the second video signal is singly reproducible.
9. The apparatus of Claim 6, wherein the control information is reference information indicating information as a reference target of each of the first video signal and the second video signal when it is reproduced.
10. The apparatus of Claim 6, wherein the control information is described in one of PMT information in PSI multiplexed in each of the first video signal and the second video signal, and EIT information in PSI/SI multiplexed in each of the first video signal and the second video signal.
11. A video signal receiving method comprising:
receiving a first video signal and a second video signal by an input module; and
associating, by a controller, the first video signal and the second video signal with each other, based on control information added to the first video signal and the second video signal input to said input module, and used to associate the first video signal and the second video signal with each other.
PCT/JP2013/057571 2012-09-18 2013-03-12 Video signal transmitting method, video signal receiving apparatus, and video signal receiving method WO2014045614A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012205016A JP2014060625A (en) 2012-09-18 2012-09-18 Video signal transmission method, video signal receiver, and video signal reception method
JP2012-205016 2012-09-18

Publications (1)

Publication Number Publication Date
WO2014045614A1 true WO2014045614A1 (en) 2014-03-27

Family

ID=50340947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/057571 WO2014045614A1 (en) 2012-09-18 2013-03-12 Video signal transmitting method, video signal receiving apparatus, and video signal receiving method

Country Status (2)

Country Link
JP (1) JP2014060625A (en)
WO (1) WO2014045614A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6279463B2 (en) * 2014-12-26 2018-02-14 株式会社東芝 Content transmission device, content reception device, content transmission method, and content reception method


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10136017A (en) * 1996-10-30 1998-05-22 Matsushita Electric Ind Co Ltd Data transfer system
EP0884904A1 (en) * 1996-12-06 1998-12-16 Matsushita Electric Industrial Co., Ltd. Method and apparatus for transmitting, encoding and decoding video signal and recording/reproducing method of optical disc

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"OPERATIONAL GUIDELINES FOR DIGITAL SATELLITE BROADCASTING ARIB TECHNICAL REPORT ARIB TR-B15, Version 5.6", ASSOCIATION OF RADIO INDUSTRIES AND BUSINESSES, vol. 1, no. 4, 14 February 2012 (2012-02-14), pages 41, 189, 190, 395 - 397 *

Also Published As

Publication number Publication date
JP2014060625A (en) 2014-04-03


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13838626

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13838626

Country of ref document: EP

Kind code of ref document: A1