US20170188099A1 - Broadcast transmitting apparatus, method of operating broadcast transmitting apparatus, broadcast receiving apparatus and method of operating broadcast receiving apparatus

Info

Publication number
US20170188099A1
Authority
US
United States
Prior art keywords
component
broadcast
service
fragment
receiving apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/306,975
Inventor
Minsung Kwak
Seungryul Yang
Kyoungsoo Moon
Woosuk Ko
Sungryong Hong
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Priority to US15/306,975 priority Critical patent/US20170188099A1/en
Publication of US20170188099A1 publication Critical patent/US20170188099A1/en
Abandoned legal-status Critical Current

Classifications

    • H04N 21/2362 - Generation or processing of Service Information [SI]
    • H04N 19/30 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • H04N 21/233 - Processing of audio elementary streams
    • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating MPEG-4 scene graphs
    • H04N 21/235 - Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N 21/2355 - Processing of additional data involving reformatting operations of additional data, e.g. HTML pages
    • H04N 21/23892 - Multiplex stream processing involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level
    • H04N 21/258 - Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, or processing of multiple end-users' preferences to derive collaborative data
    • H04N 21/26283 - Content or additional data distribution scheduling for associating distribution time parameters to content, e.g. to generate electronic program guide data
    • H04N 21/41 - Structure of client; structure of client peripherals
    • H04N 21/4335 - Housekeeping operations, e.g. prioritizing content for deletion because of storage space restrictions
    • H04N 21/4345 - Extraction or processing of SI, e.g. extracting service information from an MPEG stream
    • H04N 21/439 - Processing of audio elementary streams
    • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to MPEG-4 scene graphs
    • H04N 21/45 - Management operations performed by the client for facilitating the reception of, or the interaction with, the content, or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies or resolving scheduling conflicts
    • H04N 21/462 - Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a head-end, or controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N 21/472 - End-user interface for requesting content, additional data or services; end-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification or for manipulating displayed content
    • H04N 21/47211 - End-user interface for requesting pay-per-view content
    • H04N 21/4821 - End-user interface for program selection using a grid, e.g. sorted out by channel and broadcast time
    • H04N 21/4884 - Data services, e.g. a news ticker, for displaying subtitles
    • H04N 21/816 - Monomedia components involving special video data, e.g. 3D video

Definitions

  • the present invention relates to a broadcast transmitting apparatus, a method of operating the broadcast transmitting apparatus, a broadcast receiving apparatus and a method of operating the broadcast receiving apparatus.
  • hybrid broadcast, which uses broadband networks as well as broadcast networks, is in the spotlight. Furthermore, such hybrid broadcast provides applications or broadcast services interoperating with terminal devices such as smartphones and tablets.
  • MPEG-2: Motion Picture Experts Group-2
  • TS: transport stream
  • MPEG-2 TS does not take hybrid broadcast into account and thus its use for hybrid broadcast is limited.
  • MPEG-2 TS also provides little extensibility, so transmitting data used in a broadband network through MPEG-2 TS is inefficient.
  • IP packets need to be encapsulated into MPEG-2 TS.
  • a broadcast receiving apparatus needs to process both MPEG-2 TS packets and IP packets in order to support hybrid broadcast. Therefore, there is a need for a new broadcast transmission format having extensibility and efficiency for hybrid broadcast.
  • an electronic program guide displays broadcast services and programs provided by the broadcast services.
  • the ESG may be referred to as an electronic program guide.
  • the ESG displays the start time, end time, title, summary of contents, recommended rating, genre, appearance information and so on of a program.
  • broadcast services can be provided along with applications.
  • hybrid broadcast services include various media components and a broadcast receiving apparatus can selectively reproduce media components.
  • the broadcast receiving apparatus can not only provide programs according to a predetermined schedule but also provide programs at the request of a user. Accordingly, the broadcast receiving apparatus needs to provide an ESG capable of effectively delivering the content of hybrid broadcast services, which provide complex and varied content.
  • a broadcast transmitting apparatus needs to transmit ESG data in a new format.
  • the broadcast receiving apparatus needs to receive and display ESG data in a new format.
  • An object of the present invention is to provide a broadcast transmitting apparatus for transmitting ESG data for hybrid broadcast, a method of operating the broadcast transmitting apparatus, a broadcast receiving apparatus for receiving ESG data for hybrid broadcast and a method of operating the broadcast receiving apparatus.
  • a broadcast receiving apparatus includes: a broadcast receiving unit configured to receive a broadcast signal; and a controller configured to receive electronic service guide (ESG) data including information about a broadcast service guide based on the broadcast signal and to acquire information about a component included in at least one of a broadcast service and broadcast content based on the ESG data.
  • ESG: electronic service guide
  • the information about the component may include device capability information indicating a device capability required to present the component.
  • the controller may display the information about the component based on the device capability information.
  • the controller may display information about components that the broadcast receiving apparatus can present separately from information about components that it cannot present; a sketch of such capability-based filtering appears after this list of receiver features.
  • the information about the component may include reference information indicating inclusion relationships between the component and other components, between the component and the broadcast content and between the component and the broadcast service.
  • the information about the component may include association information indicating an associated component.
  • the associated component may represent a component presented along with the component.
  • the ESG data may be divided into fragments corresponding to information units, including a service fragment containing information about the broadcast service and a content fragment containing information about content included in the broadcast service.
  • the information about the content may be a content fragment included in the ESG data.
  • the information about the component may be a component element included in the content fragment.
  • the information about the component may include charging information about the component.
  • the controller may display the information about the component in a service guide menu.
  • the controller may display the role of the component in the service guide menu.
  • the controller may display the charging information about the component in the service guide menu.
  • the controller may display the information about the component based on types of data included in the component.
  • the controller may display the information about the component including data of a type selected by a user.
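  • As a concrete illustration of the capability-based display described above, the following Python sketch parses a hypothetical content fragment and separates components the receiver can present from those it cannot. The element and attribute names (Content, Component, DeviceCapability, role), the capability strings and the content URI are illustrative assumptions for this sketch, not the schema actually defined by this disclosure.

      import xml.etree.ElementTree as ET

      # Hypothetical ESG content fragment carrying per-component capability hints.
      ESG_CONTENT_FRAGMENT = """
      <Content id="bcast://example.com/Content/1">
        <Component role="Base video" type="video">
          <DeviceCapability>SD decode</DeviceCapability>
        </Component>
        <Component role="UHD enhancement" type="video">
          <DeviceCapability>4K decode</DeviceCapability>
        </Component>
        <Component role="Music" type="audio">
          <DeviceCapability>Stereo audio</DeviceCapability>
        </Component>
      </Content>
      """

      RECEIVER_CAPABILITIES = {"SD decode", "Stereo audio"}  # what this device can present

      def split_components(fragment_xml, capabilities):
          """Return (presentable, unpresentable) component role lists for the guide UI."""
          root = ET.fromstring(fragment_xml)
          presentable, unpresentable = [], []
          for comp in root.findall("Component"):
              required = {c.text for c in comp.findall("DeviceCapability")}
              target = presentable if required <= capabilities else unpresentable
              target.append(comp.get("role"))
          return presentable, unpresentable

      ok, not_ok = split_components(ESG_CONTENT_FRAGMENT, RECEIVER_CAPABILITIES)
      print("presentable:", ok)        # ['Base video', 'Music']
      print("unpresentable:", not_ok)  # ['UHD enhancement']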
  • a method of operating a broadcast receiving apparatus includes: receiving a broadcast signal; receiving ESG data including information about a broadcast service guide based on the broadcast signal; and acquiring information about a component included in at least one of a broadcast service and broadcast content based on the ESG data.
  • a broadcast transmitting apparatus includes: a controller configured to acquire information about a component included in at least one of a broadcast service and broadcast content and to generate ESG data including information about a broadcast service guide based on the information about the component; and a transmitting unit configured to transmit a broadcast signal based on the ESG data.
  • FIG. 1 illustrates a structure of an apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention.
  • FIG. 2 illustrates an input formatting block according to one embodiment of the present invention.
  • FIG. 3 illustrates an input formatting block according to another embodiment of the present invention.
  • FIG. 4 illustrates a BICM block according to an embodiment of the present invention.
  • FIG. 5 illustrates a BICM block according to another embodiment of the present invention.
  • FIG. 6 illustrates a frame building block according to one embodiment of the present invention.
  • FIG. 7 illustrates an orthogonal frequency division multiplexing (OFDM) generation block according to an embodiment of the present invention.
  • FIG. 8 illustrates a structure of an apparatus for receiving broadcast signals for future broadcast services according to an embodiment of the present invention.
  • FIG. 9 illustrates a frame structure according to an embodiment of the present invention.
  • FIG. 10 illustrates a signaling hierarchy structure of the frame according to an embodiment of the present invention.
  • FIG. 11 illustrates preamble signaling data according to an embodiment of the present invention.
  • FIG. 12 illustrates PLS1 data according to an embodiment of the present invention.
  • FIG. 13 illustrates PLS2 data according to an embodiment of the present invention.
  • FIG. 14 illustrates PLS2 data according to another embodiment of the present invention.
  • FIG. 15 illustrates a logical structure of a frame according to an embodiment of the present invention.
  • FIG. 16 illustrates PLS mapping according to an embodiment of the present invention.
  • FIG. 17 illustrates EAC mapping according to an embodiment of the present invention.
  • FIG. 18 illustrates FIC mapping according to an embodiment of the present invention.
  • FIG. 19 illustrates a type of DP according to an embodiment of the present invention.
  • FIG. 20 illustrates a time interleaving according to an embodiment of the present invention.
  • FIG. 21 illustrates the basic operation of a twisted row-column block interleaver according to an embodiment of the present invention.
  • FIG. 22 illustrates an operation of a twisted row-column block interleaver according to another embodiment of the present invention.
  • FIG. 23 illustrates a diagonal-wise reading pattern of a twisted row-column block interleaver according to an embodiment of the present invention.
  • FIG. 24 illustrates interleaved XFECBLOCKs from each interleaving array according to an embodiment of the present invention.
  • FIG. 25 illustrates a protocol stack for broadcast service provision according to an embodiment of the present invention.
  • FIG. 26 is a block diagram of a broadcast transmitting apparatus for transmitting broadcast services, a content server for transmitting content associated with broadcast services, a broadcast receiving apparatus for receiving broadcast services and a companion apparatus interoperating with the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 27 illustrates values of serviceType element included in a service fragment and service types indicated by the values according to an embodiment of the present invention.
  • FIG. 28 illustrates an XML format of the serviceType element included in the service fragment according to an embodiment of the present invention.
  • FIG. 29 illustrates XML data indicating the serviceType element and a user interface when the service indicated by the service fragment is a linear service according to an embodiment of the present invention.
  • FIG. 30 illustrates XML data indicating the serviceType element and a user interface when the service indicated by the service fragment is a linear application-based service according to an embodiment of the present invention.
  • FIG. 31 illustrates XML data indicating the serviceType element and a user interface when the service indicated by the service fragment is a linear companion screen service according to an embodiment of the present invention.
  • FIG. 32 illustrates an XML format of a component fragment according to an embodiment of the present invention.
  • FIG. 33 illustrates component types that can be indicated by the component fragment according to an embodiment of the present invention.
  • FIG. 34 illustrates an XML format of a ComponentRangeType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 35 illustrates an XML format of a ComponentData element included in the component fragment according to an embodiment of the present invention.
  • FIG. 36 illustrates an XML format of a VideoDataType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 37 illustrates an XML format of an AudioDataType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 38 illustrates an XML format of a CCDataType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 39 illustrates an embodiment in which the component fragment according to an embodiment of the present invention represents a composite video component.
  • FIG. 40 illustrates another embodiment in which the component fragment according to an embodiment of the present invention represents a composite video component.
  • FIG. 41 illustrates another embodiment in which the component fragment according to an embodiment of the present invention represents a PickOne audio component.
  • FIG. 42 illustrates an XML format of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • FIG. 43 illustrates an XML format of an embodiment of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • FIG. 44 illustrates an XML format of an embodiment of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • FIG. 45 illustrates an XML format of component fragments when a component fragment representing an audio component and a component fragment representing a closed captioning component refer to a fragment representing an associated video component according to an embodiment of the present invention.
  • FIG. 46 illustrates a relationship among component fragments when a component fragment representing an audio component and a component fragment representing a closed captioning component refer to a fragment representing an associated video component according to an embodiment of the present invention.
  • FIG. 47 illustrates an XML format of component fragments when a component fragment representing a video component refers to a component fragment representing an audio component and a component fragment representing an associated closed captioning component according to an embodiment of the present invention.
  • FIG. 48 illustrates a relationship among component fragments when a component fragment representing a video component refers to a component fragment representing an audio component and a component fragment representing an associated closed captioning component according to an embodiment of the present invention.
  • FIG. 49 illustrates a reference relationship among fragments according to an embodiment of the present invention.
  • FIG. 50 illustrates an XML format of a component fragment when the component fragment refers to a higher component fragment, a content fragment and a service fragment.
  • FIG. 51 illustrates an XML format of a schedule fragment when the schedule fragment refers to a component fragment, a content fragment and a service fragment according to an embodiment of the present invention.
  • FIG. 52 illustrates a reference relationship among a service fragment, a content fragment and a component fragment representing a presentable video component, a presentable audio component and a presentable closed captioning component according to an embodiment of the present invention.
  • FIG. 53 illustrates a reference relationship between a component fragment representing a composite component and a component fragment representing a lower component according to an embodiment of the present invention.
  • FIG. 54 illustrates a reference relationship between a component fragment representing an App-based enhancement component and a component fragment representing a lower component according to an embodiment of the present invention.
  • FIG. 55 illustrates an XML format of a content fragment when the content fragment refers to a service according to another embodiment of the present invention.
  • FIG. 56 illustrates a reference relationship among content fragments and a service fragment according to another embodiment of the present invention.
  • FIG. 57 illustrates a reference relationship among fragments according to another embodiment of the present invention.
  • FIG. 58 illustrates an XML format of a service fragment according to another embodiment of the present invention.
  • FIG. 61 illustrates a reference relationship among a service fragment, a content fragment and a component fragment according to another embodiment of the present invention.
  • FIG. 62 illustrates a reference relationship between a component fragment representing a composite component and a lower component according to another embodiment of the present invention.
  • FIG. 63 illustrates a reference relationship between a component fragment representing an App-based enhancement component and a component fragment representing a lower component according to another embodiment of the present invention.
  • FIG. 64 illustrates a syntax of a component fragment according to another embodiment of the present invention.
  • FIG. 65 illustrates a syntax of a component fragment according to another embodiment of the present invention.
  • FIG. 66 illustrates a syntax of a component fragment according to another embodiment of the present invention.
  • FIG. 67 illustrates an XML format of a component fragment according to another embodiment of the present invention.
  • FIG. 68 illustrates an XML format of ComponentRangeType included in a component fragment according to another embodiment of the present invention.
  • FIG. 69 illustrates an XML format of ComponentRoleRangeType included in the component fragment according to another embodiment of the present invention.
  • FIG. 70 illustrates a relationship between the component fragment according to another embodiment of the present invention and a composite video component using scalable video encoding and components included in the composite video component.
  • FIG. 72 illustrates a relationship between the component fragment according to another embodiment of the present invention and a PickOne audio component and components included in the PickOne audio component.
  • FIG. 73 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 74 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 75 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 77 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 78 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 79 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 80 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 81 illustrates display of a component included in content by a broadcast receiving apparatus according to a capability element included in a component element according to another embodiment of the present invention.
  • FIG. 82 illustrates display of a component included in content by the broadcast receiving apparatus according to the capability element included in the component element according to another embodiment of the present invention.
  • FIG. 83 illustrates display of a component included in content by the broadcast receiving apparatus according to the capability element included in the component element according to another embodiment of the present invention.
  • FIG. 84 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 85 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 86 illustrates an XML format of the component element according to another embodiment of the present invention.
  • FIG. 87 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 88 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 89 illustrates an XML format of the component element according to another embodiment of the present invention.
  • FIG. 90 illustrates a syntax of a capability element according to another embodiment of the present invention.
  • FIG. 92 illustrates values of a category element included in the capability element according to another embodiment of the present invention.
  • FIG. 93 illustrates a user interface providing payment per component according to an embodiment of the present invention.
  • FIG. 94 illustrates a user interface providing payment per component according to an embodiment of the present invention.
  • FIG. 95 illustrates an operation of a broadcast transmitting apparatus according to an embodiment of the present invention.
  • FIG. 96 illustrates an operation of a broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 97 illustrates a content presentation screen of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 98 illustrates a service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 100 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 101 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 102 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 103 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 104 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 105 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • the present invention provides a broadcast signal transmitting/receiving device and method.
  • the future broadcast services include a terrestrial broadcast service, a mobile broadcast service, and a UHDTV service.
  • the present invention may process broadcast signals for the future broadcast services through non-MIMO (Multiple Input Multiple Output) or MIMO according to one embodiment.
  • a non-MIMO scheme according to an embodiment of the present invention may include a MISO (Multiple Input Single Output) scheme, a SISO (Single Input Single Output) scheme, etc.
  • the present invention is applicable to systems using two or more antennas.
  • the present invention may define three physical layer (PL) profiles (base, handheld and advanced), each optimized to minimize receiver complexity while attaining the performance required for a particular use case.
  • the physical layer (PHY) profiles are subsets of all configurations that a corresponding receiver should implement.
  • the three PHY profiles share most of the functional blocks but differ slightly in specific blocks and/or parameters. Additional PHY profiles can be defined in the future. For the system evolution, future profiles can also be multiplexed with the existing profiles in a single RF channel through a future extension frame (FEF). The details of each PHY profile are described below.
  • FEF future extension frame
  • the base profile represents a main use case for fixed receiving devices that are usually connected to a roof-top antenna.
  • the base profile also includes portable devices that could be transported to a place but belong to a relatively stationary reception category. Use of the base profile could be extended to handheld or even vehicular use by some improved implementations, but those use cases are not expected for base profile receiver operation.
  • Target SNR range of reception is from approximately 10 to 20 dB, which includes the 15 dB SNR reception capability of the existing broadcast system (e.g. ATSC A/53).
  • the receiver complexity and power consumption are not as critical as in the battery-operated handheld devices, which will use the handheld profile. Key system parameters for the base profile are listed in Table 1 below.
  • the handheld profile is designed for use in handheld and vehicular devices that operate with battery power.
  • the devices can be moving at pedestrian or vehicle speed.
  • the power consumption as well as the receiver complexity is very important for the implementation of the devices of the handheld profile.
  • the target SNR range of the handheld profile is approximately 0 to 10 dB, but can be configured to reach below 0 dB when intended for deeper indoor reception.
  • the advanced profile provides the highest channel capacity at the cost of more implementation complexity.
  • This profile requires using MIMO transmission and reception, and UHDTV service is a target use case for which this profile is specifically designed.
  • the increased capacity can also be used to allow an increased number of services in a given bandwidth, e.g., multiple SDTV or HDTV services.
  • the target SNR range of the advanced profile is approximately 20 to 30 dB.
  • MIMO transmission may initially use existing elliptically-polarized transmission equipment, with extension to full-power cross-polarized transmission in the future.
  • Key system parameters for the advanced profile are listed in Table 3 below.
  • the base profile can be used as a profile for both the terrestrial broadcast service and the mobile broadcast service. That is, the base profile can be used to define a concept of a profile which includes the mobile profile. Also, the advanced profile can be divided into an advanced profile for a base profile with MIMO and an advanced profile for a handheld profile with MIMO. Moreover, the three profiles can be changed according to the intention of the designer.
  • auxiliary stream: sequence of cells carrying data of as yet undefined modulation and coding, which may be used for future extensions or as required by broadcasters or network operators
  • base data pipe: data pipe that carries service signaling data
  • baseband frame (or BBFRAME): set of Kbch bits which form the input to one FEC encoding process (BCH and LDPC encoding)
  • data pipe: logical channel in the physical layer that carries service data or related metadata, which may carry one or multiple service(s) or service component(s)
  • data pipe unit: a basic unit for allocating data cells to a DP in a frame
  • DP_ID: this 8-bit field uniquely identifies a DP within the system identified by the SYSTEM_ID
  • dummy cell: cell carrying a pseudorandom value used to fill the remaining capacity not used for PLS signaling, DPs or auxiliary streams
  • emergency alert channel: part of a frame that carries EAS information data
  • frame repetition unit: a set of frames belonging to the same or different physical layer profiles, including a FEF, which is repeated eight times in a super-frame
  • fast information channel: a logical channel in a frame that carries the mapping information between a service and the corresponding base DP
  • FECBLOCK: set of LDPC-encoded bits of DP data
  • FFT size: nominal FFT size used for a particular mode, equal to the active symbol period Ts expressed in cycles of the elementary period T
  • frame signaling symbol: OFDM symbol with higher pilot density used at the start of a frame in certain combinations of FFT size, guard interval and scattered pilot pattern, which carries a part of the PLS data
  • frame edge symbol: OFDM symbol with higher pilot density used at the end of a frame in certain combinations of FFT size, guard interval and scattered pilot pattern
  • frame-group: the set of all the frames having the same PHY profile type in a super-frame
  • future extension frame: physical layer time slot within the super-frame that could be used for future extension, which starts with a preamble
  • input stream: a stream of data for an ensemble of services delivered to the end users by the system
  • PHY profile: subset of all configurations that a corresponding receiver should implement
  • PLS1: a first set of PLS data carried in the FSS symbols, having a fixed size, coding and modulation, which carries basic information about the system as well as the parameters needed to decode the PLS2
  • PLS2: a second set of PLS data transmitted in the FSS symbol, which carries more detailed PLS data about the system and the DPs
  • PLS2 dynamic data: PLS2 data that may dynamically change frame-by-frame
  • PLS2 static data: PLS2 data that remains static for the duration of a frame-group
  • preamble signaling data: signaling data carried by the preamble symbol and used to identify the basic mode of the system
  • preamble symbol: fixed-length pilot symbol that carries basic PLS data and is located at the beginning of a frame
  • the preamble symbol is mainly used for fast initial band scan to detect the system signal, its timing, frequency offset, and FFT size.
  • super-frame: set of eight frame repetition units
  • time interleaving block: set of cells within which time interleaving is carried out, corresponding to one use of the time interleaver memory (a toy interleaving sketch follows this glossary)
  • TI group: unit over which dynamic capacity allocation for a particular DP is carried out, made up of an integer, dynamically varying number of XFECBLOCKs
  • the TI group may be mapped directly to one frame or may be mapped to multiple frames. It may contain one or more TI blocks.
  • Type 1 DP: DP of a frame where all DPs are mapped into the frame in TDM fashion
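  • The time interleaving entries above can be made concrete with a toy Python sketch of plain row-column block interleaving: the cells of one TI block are written column-wise into the interleaver memory and read out row-wise. The actual scheme is a twisted row-column block interleaver (see FIGS. 21 to 24); this untwisted version, with arbitrary 4x3 dimensions, only conveys the memory write/read idea.

      def block_interleave(cells, n_rows):
          """Write cells column-wise into an n_rows x n_cols memory, read row-wise."""
          assert len(cells) % n_rows == 0
          n_cols = len(cells) // n_rows
          memory = [[cells[c * n_rows + r] for c in range(n_cols)] for r in range(n_rows)]
          return [cell for row in memory for cell in row]

      print(block_interleave(list(range(12)), n_rows=4))
      # [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]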
  • FIG. 1 illustrates a structure of an apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention.
  • the apparatus for transmitting broadcast signals for future broadcast services can include an input formatting block 1000 , a BICM (Bit interleaved coding & modulation) block 1010 , a frame structure block 1020 , an OFDM (Orthogonal Frequency Division Multiplexing) generation block 1030 and a signaling generation block 1040 .
  • BICM: Bit interleaved coding & modulation
  • OFDM: Orthogonal Frequency Division Multiplexing
  • IP stream/packets and MPEG2-TS are the main input formats; other stream types are handled as General Streams.
  • Management Information is input to control the scheduling and allocation of the corresponding bandwidth for each input stream.
  • One or multiple TS stream(s), IP stream(s) and/or General Stream(s) inputs are simultaneously allowed.
  • the input formatting block 1000 can demultiplex each input stream into one or multiple data pipe(s), to each of which an independent coding and modulation is applied.
  • the data pipe (DP) is the basic unit for robustness control, thereby affecting quality-of-service (QoS).
  • QoS: quality of service
  • One or multiple service(s) or service component(s) can be carried by a single DP. Details of operations of the input formatting block 1000 will be described later.
  • the data pipe is a logical channel in the physical layer that carries service data or related metadata, which may carry one or multiple service(s) or service component(s).
  • the data pipe unit is a basic unit for allocating data cells to a DP in a frame.
  • parity data is added for error correction and the encoded bit streams are mapped to complex-valued constellation symbols.
  • the symbols are interleaved across a specific interleaving depth that is used for the corresponding DP.
  • MIMO encoding is performed in the BICM block 1010 and the additional data path is added at the output for MIMO transmission. Details of operations of the BICM block 1010 will be described later.
  • the Frame Building block 1020 can map the data cells of the input DPs into the OFDM symbols within a frame. After mapping, the frequency interleaving is used for frequency-domain diversity, especially to combat frequency-selective fading channels. Details of operations of the Frame Building block 1020 will be described later.
  • the OFDM Generation block 1030 can apply conventional OFDM modulation having a cyclic prefix as guard interval. For antenna space diversity, a distributed MISO scheme is applied across the transmitters. In addition, a Peak-to-Average Power Reduction (PAPR) scheme is performed in the time domain. For flexible network planning, this proposal provides a set of various FFT sizes, guard interval lengths and corresponding pilot patterns. Details of operations of the OFDM Generation block 1030 will be described later.
  • PAPR: Peak-to-Average Power Reduction
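  • The core of the OFDM Generation block 1030 can be sketched in a few lines of Python: an IFFT turns one symbol's frequency-domain cells into a time-domain waveform, and a cyclic prefix copied from the tail of the symbol serves as the guard interval. The FFT size and guard length below are arbitrary toy values, not parameters of any profile.

      import numpy as np

      def ofdm_symbol(freq_cells, guard_len):
          """IFFT plus cyclic prefix: the guard interval repeats the symbol tail."""
          time_sig = np.fft.ifft(freq_cells)              # frequency -> time domain
          return np.concatenate([time_sig[-guard_len:],   # cyclic prefix (guard)
                                 time_sig])

      cells = np.exp(2j * np.pi * np.random.rand(8))      # 8 unit-power data cells
      sym = ofdm_symbol(cells, guard_len=2)
      assert np.allclose(sym[:2], sym[-2:])               # prefix repeats the tail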
  • the Signaling Generation block 1040 can create physical layer signaling information used for the operation of each functional block. This signaling information is also transmitted so that the services of interest are properly recovered at the receiver side. Details of operations of the Signaling Generation block 1040 will be described later.
  • FIGS. 2, 3 and 4 illustrate the input formatting block 1000 according to embodiments of the present invention. A description will be given of each figure.
  • FIG. 2 illustrates an input formatting block according to one embodiment of the present invention.
  • FIG. 2 shows an input formatting module when the input signal is a single input stream.
  • the input formatting block illustrated in FIG. 2 corresponds to an embodiment of the input formatting block 1000 described with reference to FIG. 1 .
  • the input to the physical layer may be composed of one or multiple data streams. Each data stream is carried by one DP.
  • the mode adaptation modules slice the incoming data stream into data fields of the baseband frame (BBF).
  • BBF: baseband frame
  • the system supports three types of input data streams: MPEG2-TS, Internet protocol (IP) and Generic stream (GS).
  • MPEG2-TS is characterized by fixed length (188 byte) packets with the first byte being a sync-byte (0x47).
  • An IP stream is composed of variable length IP datagram packets, as signaled within IP packet headers.
  • the system supports both IPv4 and IPv6 for the IP stream.
  • GS may be composed of variable length packets or constant length packets, signaled within encapsulation packet headers.
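  • A minimal Python sketch of the MPEG2-TS framing rule stated above: fixed 188-byte packets, each beginning with the sync byte 0x47.

      TS_PACKET_LEN = 188
      SYNC_BYTE = 0x47

      def iter_ts_packets(stream: bytes):
          """Yield fixed-length TS packets, checking the sync byte of each."""
          if len(stream) % TS_PACKET_LEN:
              raise ValueError("stream is not a whole number of TS packets")
          for off in range(0, len(stream), TS_PACKET_LEN):
              pkt = stream[off:off + TS_PACKET_LEN]
              if pkt[0] != SYNC_BYTE:
                  raise ValueError(f"lost sync at offset {off}")
              yield pkt

      demo = bytes([SYNC_BYTE]) + bytes(187)   # one minimal all-zero-payload packet
      assert sum(1 for _ in iter_ts_packets(demo)) == 1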
  • the Input Stream Splitter splits the input TS, IP, GS streams into multiple service or service component (audio, video, etc.) streams.
  • the mode adaptation module 2000 comprises a CRC Encoder, BB (baseband) Frame Slicer, and BB Frame Header Insertion block.
  • the CRC Encoder provides three kinds of CRC encoding for error detection at the user packet (UP) level, i.e., CRC-8, CRC-16, and CRC-32.
  • the computed CRC bytes are appended after the UP.
  • CRC-8 is used for the TS stream and CRC-32 for the IP stream. If the GS stream does not provide the CRC encoding, the proposed CRC encoding should be applied.
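  • A Python sketch of CRC bytes being appended after the UP. Python's zlib.crc32 is used here as a convenient stand-in; the exact CRC-8/CRC-16/CRC-32 polynomials are whatever the system specifies, so treat the checksum itself as illustrative.

      import zlib

      def append_crc32(user_packet: bytes) -> bytes:
          """Append a 4-byte CRC after the user packet (UP)."""
          crc = zlib.crc32(user_packet)          # stand-in for the system's CRC-32
          return user_packet + crc.to_bytes(4, "big")

      up = b"\x47" + bytes(187)                  # e.g. one 188-byte TS user packet
      framed = append_crc32(up)
      assert len(framed) == 192 and framed[:188] == up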
  • the BB Frame Slicer maps the input into an internal logical-bit format.
  • the first received bit is defined to be the MSB.
  • the BB Frame Slicer allocates a number of input bits equal to the available data field capacity.
  • the UP packet stream is sliced to fit the data field of BBF.
  • the BB Frame Header Insertion block inserts a fixed-length BBF header of 2 bytes in front of the BB Frame.
  • the BBF header is composed of STUFFI (1 bit), SYNCD (13 bits), and RFU (2 bits).
  • BBF can have an extension field (1 or 3 bytes) at the end of the 2-byte BBF header.
  • the stream adaptation block 2010 comprises a stuffing insertion block and a BB scrambler.
  • the stuffing insertion block can insert stuffing field into a payload of a BB frame. If the input data to the stream adaptation is sufficient to fill a BB-Frame, STUFFI is set to ‘0’ and the BBF has no stuffing field. Otherwise STUFFI is set to ‘1’ and the stuffing field is inserted immediately after the BBF header.
  • the stuffing field comprises two bytes of the stuffing field header and a variable size of stuffing data.
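  • A Python sketch of the STUFFI branching described above. The header layout is deliberately simplified to a single flag byte (the real 2-byte BBF header packs STUFFI together with SYNCD and RFU); only the stuffing decision is the point here.

      def build_bbf_payload(data: bytes, field_len: int) -> bytes:
          """Prepend a 1-byte stand-in for the BBF header and stuff if needed."""
          if len(data) == field_len:                      # data fills the BB-Frame
              return b"\x00" + data                       # STUFFI = 0, no stuffing
          assert len(data) <= field_len - 2               # room for stuffing header
          pad = field_len - len(data) - 2                 # 2-byte stuffing header
          stuffing = pad.to_bytes(2, "big") + bytes(pad)  # header + stuffing data
          return b"\x01" + stuffing + data                # STUFFI = 1, stuffing first

      assert len(build_bbf_payload(bytes(10), field_len=16)) == 17  # 1 flag + 16 field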
  • the BB scrambler scrambles a complete BBF for energy dispersal.
  • the scrambling sequence is synchronous with the BBF.
  • the scrambling sequence is generated by a feedback shift register.
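  • A toy Python model of energy-dispersal scrambling with a feedback shift register. The 15-bit register, the taps of the generator polynomial 1 + X^14 + X^15 and the seed are borrowed from the well-known DVB PRBS purely for illustration; the sequence this system actually specifies may differ, but the structure, and the fact that XOR scrambling is its own inverse, carry over.

      def prbs_bytes(n, state=0x4A80):
          """Generate n pseudorandom bytes from a 15-bit feedback shift register."""
          out = bytearray()
          for _ in range(n):
              byte = 0
              for _ in range(8):
                  fb = ((state >> 14) ^ (state >> 13)) & 1   # taps: X^15, X^14
                  state = ((state << 1) | fb) & 0x7FFF
                  byte = (byte << 1) | fb
              out.append(byte)
          return bytes(out)

      def scramble(bbf: bytes) -> bytes:
          """XOR the BBF with a PRBS resynchronized at each frame."""
          return bytes(b ^ p for b, p in zip(bbf, prbs_bytes(len(bbf))))

      frame = bytes(range(8))
      assert scramble(scramble(frame)) == frame   # scrambling is its own inverse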
  • the PLS generation block 2020 can generate physical layer signaling (PLS) data.
  • PLS provides the receiver with a means to access physical layer DPs.
  • the PLS data consists of PLS1 data and PLS2 data.
  • the PLS1 data is a first set of PLS data carried in the FSS symbols in the frame having a fixed size, coding and modulation, which carries basic information about the system as well as the parameters needed to decode the PLS2 data.
  • the PLS1 data provides basic transmission parameters including parameters required to enable the reception and decoding of the PLS2 data. Also, the PLS1 data remains constant for the duration of a frame-group.
  • the PLS2 data is a second set of PLS data transmitted in the FSS symbol, which carries more detailed PLS data about the system and the DPs.
  • the PLS2 contains parameters that provide sufficient information for the receiver to decode the desired DP.
  • the PLS2 signaling further consists of two types of parameters, PLS2 Static data (PLS2-STAT data) and PLS2 dynamic data (PLS2-DYN data).
  • PLS2 Static data is PLS2 data that remains static for the duration of a frame-group and the PLS2 dynamic data is PLS2 data that may dynamically change frame-by-frame.
  • the PLS scrambler 2030 can scramble the generated PLS data for energy dispersal.
  • FIG. 3 illustrates an input formatting block according to another embodiment of the present invention.
  • the input formatting block illustrated in FIG. 3 corresponds to an embodiment of the input formatting block 1000 described with reference to FIG. 1 .
  • FIG. 3 shows a mode adaptation block of the input formatting block when the input signal corresponds to multiple input streams.
  • the mode adaptation block of the input formatting block for processing the multiple input streams can independently process the multiple input streams.
  • the mode adaptation block for respectively processing the multiple input streams can include an input stream splitter 3000 , an input stream synchronizer 3010 , a compensating delay block 3020 , a null packet deletion block 3030 , a header compression block 3040 , a CRC encoder 3050 , a BB frame slicer 3060 and a BB header insertion block 3070 . Description will be given of each block of the mode adaptation block.
  • Operations of the CRC encoder 3050 , BB frame slicer 3060 and BB header insertion block 3070 correspond to those of the CRC encoder, BB frame slicer and BB header insertion block described with reference to FIG. 2 and thus description thereof is omitted.
  • the input stream splitter 3000 can split the input TS, IP, GS streams into multiple service or service component (audio, video, etc.) streams.
  • the input stream synchronizer 3010 may be referred as ISSY.
  • the ISSY can provide suitable means to guarantee Constant Bit Rate (CBR) and constant end-to-end transmission delay for any input data format.
  • CBR: Constant Bit Rate
  • the ISSY is always used for the case of multiple DPs carrying TS, and optionally used for multiple DPs carrying GS streams.
  • the compensating delay block 3020 can delay the split TS packet stream following the insertion of ISSY information to allow a TS packet recombining mechanism without requiring additional memory in the receiver.
  • the null packet deletion block 3030 is used only for the TS input stream case. Some TS input streams or split TS streams may have a large number of null-packets present in order to accommodate VBR (variable bit-rate) services in a CBR TS stream. In this case, in order to avoid unnecessary transmission overhead, null-packets can be identified and not transmitted. In the receiver, removed null-packets can be re-inserted in the exact place where they were originally by reference to a deleted null-packet (DNP) counter that is inserted in the transmission, thus guaranteeing constant bit-rate and avoiding the need for time-stamp (PCR) updating.
  • DNP: deleted null-packet
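  • The null-packet deletion and re-insertion mechanism lends itself to a short Python sketch. Packets are simplified to byte strings and the DNP counter is modeled as riding alongside the next transmitted packet; a real implementation would also have to flush trailing nulls.

      NULL = b"null"   # stand-in for a 188-byte TS null packet (PID 0x1FFF)

      def delete_nulls(packets):
          """Drop null packets, recording how many preceded each real packet."""
          out, dnp = [], 0
          for pkt in packets:
              if pkt == NULL:
                  dnp += 1                 # count, do not transmit
              else:
                  out.append((dnp, pkt))   # DNP counter rides with next packet
                  dnp = 0
          return out

      def reinsert_nulls(tagged):
          """Receiver side: restore nulls in their exact original places."""
          out = []
          for dnp, pkt in tagged:
              out.extend([NULL] * dnp)
              out.append(pkt)
          return out

      tx = [b"A", NULL, NULL, b"B", NULL, b"C"]
      assert reinsert_nulls(delete_nulls(tx)) == tx   # constant bit-rate preserved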
  • the header compression block 3040 can provide packet header compression to increase transmission efficiency for TS or IP input streams. Because the receiver can have a priori information on certain parts of the header, this known information can be deleted in the transmitting unit.
  • For Transport Stream, the receiver has a-priori information about the sync-byte configuration (0x47) and the packet length (188 bytes). If the input TS stream carries content that has only one PID, i.e., for only one service component (video, audio, etc.) or service sub-component (SVC base layer, SVC enhancement layer, MVC base view or MVC dependent views), TS packet header compression can be applied (optionally) to the Transport Stream. IP packet header compression is used optionally if the input stream is an IP stream.
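  • A Python sketch of the a-priori header deletion idea for Transport Stream: since the receiver already knows the sync byte (0x47) and the packet length (188 bytes), the sync byte need not be transmitted and can be restored at the receiver.

      def compress_ts(pkt: bytes) -> bytes:
          """Drop the sync byte the receiver already knows a priori."""
          assert len(pkt) == 188 and pkt[0] == 0x47
          return pkt[1:]

      def decompress_ts(body: bytes) -> bytes:
          """Receiver side: restore the known header byte."""
          return b"\x47" + body

      pkt = b"\x47" + bytes(187)
      assert decompress_ts(compress_ts(pkt)) == pkt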
  • FIG. 4 illustrates a BICM block according to an embodiment of the present invention.
  • the BICM block illustrated in FIG. 4 corresponds to an embodiment of the BICM block 1010 described with reference to FIG. 1 .
  • the apparatus for transmitting broadcast signals for future broadcast services can provide a terrestrial broadcast service, mobile broadcast service, UHDTV service, etc.
• the BICM block according to an embodiment of the present invention can independently process the DPs input thereto by independently applying SISO, MISO and MIMO schemes to the data pipes respectively corresponding to data paths. Consequently, the apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention can control QoS for each service or service component transmitted through each DP.
  • the BICM block shared by the base profile and the handheld profile and the BICM block of the advanced profile can include plural processing blocks for processing each DP.
  • a processing block 5000 of the BICM block for the base profile and the handheld profile can include a Data FEC encoder 5010 , a bit interleaver 5020 , a constellation mapper 5030 , an SSD (Signal Space Diversity) encoding block 5040 and a time interleaver 5050 .
• the Data FEC encoder 5010 can perform FEC encoding on the input BBF to generate the FECBLOCK by a procedure using outer coding (BCH) and inner coding (LDPC).
• the outer coding (BCH) is an optional coding method. Details of operations of the Data FEC encoder 5010 will be described later.
  • the bit interleaver 5020 can interleave outputs of the Data FEC encoder 5010 to achieve optimized performance with combination of the LDPC codes and modulation scheme while providing an efficiently implementable structure. Details of operations of the bit interleaver 5020 will be described later.
• the constellation mapper 5030 can modulate each cell word from the bit interleaver 5020 in the base and handheld profiles, or each cell word from the cell-word demultiplexer 5010-1 in the advanced profile, using either QPSK, QAM-16, non-uniform QAM (NUQ-64, NUQ-256, NUQ-1024) or non-uniform constellation (NUC-16, NUC-64, NUC-256, NUC-1024), to give a power-normalized constellation point, e_l.
  • This constellation mapping is applied only for DPs. Observe that QAM-16 and NUQs are square shaped, while NUCs have arbitrary shape. When each constellation is rotated by any multiple of 90 degrees, the rotated constellation exactly overlaps with its original one.
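• The 90-degree rotation property noted above can be checked numerically. The following sketch maps 2-bit cell words onto a power-normalized QPSK constellation (the NUQ/NUC tables themselves are defined in the specification and are not reproduced here) and verifies that a 90-degree rotation maps the constellation onto itself.

```python
import math

def map_qpsk(cell_word: int) -> complex:
    """Map a 2-bit cell word to a power-normalized QPSK point e_l."""
    b0, b1 = (cell_word >> 1) & 1, cell_word & 1
    return complex(1 - 2 * b0, 1 - 2 * b1) / math.sqrt(2.0)  # E[|e_l|^2] = 1

points = {map_qpsk(w) for w in range(4)}
rotated = {p * 1j for p in points}       # rotate every point by 90 degrees
assert all(any(abs(r - p) < 1e-12 for p in points) for r in rotated)
```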
• the time interleaver 5050 can operate at the DP level.
  • the parameters of time interleaving (TI) may be set differently for each DP. Details of operations of the time interleaver 5050 will be described later.
• a processing block 5000-1 of the BICM block for the advanced profile can include the Data FEC encoder, bit interleaver, constellation mapper, and time interleaver. However, the processing block 5000-1 is distinguished from the processing block 5000 in that it further includes a cell-word demultiplexer 5010-1 and a MIMO encoding block 5020-1.
• the operations of the Data FEC encoder, bit interleaver, constellation mapper, and time interleaver in the processing block 5000-1 correspond to those of the Data FEC encoder 5010, bit interleaver 5020, constellation mapper 5030, and time interleaver 5050 described above, and thus description thereof is omitted.
  • the cell-word demultiplexer 5010 - 1 is used for the DP of the advanced profile to divide the single cell-word stream into dual cell-word streams for MIMO processing. Details of operations of the cell-word demultiplexer 5010 - 1 will be described later.
• the MIMO encoding block 5020-1 can process the output of the cell-word demultiplexer 5010-1 using a MIMO encoding scheme.
  • the MIMO encoding scheme was optimized for broadcasting signal transmission.
  • the MIMO technology is a promising way to get a capacity increase but it depends on channel characteristics. Especially for broadcasting, the strong LOS component of the channel or a difference in the received signal power between two antennas caused by different signal propagation characteristics makes it difficult to get capacity gain from MIMO.
  • the proposed MIMO encoding scheme overcomes this problem using a rotation-based pre-coding and phase randomization of one of the MIMO output signals.
  • MIMO encoding is intended for a 2 ⁇ 2 MIMO system requiring at least two antennas at both the transmitter and the receiver.
• Two MIMO encoding modes are defined in this proposal: full-rate spatial multiplexing (FR-SM) and full-rate full-diversity spatial multiplexing (FRFD-SM).
  • the FR-SM encoding provides capacity increase with relatively small complexity increase at the receiver side while the FRFD-SM encoding provides capacity increase and additional diversity gain with a great complexity increase at the receiver side.
  • the proposed MIMO encoding scheme has no restriction on the antenna polarity configuration.
• MIMO processing is required for the advanced profile frame, which means all DPs in the advanced profile frame are processed by the MIMO encoder. MIMO processing is applied at the DP level. Pairs of the constellation mapper outputs, NUQ (e1,i and e2,i), are fed to the input of the MIMO encoder. The paired MIMO encoder output (g1,i and g2,i) is transmitted by the same carrier k and OFDM symbol l of their respective TX antennas.
  • FIG. 5 illustrates a BICM block according to another embodiment of the present invention.
  • the BICM block illustrated in FIG. 5 corresponds to an embodiment of the BICM block 1010 described with reference to FIG. 1 .
  • the PLS FEC encoder 6000 can include a scrambler, BCH encoding/zero insertion block, LDPC encoding block and LDPC parity puncturing block. Description will be given of each block of the BICM block.
  • the PLS FEC encoder 6000 can encode the scrambled PLS 1/2 data, EAC and FIC section.
  • the scrambler can scramble PLS1 data and PLS2 data before BCH encoding and shortened and punctured LDPC encoding.
  • the BCH encoding/zero insertion block can perform outer encoding on the scrambled PLS 1/2 data using the shortened BCH code for PLS protection and insert zero bits after the BCH encoding.
  • the output bits of the zero insertion may be permuted before LDPC encoding.
• the LDPC code parameters for PLS1 and PLS2 are as shown in table 4 below.
• the LDPC parity puncturing block can perform puncturing on the PLS1 data and PLS2 data.
• the bit interleaver 6010 can interleave each shortened and punctured PLS1 data and PLS2 data.
  • the constellation mapper 6020 can map the bit interleaved PLS1 data and PLS2 data onto constellations.
  • FIG. 6 illustrates a frame building block according to one embodiment of the present invention.
  • the frame building block illustrated in FIG. 6 corresponds to an embodiment of the frame building block 1020 described with reference to FIG. 1 .
  • the frame building block can include a delay compensation block 7000 , a cell mapper 7010 and a frequency interleaver 7020 . Description will be given of each block of the frame building block.
  • the delay compensation block 7000 can adjust the timing between the data pipes and the corresponding PLS data to ensure that they are co-timed at the transmitter end.
• the PLS data is delayed by the same amount as the data pipes, accounting for the delays of the data pipes caused by the Input Formatting block and BICM block.
  • the delay of the BICM block is mainly due to the time interleaver.
• In-band signaling data carries information of the next TI group, so it is carried one frame ahead of the DPs to be signaled.
  • the Delay Compensating block delays in-band signaling data accordingly.
• the frequency interleaver 7020 can randomly interleave data cells received from the cell mapper 7010 to provide frequency diversity. Also, the frequency interleaver 7020 can operate on every OFDM symbol pair comprised of two sequential OFDM symbols using a different interleaving-seed order to get maximum interleaving gain in a single frame. Details of operations of the frequency interleaver 7020 will be described later.
• FIG. 7 illustrates an OFDM generation block according to an embodiment of the present invention.
• the OFDM generation block illustrated in FIG. 7 corresponds to an embodiment of the OFDM generation block 1030 described with reference to FIG. 1 .
• the OFDM generation block modulates the OFDM carriers by the cells produced by the Frame Building block, inserts the pilots, and produces the time domain signal for transmission. Also, this block subsequently inserts guard intervals, and applies PAPR (Peak-to-Average Power Ratio) reduction processing to produce the final RF signal.
• the OFDM generation block can include a pilot and reserved tone insertion block 8000, a 2D-eSFN encoding block 8010, an IFFT (Inverse Fast Fourier Transform) block 8020, a PAPR reduction block 8030, a guard interval insertion block 8040, a preamble insertion block 8050, an other system insertion block 8060 and a DAC block 8070. Description will be given of each block of the OFDM generation block.
  • the other system insertion block 8060 can multiplex signals of a plurality of broadcast transmission/reception systems in the time domain such that data of two or more different broadcast transmission/reception systems providing broadcast services can be simultaneously transmitted in the same RF signal bandwidth.
  • the two or more different broadcast transmission/reception systems refer to systems providing different broadcast services.
  • the different broadcast services may refer to a terrestrial broadcast service, mobile broadcast service, etc.
  • the apparatus for receiving broadcast signals for future broadcast services can correspond to the apparatus for transmitting broadcast signals for future broadcast services, described with reference to FIG. 1 .
  • the apparatus for receiving broadcast signals for future broadcast services can include a synchronization & demodulation module 9000 , a frame parsing module 9010 , a demapping & decoding module 9020 , an output processor 9030 and a signaling decoding module 9040 .
  • a description will be given of operation of each module of the apparatus for receiving broadcast signals.
  • the synchronization & demodulation module 9000 can receive input signals through m Rx antennas, perform signal detection and synchronization with respect to a system corresponding to the apparatus for receiving broadcast signals and carry out demodulation corresponding to a reverse procedure of the procedure performed by the apparatus for transmitting broadcast signals.
  • the frame parsing module 9010 can parse input signal frames and extract data through which a service selected by a user is transmitted. If the apparatus for transmitting broadcast signals performs interleaving, the frame parsing module 9010 can carry out deinterleaving corresponding to a reverse procedure of interleaving. In this case, the positions of a signal and data that need to be extracted can be obtained by decoding data output from the signaling decoding module 9040 to restore scheduling information generated by the apparatus for transmitting broadcast signals.
  • the demapping & decoding module 9020 can convert the input signals into bit domain data and then deinterleave the same as necessary.
  • the demapping & decoding module 9020 can perform demapping for mapping applied for transmission efficiency and correct an error generated on a transmission channel through decoding.
  • the demapping & decoding module 9020 can obtain transmission parameters necessary for demapping and decoding by decoding the data output from the signaling decoding module 9040 .
  • the output processor 9030 can perform reverse procedures of various compression/signal processing procedures which are applied by the apparatus for transmitting broadcast signals to improve transmission efficiency.
  • the output processor 9030 can acquire necessary control information from data output from the signaling decoding module 9040 .
• the output of the output processor 9030 corresponds to a signal input to the apparatus for transmitting broadcast signals and may be MPEG-TSs, IP streams (v4 or v6) and generic streams.
  • the signaling decoding module 9040 can obtain PLS information from the signal demodulated by the synchronization & demodulation module 9000 .
  • the frame parsing module 9010 , demapping & decoding module 9020 and output processor 9030 can execute functions thereof using the data output from the signaling decoding module 9040 .
  • FIG. 9 illustrates a frame structure according to an embodiment of the present invention.
  • FIG. 9 shows an example configuration of the frame types and FRUs in a super-frame.
  • (a) shows a super frame according to an embodiment of the present invention
  • (b) shows FRU (Frame Repetition Unit) according to an embodiment of the present invention
  • (c) shows frames of variable PHY profiles in the FRU
  • (d) shows a structure of a frame.
  • a super-frame may be composed of eight FRUs.
  • the FRU is a basic multiplexing unit for TDM of the frames, and is repeated eight times in a super-frame.
  • Each frame in the FRU belongs to one of the PHY profiles, (base, handheld, advanced) or FEF.
  • the maximum allowed number of the frames in the FRU is four and a given PHY profile can appear any number of times from zero times to four times in the FRU (e.g., base, base, handheld, advanced).
  • PHY profile definitions can be extended using reserved values of the PHY_PROFILE in the preamble, if required.
  • the FEF part is inserted at the end of the FRU, if included.
  • the minimum number of FEFs is 8 in a super-frame. It is not recommended that FEF parts be adjacent to each other.
  • One frame is further divided into a number of OFDM symbols and a preamble. As shown in (d), the frame comprises a preamble, one or more frame signaling symbols (FSS), normal data symbols and a frame edge symbol (FES).
• the preamble is a special symbol that enables fast Futurecast UTB system signal detection and provides a set of basic transmission parameters for efficient transmission and reception of the signal. The preamble will be described in detail later.
  • the main purpose of the FSS(s) is to carry the PLS data.
• For fast synchronization and channel estimation, and hence fast decoding of PLS data, the FSS has a denser pilot pattern than the normal data symbol.
  • the FES has exactly the same pilots as the FSS, which enables frequency-only interpolation within the FES and temporal interpolation, without extrapolation, for symbols immediately preceding the FES.
  • FIG. 10 illustrates a signaling hierarchy structure of the frame according to an embodiment of the present invention.
  • FIG. 10 illustrates the signaling hierarchy structure, which is split into three main parts: the preamble signaling data 11000 , the PLS1 data 11010 and the PLS2 data 11020 .
• the purpose of the preamble, which is carried by the preamble symbol in every frame, is to indicate the transmission type and basic transmission parameters of that frame.
  • the PLS1 enables the receiver to access and decode the PLS2 data, which contains the parameters to access the DP of interest.
  • the PLS2 is carried in every frame and split into two main parts: PLS2-STAT data and PLS2-DYN data. The static and dynamic portion of PLS2 data is followed by padding, if necessary.
  • FIG. 11 illustrates preamble signaling data according to an embodiment of the present invention.
  • Preamble signaling data carries 21 bits of information that are needed to enable the receiver to access PLS data and trace DPs within the frame structure. Details of the preamble signaling data are as follows:
  • PHY_PROFILE This 3-bit field indicates the PHY profile type of the current frame. The mapping of different PHY profile types is given in below table 5.
• FFT_SIZE This 2-bit field indicates the FFT size of the current frame within a frame-group, as described in below table 6.
• GI_FRACTION This 3-bit field indicates the guard interval fraction value in the current super-frame, as described in below table 7.
• EAC_FLAG This 1-bit field indicates whether the EAC is provided in the current frame. If this field is set to ‘1’, emergency alert service (EAS) is provided in the current frame. If this field is set to ‘0’, EAS is not carried in the current frame. This field can be switched dynamically within a super-frame.
  • PILOT_MODE This 1-bit field indicates whether the pilot mode is mobile mode or fixed mode for the current frame in the current frame-group. If this field is set to ‘0’, mobile pilot mode is used. If the field is set to ‘1’, the fixed pilot mode is used.
  • PAPR_FLAG This 1-bit field indicates whether PAPR reduction is used for the current frame in the current frame-group. If this field is set to value ‘1’, tone reservation is used for PAPR reduction. If this field is set to ‘0’, PAPR reduction is not used.
  • FRU_CONFIGURE This 3-bit field indicates the PHY profile type configurations of the frame repetition units (FRU) that are present in the current super-frame. All profile types conveyed in the current super-frame are identified in this field in all preambles in the current super-frame.
• the 3-bit field has a different definition for each profile, as shown in below table 8.
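• A minimal parsing sketch for the 21-bit preamble signaling data follows. The field order and the width of the trailing reserved bits are assumptions chosen only so that the listed widths sum to 21 bits; the normative layout is defined by the specification.

```python
# Hedged sketch: split the 21-bit preamble signaling data into the fields
# listed above, MSB first. The RESERVED width is an assumption.
FIELDS = [            # (name, width in bits)
    ("PHY_PROFILE", 3),
    ("FFT_SIZE", 2),
    ("GI_FRACTION", 3),
    ("EAC_FLAG", 1),
    ("PILOT_MODE", 1),
    ("PAPR_FLAG", 1),
    ("FRU_CONFIGURE", 3),
    ("RESERVED", 7),
]

def parse_preamble(bits21: int) -> dict:
    """Split a 21-bit integer into named fields, MSB first."""
    out, remaining = {}, 21
    for name, width in FIELDS:
        remaining -= width
        out[name] = (bits21 >> remaining) & ((1 << width) - 1)
    return out

# Example: EAC present, fixed pilots, PAPR reduction off.
fields = parse_preamble(0b000_01_010_1_1_0_000_0000000)
assert fields["EAC_FLAG"] == 1 and fields["PILOT_MODE"] == 1
```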
  • FIG. 12 illustrates PLS1 data according to an embodiment of the present invention.
• PLS1 data provides basic transmission parameters including parameters required to enable the reception and decoding of the PLS2. As mentioned above, the PLS1 data remains unchanged for the entire duration of one frame-group.
  • the detailed definition of the signaling fields of the PLS1 data are as follows:
  • PREAMBLE_DATA This 20-bit field is a copy of the preamble signaling data excluding the EAC_FLAG.
  • NUM_FRAME_FRU This 2-bit field indicates the number of the frames per FRU.
  • PAYLOAD_TYPE This 3-bit field indicates the format of the payload data carried in the frame-group. PAYLOAD_TYPE is signaled as shown in table 9.
• PAYLOAD_TYPE values (table 9): ‘1XX’ indicates that a TS stream is transmitted, ‘X1X’ indicates that an IP stream is transmitted, and ‘XX1’ indicates that a GS stream is transmitted.
  • NUM_FSS This 2-bit field indicates the number of FSS symbols in the current frame.
  • SYSTEM_VERSION This 8-bit field indicates the version of the transmitted signal format.
  • the SYSTEM_VERSION is divided into two 4-bit fields, which are a major version and a minor version.
• Major version: The MSB four bits of the SYSTEM_VERSION field indicate major version information.
  • a change in the major version field indicates a non-backward-compatible change.
• the default value is ‘0000’; for the current version, the value is set to ‘0000’.
• Minor version: The LSB four bits of the SYSTEM_VERSION field indicate minor version information. A change in the minor version field is backward-compatible.
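• The major/minor split of SYSTEM_VERSION amounts to a trivial nibble extraction, sketched below for illustration.

```python
def split_system_version(version: int) -> tuple:
    """Split the 8-bit SYSTEM_VERSION into (major, minor) nibbles."""
    major = (version >> 4) & 0x0F  # MSB nibble: non-backward-compatible changes
    minor = version & 0x0F         # LSB nibble: backward-compatible changes
    return major, minor

assert split_system_version(0b0000_0001) == (0, 1)
```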
  • CELL_ID This is a 16-bit field which uniquely identifies a geographic cell in an ATSC network.
  • An ATSC cell coverage area may consist of one or more frequencies, depending on the number of frequencies used per Futurecast UTB system. If the value of the CELL_ID is not known or unspecified, this field is set to ‘0’.
  • NETWORK_ID This is a 16-bit field which uniquely identifies the current ATSC network.
  • SYSTEM_ID This 16-bit field uniquely identifies the Futurecast UTB system within the ATSC network.
  • the Futurecast UTB system is the terrestrial broadcast system whose input is one or more input streams (TS, IP, GS) and whose output is an RF signal.
  • the Futurecast UTB system carries one or more PHY profiles and FEF, if any.
  • the same Futurecast UTB system may carry different input streams and use different RF frequencies in different geographical areas, allowing local service insertion.
  • the frame structure and scheduling is controlled in one place and is identical for all transmissions within a Futurecast UTB system.
  • One or more Futurecast UTB systems may have the same SYSTEM_ID meaning that they all have the same physical layer structure and configuration.
• the following loop consists of FRU_PHY_PROFILE, FRU_FRAME_LENGTH, FRU_GI_FRACTION and RESERVED, which are used to indicate the FRU configuration and the length of each frame type.
  • the loop size is fixed so that four PHY profiles (including a FEF) are signaled within the FRU. If NUM_FRAME_FRU is less than 4, the unused fields are filled with zeros.
  • FRU_PHY_PROFILE This 3-bit field indicates the PHY profile type of the (i+1)th (i is the loop index) frame of the associated FRU. This field uses the same signaling format as shown in the table 8.
  • FRU_FRAME_LENGTH This 2-bit field indicates the length of the (i+1)th frame of the associated FRU. Using FRU_FRAME_LENGTH together with FRU_GI_FRACTION, the exact value of the frame duration can be obtained.
  • FRU_GI_FRACTION This 3-bit field indicates the guard interval fraction value of the (i+1)th frame of the associated FRU.
  • FRU_GI_FRACTION is signaled according to the table 7.
  • the following fields provide parameters for decoding the PLS2 data.
  • PLS2_FEC_TYPE This 2-bit field indicates the FEC type used by the PLS2 protection.
  • the FEC type is signaled according to table 10. The details of the LDPC codes will be described later.
  • PLS2_MOD This 3-bit field indicates the modulation type used by the PLS2. The modulation type is signaled according to table 11.
• PLS2_SIZE_CELL This 15-bit field indicates Ctotal_full_block, the size (specified as the number of QAM cells) of the collection of full coded blocks for PLS2 that is carried in the current frame-group. This value is constant during the entire duration of the current frame-group.
  • PLS2_STAT_SIZE_BIT This 14-bit field indicates the size, in bits, of the PLS2-STAT for the current frame-group. This value is constant during the entire duration of the current frame-group.
  • PLS2_DYN_SIZE_BIT This 14-bit field indicates the size, in bits, of the PLS2-DYN for the current frame-group. This value is constant during the entire duration of the current frame-group.
  • PLS2_REP_FLAG This 1-bit flag indicates whether the PLS2 repetition mode is used in the current frame-group. When this field is set to value ‘1’, the PLS2 repetition mode is activated. When this field is set to value ‘0’, the PLS2 repetition mode is deactivated.
  • PLS2_REP_SIZE_CELL This 15-bit field indicates Ctotal_partial_block, the size (specified as the number of QAM cells) of the collection of partial coded blocks for PLS2 carried in every frame of the current frame-group, when PLS2 repetition is used. If repetition is not used, the value of this field is equal to 0. This value is constant during the entire duration of the current frame-group.
  • PLS2_NEXT_FEC_TYPE This 2-bit field indicates the FEC type used for PLS2 that is carried in every frame of the next frame-group. The FEC type is signaled according to the table 10.
  • PLS2_NEXT_MOD This 3-bit field indicates the modulation type used for PLS2 that is carried in every frame of the next frame-group. The modulation type is signaled according to the table 11.
  • PLS2_NEXT_REP_FLAG This 1-bit flag indicates whether the PLS2 repetition mode is used in the next frame-group. When this field is set to value ‘1’, the PLS2 repetition mode is activated. When this field is set to value ‘0’, the PLS2 repetition mode is deactivated.
• PLS2_NEXT_REP_SIZE_CELL This 15-bit field indicates Ctotal_full_block, the size (specified as the number of QAM cells) of the collection of full coded blocks for PLS2 that is carried in every frame of the next frame-group, when PLS2 repetition is used. If repetition is not used in the next frame-group, the value of this field is equal to 0. This value is constant during the entire duration of the current frame-group.
  • PLS2_NEXT_REP_STAT_SIZE_BIT This 14-bit field indicates the size, in bits, of the PLS2-STAT for the next frame-group. This value is constant in the current frame-group.
  • PLS2_NEXT_REP_DYN_SIZE_BIT This 14-bit field indicates the size, in bits, of the PLS2-DYN for the next frame-group. This value is constant in the current frame-group.
  • PLS2_AP_MODE This 2-bit field indicates whether additional parity is provided for PLS2 in the current frame-group. This value is constant during the entire duration of the current frame-group. The below table 12 gives the values of this field. When this field is set to ‘00’, additional parity is not used for the PLS2 in the current frame-group.
  • PLS2_AP_SIZE_CELL This 15-bit field indicates the size (specified as the number of QAM cells) of the additional parity bits of the PLS2. This value is constant during the entire duration of the current frame-group.
  • PLS2_NEXT_AP_MODE This 2-bit field indicates whether additional parity is provided for PLS2 signaling in every frame of next frame-group. This value is constant during the entire duration of the current frame-group.
• the below table 12 defines the values of this field.
  • PLS2_NEXT_AP_SIZE_CELL This 15-bit field indicates the size (specified as the number of QAM cells) of the additional parity bits of the PLS2 in every frame of the next frame-group. This value is constant during the entire duration of the current frame-group.
  • RESERVED This 32-bit field is reserved for future use.
  • CRC_32 A 32-bit error detection code, which is applied to the entire PLS1 signaling.
  • FIG. 13 illustrates PLS2 data according to an embodiment of the present invention.
  • FIG. 13 illustrates PLS2-STAT data of the PLS2 data.
  • the PLS2-STAT data are the same within a frame-group, while the PLS2-DYN data provide information that is specific for the current frame.
• FIC_FLAG This 1-bit field indicates whether the FIC is used in the current frame-group. If this field is set to ‘1’, the FIC is provided in the current frame. If this field is set to ‘0’, the FIC is not carried in the current frame. This value is constant during the entire duration of the current frame-group.
• AUX_FLAG This 1-bit field indicates whether the auxiliary stream(s) is used in the current frame-group. If this field is set to ‘1’, the auxiliary stream is provided in the current frame. If this field is set to ‘0’, the auxiliary stream is not carried in the current frame. This value is constant during the entire duration of the current frame-group.
• NUM_DP This 6-bit field indicates the number of DPs carried within the current frame. The value of this field ranges from 0 to 63, and the number of DPs is NUM_DP+1.
  • DP_ID This 6-bit field identifies uniquely a DP within a PHY profile.
  • DP_TYPE This 3-bit field indicates the type of the DP. This is signaled according to the below table 13.
  • DP_GROUP_ID This 8-bit field identifies the DP group with which the current DP is associated. This can be used by a receiver to access the DPs of the service components associated with a particular service, which will have the same DP_GROUP_ID.
  • BASE_DP_ID This 6-bit field indicates the DP carrying service signaling data (such as PSI/SI) used in the Management layer.
  • the DP indicated by BASE_DP_ID may be either a normal DP carrying the service signaling data along with the service data or a dedicated DP carrying only the service signaling data
  • DP_FEC_TYPE This 2-bit field indicates the FEC type used by the associated DP.
  • the FEC type is signaled according to the below table 14.
  • DP_COD This 4-bit field indicates the code rate used by the associated DP.
  • the code rate is signaled according to the below table 15.
  • DP_MOD This 4-bit field indicates the modulation used by the associated DP. The modulation is signaled according to the below table 16.
  • DP_SSD_FLAG This 1-bit field indicates whether the SSD mode is used in the associated DP. If this field is set to value ‘1’, SSD is used. If this field is set to value ‘0’, SSD is not used.
  • PHY_PROFILE is equal to ‘010’, which indicates the advanced profile:
  • DP_MIMO This 3-bit field indicates which type of MIMO encoding process is applied to the associated DP. The type of MIMO encoding process is signaled according to the table 17.
  • DP_TI_TYPE This 1-bit field indicates the type of time-interleaving. A value of ‘0’ indicates that one TI group corresponds to one frame and contains one or more TI-blocks. A value of ‘1’ indicates that one TI group is carried in more than one frame and contains only one TI-block.
  • DP_TI_LENGTH The use of this 2-bit field (the allowed values are only 1, 2, 4, 8) is determined by the values set within the DP_TI_TYPE field as follows:
• the allowed PI values with the 2-bit field are defined in the below table 18.
• DP_FRAME_INTERVAL This 2-bit field indicates the frame interval (I_JUMP) within the frame-group for the associated DP and the allowed values are 1, 2, 4, 8 (the corresponding 2-bit field is ‘00’, ‘01’, ‘10’, or ‘11’, respectively). For DPs that do not appear every frame of the frame-group, the value of this field is equal to the interval between successive frames. For example, if a DP appears on the frames 1, 5, 9, 13, etc., this field is set to ‘4’. For DPs that appear in every frame, this field is set to ‘1’.
• DP_TI_BYPASS This 1-bit field determines the availability of the time interleaver. If time interleaving is not used for a DP, it is set to ‘1’. If time interleaving is used, it is set to ‘0’.
  • DP_FIRST_FRAME_IDX This 5-bit field indicates the index of the first frame of the super-frame in which the current DP occurs.
  • the value of DP_FIRST_FRAME_IDX ranges from 0 to 31
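• For illustration, the sketch below combines DP_FIRST_FRAME_IDX and DP_FRAME_INTERVAL to enumerate the frames carrying a DP, matching the example given above (frames 1, 5, 9, 13 for an interval of 4); the frames_per_group parameter is a hypothetical stand-in, not a signaled field, and the super-frame/frame-group indexing is simplified.

```python
def frames_carrying_dp(first_frame_idx: int, frame_interval: int,
                       frames_per_group: int) -> list:
    """Enumerate the frame indices in which the DP occurs."""
    return list(range(first_frame_idx, frames_per_group, frame_interval))

# The example above: a DP on frames 1, 5, 9, 13 has DP_FRAME_INTERVAL = 4.
assert frames_carrying_dp(1, 4, 16) == [1, 5, 9, 13]
```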
  • DP_NUM_BLOCK_MAX This 10-bit field indicates the maximum value of DP_NUM_BLOCKS for this DP. The value of this field has the same range as DP_NUM_BLOCKS.
  • DP_PAYLOAD_TYPE This 2-bit field indicates the type of the payload data carried by the given DP.
  • DP_PAYLOAD_TYPE is signaled according to the below table 19.
  • DP_INBAND_MODE This 2-bit field indicates whether the current DP carries in-band signaling information.
  • the in-band signaling type is signaled according to the below table 20.
• In-band mode values (table 20): ‘00’ indicates that in-band signaling is not carried; ‘01’ indicates that only INBAND-PLS is carried; ‘10’ indicates that only INBAND-ISSY is carried; ‘11’ indicates that both INBAND-PLS and INBAND-ISSY are carried.
  • DP_PROTOCOL_TYPE This 2-bit field indicates the protocol type of the payload carried by the given DP. It is signaled according to the below table 21 when input payload types are selected.
  • DP_CRC_MODE This 2-bit field indicates whether CRC encoding is used in the Input Formatting block.
  • the CRC mode is signaled according to the below table 22.
  • DNP_MODE This 2-bit field indicates the null-packet deletion mode used by the associated DP when DP_PAYLOAD_TYPE is set to TS (‘00’). DNP_MODE is signaled according to the below table 23. If DP_PAYLOAD_TYPE is not TS (‘00’), DNP_MODE is set to the value ‘00’.
  • ISSY_MODE This 2-bit field indicates the ISSY mode used by the associated DP when DP_PAYLOAD_TYPE is set to TS (‘00’).
• the ISSY_MODE is signaled according to the below table 24. If DP_PAYLOAD_TYPE is not TS (‘00’), ISSY_MODE is set to the value ‘00’.
  • HC_MODE_TS This 2-bit field indicates the TS header compression mode used by the associated DP when DP_PAYLOAD_TYPE is set to TS (‘00’).
  • the HC_MODE_TS is signaled according to the below table 25.
  • HC_MODE_IP This 2-bit field indicates the IP header compression mode when DP_PAYLOAD_TYPE is set to IP (‘01’).
  • the HC_MODE_IP is signaled according to the below table 26.
  • PID This 13-bit field indicates the PID number for TS header compression when DP_PAYLOAD_TYPE is set to TS (‘00’) and HC_MODE_TS is set to ‘01’ or ‘10’.
  • FIC_VERSION This 8-bit field indicates the version number of the FIC.
  • FIC_LENGTH_BYTE This 13-bit field indicates the length, in bytes, of the FIC.
  • NUM_AUX This 4-bit field indicates the number of auxiliary streams. Zero means no auxiliary streams are used.
  • AUX_CONFIG_RFU This 8-bit field is reserved for future use.
• AUX_STREAM_TYPE This 4-bit field is reserved for future use for indicating the type of the current auxiliary stream.
  • AUX_PRIVATE_CONFIG This 28-bit field is reserved for future use for signaling auxiliary streams.
  • FIG. 14 illustrates PLS2 data according to another embodiment of the present invention.
  • FIG. 14 illustrates PLS2-DYN data of the PLS2 data.
  • the values of the PLS2-DYN data may change during the duration of one frame-group, while the size of fields remains constant.
  • FRAME_INDEX This 5-bit field indicates the frame index of the current frame within the super-frame.
  • the index of the first frame of the super-frame is set to ‘0’.
• PLS_CHANGE_COUNTER This 4-bit field indicates the number of super-frames ahead where the configuration will change. The next super-frame with changes in the configuration is indicated by the value signaled within this field. If this field is set to the value ‘0000’, it means that no scheduled change is foreseen: e.g., the value ‘0001’ indicates that there is a change in the next super-frame.
  • FIC_CHANGE_COUNTER This 4-bit field indicates the number of super-frames ahead where the configuration (i.e., the contents of the FIC) will change. The next super-frame with changes in the configuration is indicated by the value signaled within this field. If this field is set to the value ‘0000’, it means that no scheduled change is foreseen: e.g. value ‘0001’ indicates that there is a change in the next super-frame.
  • NUM_DP The following fields appear in the loop over NUM_DP, which describe the parameters associated with the DP carried in the current frame.
  • DP_ID This 6-bit field indicates uniquely the DP within a PHY profile.
  • DP_START This 15-bit (or 13-bit) field indicates the start position of the first of the DPs using the DPU addressing scheme.
  • the DP_START field has differing length according to the PHY profile and FFT size as shown in the below table 27.
  • DP_NUM_BLOCK This 10-bit field indicates the number of FEC blocks in the current TI group for the current DP.
  • the value of DP_NUM_BLOCK ranges from 0 to 1023
  • the following fields indicate the FIC parameters associated with the EAC.
  • EAC_FLAG This 1-bit field indicates the existence of the EAC in the current frame. This bit is the same value as the EAC_FLAG in the preamble.
  • EAS_WAKE_UP_VERSION_NUM This 8-bit field indicates the version number of a wake-up indication.
• If the EAC_FLAG field is equal to ‘1’, the following 12 bits are allocated to the EAC_LENGTH_BYTE field. If the EAC_FLAG field is equal to ‘0’, the following 12 bits are allocated to EAC_COUNTER.
• EAC_LENGTH_BYTE This 12-bit field indicates the length, in bytes, of the EAC.
  • EAC_COUNTER This 12-bit field indicates the number of the frames before the frame where the EAC arrives.
  • AUX_PRIVATE_DYN This 48-bit field is reserved for future use for signaling auxiliary streams. The meaning of this field depends on the value of AUX_STREAM_TYPE in the configurable PLS2-STAT.
  • CRC_32 A 32-bit error detection code, which is applied to the entire PLS2.
  • FIG. 15 illustrates a logical structure of a frame according to an embodiment of the present invention.
  • the PLS, EAC, FIC, DPs, auxiliary streams and dummy cells are mapped into the active carriers of the OFDM symbols in the frame.
• the PLS1 and PLS2 are first mapped into one or more FSS(s). After that, EAC cells, if any, are mapped immediately following the PLS field, followed next by FIC cells, if any.
• the DPs are mapped next after the PLS or after the EAC and FIC, if any. Type 1 DPs follow first, and Type 2 DPs next. The details of a type of the DP will be described later. In some cases, DPs may carry some special data for EAS or service signaling data.
• auxiliary stream or streams follow the DPs, which in turn are followed by dummy cells. Mapped all together in the above-mentioned order, i.e., PLS, EAC, FIC, DPs, auxiliary streams and dummy data cells, they exactly fill the cell capacity in the frame.
  • FIG. 16 illustrates PLS mapping according to an embodiment of the present invention.
  • PLS cells are mapped to the active carriers of FSS(s). Depending on the number of cells occupied by PLS, one or more symbols are designated as FSS(s), and the number of FSS(s) NFSS is signaled by NUM_FSS in PLS1.
• the FSS is a special symbol for carrying PLS cells. Since robustness and latency are critical issues in the PLS, the FSS(s) have a higher density of pilots, allowing fast synchronization and frequency-only interpolation within the FSS.
  • PLS cells are mapped to active carriers of the NFSS FSS(s) in a top-down manner as shown in an example in FIG. 17 .
  • the PLS1 cells are mapped first from the first cell of the first FSS in an increasing order of the cell index.
  • the PLS2 cells follow immediately after the last cell of the PLS1 and mapping continues downward until the last cell index of the first FSS. If the total number of required PLS cells exceeds the number of active carriers of one FSS, mapping proceeds to the next FSS and continues in exactly the same manner as the first FSS.
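• The PLS1-then-PLS2 cell mapping described above can be sketched as a simple fill of the FSS active carriers in increasing cell-index order, spilling into the next FSS when one is full; the carrier counts used here are illustrative placeholders.

```python
# Hedged sketch of PLS cell mapping onto N_FSS frame signaling symbols:
# PLS1 cells first, then PLS2 cells, continuing on the next FSS when full.
def map_pls_cells(pls1_cells, pls2_cells, active_carriers_per_fss, n_fss):
    fss = [[] for _ in range(n_fss)]
    sym = 0
    for cell in list(pls1_cells) + list(pls2_cells):  # PLS1 first, then PLS2
        if len(fss[sym]) == active_carriers_per_fss:
            sym += 1  # continue in exactly the same manner on the next FSS
        fss[sym].append(cell)
    return fss

symbols = map_pls_cells(range(10), range(25), active_carriers_per_fss=20, n_fss=2)
assert len(symbols[0]) == 20 and len(symbols[1]) == 15
```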
  • DPs are carried next. If EAC, FIC or both are present in the current frame, they are placed between PLS and “normal” DPs.
  • FIG. 17 illustrates EAC mapping according to an embodiment of the present invention.
  • EAC is a dedicated channel for carrying EAS messages and links to the DPs for EAS. EAS support is provided but EAC itself may or may not be present in every frame. EAC, if any, is mapped immediately after the PLS2 cells. EAC is not preceded by any of the FIC, DPs, auxiliary streams or dummy cells other than the PLS cells. The procedure of mapping the EAC cells is exactly the same as that of the PLS.
  • EAC cells are mapped from the next cell of the PLS2 in increasing order of the cell index as shown in the example in FIG. 17 .
  • EAC cells may occupy a few symbols, as shown in FIG. 17 .
• EAC cells follow immediately after the last cell of the PLS2, and mapping continues downward until the last cell index of the last FSS. If the total number of required EAC cells exceeds the number of remaining active carriers of the last FSS, mapping proceeds to the next symbol and continues in exactly the same manner as for the FSS(s).
  • the next symbol for mapping in this case is the normal data symbol, which has more active carriers than a FSS.
  • FIC is carried next, if any exists. If FIC is not transmitted (as signaled in the PLS2 field), DPs follow immediately after the last cell of the EAC.
  • FIC is a dedicated channel for carrying cross-layer information to enable fast service acquisition and channel scanning. This information primarily includes channel binding information between DPs and the services of each broadcaster. For fast scan, a receiver can decode FIC and obtain information such as broadcaster ID, number of services, and BASE_DP_ID. For fast service acquisition, in addition to FIC, base DP can be decoded using BASE_DP_ID. Other than the content it carries, a base DP is encoded and mapped to a frame in exactly the same way as a normal DP. Therefore, no additional description is required for a base DP.
  • the FIC data is generated and consumed in the Management Layer. The content of FIC data is as described in the Management Layer specification.
  • the FIC data is optional and the use of FIC is signaled by the FIC_FLAG parameter in the static part of the PLS2. If FIC is used, FIC_FLAG is set to ‘1’ and the signaling field for FIC is defined in the static part of PLS2. Signaled in this field are FIC_VERSION, and FIC_LENGTH_BYTE. FIC uses the same modulation, coding and time interleaving parameters as PLS2. FIC shares the same signaling parameters such as PLS2_MOD and PLS2_FEC. FIC data, if any, is mapped immediately after PLS2 or EAC if any. FIC is not preceded by any normal DPs, auxiliary streams or dummy cells. The method of mapping FIC cells is exactly the same as that of EAC which is again the same as PLS.
  • FIC cells are mapped from the next cell of the PLS2 in an increasing order of the cell index as shown in an example in (a).
  • FIC cells may be mapped over a few symbols, as shown in (b).
  • mapping proceeds to the next symbol and continues in exactly the same manner as FSS(s).
  • the next symbol for mapping in this case is the normal data symbol which has more active carriers than a FSS.
  • one or more DPs are mapped, followed by auxiliary streams, if any, and dummy cells.
  • FIG. 19 illustrates an FEC structure according to an embodiment of the present invention before bit interleaving.
• the Data FEC encoder may perform FEC encoding on the input BBF to generate the FECBLOCK by a procedure using outer coding (BCH) and inner coding (LDPC).
  • the illustrated FEC structure corresponds to the FECBLOCK.
• the FECBLOCK and the FEC structure have the same value, corresponding to the length of the LDPC codeword.
• Nldpc = 64800 bits (long FECBLOCK) or 16200 bits (short FECBLOCK).
  • the below table 28 and table 29 show FEC encoding parameters for a long FECBLOCK and a short FECBLOCK, respectively.
  • a 12-error correcting BCH code is used for outer encoding of the BBF.
• the BCH generator polynomials for the short FECBLOCK and the long FECBLOCK are obtained by multiplying together all of their constituent polynomials.
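• The "multiply together all polynomials" step can be illustrated with GF(2) polynomial arithmetic, as sketched below; the minimal polynomials g1(x), g2(x), ... are tabulated in the specification, so only small stand-in polynomials appear here.

```python
# Hedged sketch: build a BCH generator polynomial by multiplying minimal
# polynomials over GF(2); polynomials are bit masks (bit k = x^k).
def gf2_poly_mul(a: int, b: int) -> int:
    """Multiply two GF(2) polynomials given as bit masks."""
    result = 0
    while b:
        if b & 1:
            result ^= a   # addition over GF(2) is XOR
        a <<= 1
        b >>= 1
    return result

def bch_generator(min_polys) -> int:
    g = 1  # the constant polynomial 1
    for p in min_polys:
        g = gf2_poly_mul(g, p)
    return g

# Stand-in polynomials: (x + 1) * (x^2 + x + 1) = x^3 + 1 over GF(2).
assert bch_generator([0b11, 0b111]) == 0b1001
```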
• the LDPC code is used to encode the output of the outer BCH encoding. To generate a completed Bldpc (FECBLOCK), Pldpc (parity bits) are encoded systematically from each Ildpc (BCH-encoded BBF) and appended to it. The completed Bldpc (FECBLOCK) is expressed by the following math figure.
• p6138 = p6138 ⊕ i0, p6458 = p6458 ⊕ i0, p6162 = p6162 ⊕ i1, p6482 = p6482 ⊕ i1
  • the addresses of the parity bit accumulators are given in the second row of the addresses of parity check matrix.
• This LDPC encoding procedure for a short FECBLOCK is in accordance with the LDPC encoding procedure for the long FECBLOCK, except that table 30 is replaced with table 31, and the addresses of the parity check matrix for the long FECBLOCK are replaced with the addresses of the parity check matrix for the short FECBLOCK.
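• The parity-bit accumulation in the math figure above amounts to XOR-accumulating each information bit into the parity addresses that the parity check matrix table lists for it. The sketch below shows only this accumulation step (the final sequential XOR chaining of the parity bits is omitted), with a tiny stand-in address table rather than the normative one.

```python
# Hedged sketch of systematic LDPC parity accumulation, e.g. p6138 = p6138 XOR i0.
def ldpc_accumulate(info_bits, addresses_per_bit, n_parity):
    parity = [0] * n_parity
    for m, bit in enumerate(info_bits):
        for addr in addresses_per_bit[m]:
            parity[addr] ^= bit      # XOR-accumulate info bit into parity bit
    return parity

# Stand-in table: i0 accumulates at parity addresses {0, 3}, i1 at {1, 3}.
parity = ldpc_accumulate([1, 1], [(0, 3), (1, 3)], n_parity=4)
assert parity == [1, 1, 0, 0]
```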
  • FIG. 20 illustrates a time interleaving according to an embodiment of the present invention.
  • the time interleaver operates at the DP level.
  • the parameters of time interleaving (TI) may be set differently for each DP.
  • DP_TI_TYPE (allowed values: 0 or 1): Represents the TI mode; ‘0’ indicates the mode with multiple TI blocks (more than one TI block) per TI group. In this case, one TI group is directly mapped to one frame (no inter-frame interleaving). ‘1’ indicates the mode with only one TI block per TI group. In this case, the TI block may be spread over more than one frame (inter-frame interleaving).
  • DP_NUM_BLOCK_MAX (allowed values: 0 to 1023): Represents the maximum number of XFECBLOCKs per TI group.
• DP_FRAME_INTERVAL (allowed values: 1, 2, 4, 8): Represents the number of frames I_JUMP between two successive frames carrying the same DP of a given PHY profile.
  • DP_TI_BYPASS (allowed values: 0 or 1): If time interleaving is not used for a DP, this parameter is set to ‘1’. It is set to ‘0’ if time interleaving is used.
  • the parameter DP_NUM_BLOCK from the PLS2-DYN data is used to represent the number of XFECBLOCKs carried by one TI group of the DP.
  • each TI group is a set of an integer number of XFECBLOCKs and will contain a dynamically variable number of XFECBLOCKs.
• the number of XFECBLOCKs in the TI group of index n is denoted by NxBLOCK_Group(n) and is signaled as DP_NUM_BLOCK in the PLS2-DYN data.
• NxBLOCK_Group(n) may vary from the minimum value of 0 to the maximum value NxBLOCK_Group_MAX (corresponding to DP_NUM_BLOCK_MAX), of which the largest value is 1023.
  • Each TI group is either mapped directly onto one frame or spread over PI frames.
• Each TI group is also divided into more than one TI block (NTI), where each TI block corresponds to one usage of the time interleaver memory.
  • the TI blocks within the TI group may contain slightly different numbers of XFECBLOCKs. If the TI group is divided into multiple TI blocks, it is directly mapped to only one frame. There are three options for time interleaving (except the extra option of skipping the time interleaving) as shown in the below table 33.
• Each TI group contains one TI block and is mapped to more than one frame; in this case, DP_TI_TYPE is set to ‘1’.
  • Each TI group is divided into multiple TI blocks and is mapped directly to one frame as shown in (c).
  • Each TI block may use full TI memory, so as to provide the maximum bit-rate for a DP.
  • the time interleaver will also act as a buffer for DP data prior to the process of frame building. This is achieved by means of two memory banks for each DP. The first TI-block is written to the first bank. The second TI-block is written to the second bank while the first bank is being read from and so on.
  • the TI is a twisted row-column block interleaver.
  • FIG. 21 illustrates the basic operation of a twisted row-column block interleaver according to an embodiment of the present invention.
• FIG. 21 ( a ) shows a writing operation in the time interleaver and FIG. 21 ( b ) shows a reading operation in the time interleaver.
  • the first XFECBLOCK is written column-wise into the first column of the TI memory, and the second XFECBLOCK is written into the next column, and so on as shown in (a).
  • cells are read out diagonal-wise.
• Nr cells are read out as shown in (b).
• the reading process in such an interleaving array is performed by calculating the row index Rn,s,i, the column index Cn,s,i, and the associated twisting parameter Tn,s,i, as in the following expression.
• NxBLOCK_TI(n,s) is determined by NxBLOCK_TI_MAX given in the PLS2-STAT, as in the following expression.
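• The write/read pattern above can be sketched as follows: XFECBLOCK cells are written column-wise into an Nr x Nc array and read out diagonal-wise with a row-dependent column twist. This sketch ignores the virtual-cell handling for variable NxBLOCK_TI and uses a plain wrap-around diagonal, so it is a simplified model rather than the normative interleaver.

```python
# Hedged sketch of a twisted row-column block interleaver.
def twisted_block_interleave(cells, n_rows, n_cols):
    assert len(cells) == n_rows * n_cols
    # (a) write column-wise: one XFECBLOCK per column
    mem = [[cells[c * n_rows + r] for c in range(n_cols)] for r in range(n_rows)]
    # (b) read diagonal-wise with a row-dependent column twist
    out = []
    for s in range(n_cols):            # start column of each diagonal
        for r in range(n_rows):
            c = (s + r) % n_cols       # twist: shift one column per row
            out.append(mem[r][c])
    return out

out = twisted_block_interleave(list(range(12)), n_rows=4, n_cols=3)
assert sorted(out) == list(range(12))  # a permutation: nothing is lost
```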
  • FIG. 22 illustrates an operation of a twisted row-column block interleaver according to another embodiment of the present invention.
  • the number of TI groups is set to 3.
• the maximum number of XFECBLOCKs is signaled in the PLS2-STAT data by NxBLOCK_Group_MAX, which leads to
  • FIG. 23 illustrates a diagonal-wise reading pattern of a twisted row-column block interleaver according to an embodiment of the present invention.
  • FIG. 24 illustrates interleaved XFECBLOCKs from each interleaving array according to an embodiment of the present invention.
  • FIG. 25 illustrates a protocol stack for providing broadcast services according to an embodiment of the present invention.
  • Broadcast services according to an embodiment of the present invention may provide additional services such as HTML5 applications, interactive service, ACR service, second screen service and personalization service as well as audio/video (A/V) data.
  • the broadcast services according to an embodiment of the present invention may provide not only real-time (RT) services but also non-real-time (NRT) services.
  • RT real-time
  • NRT non-real-time
• content for RT services is transmitted in real time.
• content for NRT services is transmitted in non-real time.
  • content for RT services may be transmitted at a time when the content for RT services is used.
  • Content for NRT services may be transmitted prior to a time when the content for NRT services is used.
  • a broadcast receiving apparatus may previously receive and store content for NRT services and then use the stored content while providing the NRT services.
  • the broadcast receiving apparatus may previously receive and store the content for the NRT services and provide the NRT services using the stored content when user input for the NRT services is received.
  • NRT services and RT services have different transport characteristics and thus may be transmitted through different transport protocols.
  • Content for NRT services may be referred to as NRT data.
  • Such broadcast services may be transmitted through a broadcast network using ground waves, cables, satellites or the like.
  • the broadcast network using ground waves, cables, satellites or the like may be referred to as a physical layer.
• content for the NRT services may be transmitted through a data carousel.
  • a broadcast transmitting apparatus may periodically transmit NRT content at predetermined intervals and a broadcast receiving apparatus may receive data after waiting a data rotation period. Accordingly, even when the broadcast receiving apparatus receives a broadcast service during transmission of content, the broadcast receiving apparatus may receive content, transmitted before reception of the broadcast service, in the next period. Therefore, the broadcast receiving apparatus may also receive NRT services through a broadcast network corresponding to unidirectional communication.
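• The waiting behaviour described above can be made concrete with a one-line computation: a receiver that tunes in part-way through a carousel cycle waits at most one rotation period for any missed item. The sketch below assumes a fixed, known rotation period, which is an illustrative simplification.

```python
def wait_until_cycle_restart(period_s: float, tune_in_offset_s: float) -> float:
    """Time until the carousel begins retransmitting the missed content."""
    return (period_s - tune_in_offset_s % period_s) % period_s

# Tuning in 12 s into a 30 s carousel: the missed content returns in 18 s.
assert wait_until_cycle_restart(30.0, 12.0) == 18.0
```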
  • the broadcast services according to an embodiment of the present invention may be transmitted through Internet (broadband).
  • the broadcast transmitting apparatus may encapsulate a broadcast service according to IP (Internet protocol) and transmit the encapsulated broadcast service through a broadcast network. Accordingly, when the broadcast service is transmitted through a broadcast network using ground waves, cables, satellites or the like, the broadcast receiving apparatus may demodulate a broadcast signal to extract IP packets.
  • the broadcast receiving apparatus may extract user datagram protocol (UDP) packets from the IP packets.
• the broadcast receiving apparatus may extract asynchronous layered coding/layered coding transport (ALC/LCT) packets based on a real-time object delivery over unidirectional transport (ROUTE) protocol from the UDP packets.
  • the ROUTE protocol is an application layer protocol for transmitting RT data using ALC/LCT packets.
  • the broadcast receiving apparatus may extract at least one of broadcast service signaling information, NRT data and media content from the ALC/LCT packets.
  • the media content may have MPEG-DASH (Dynamic Adaptive Streaming over HTTP) format.
  • the media content may be encapsulated into ISO base media file format (ISO BMFF) and transmitted through MPEG-DASH protocol.
  • the broadcast receiving apparatus may extract MPEG-DASH segments from ROUTE packets.
  • the broadcast receiving apparatus may extract ISO BMFF files from the MPEG-DASH segments.
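• The first de-encapsulation steps of the broadcast path above (IP, then UDP, then the application payload carrying ALC/LCT packets) can be sketched with fixed-format header stripping, as below. ROUTE object recovery and ISO BMFF extraction are deliberately left out, since their wire formats go beyond a short example; the hand-built packet is purely illustrative.

```python
# Hedged sketch: strip IPv4 and UDP headers to reach the ALC/LCT payload.
import struct

def strip_ipv4(packet: bytes) -> bytes:
    assert packet[0] >> 4 == 4, "not IPv4"
    ihl = (packet[0] & 0x0F) * 4        # IPv4 header length in bytes
    return packet[ihl:]

def strip_udp(datagram: bytes) -> bytes:
    length = struct.unpack("!H", datagram[4:6])[0]  # UDP length field
    return datagram[8:length]           # skip the 8-byte fixed UDP header

# A hand-built IPv4+UDP packet whose payload would be an ALC/LCT packet.
payload = b"ALC/LCT bytes"
udp = struct.pack("!HHHH", 5000, 5000, 8 + len(payload), 0) + payload
ip = bytes([0x45]) + bytes(19) + udp    # minimal 20-byte IPv4 header
assert strip_udp(strip_ipv4(ip)) == payload
```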
• the broadcast transmitting apparatus may transmit broadcast services encapsulated in MPEG-2 TS, along with broadcast services encapsulated in IP, through a broadcast network for legacy broadcast receiving apparatuses.
  • the broadcast transmitting apparatus may multicast or unicast the broadcast service.
  • the broadcast receiving apparatus may receive IP packets from the broadband.
  • the broadcast receiving apparatus may extract TCP packets from the IP packets.
  • the broadcast receiving apparatus may extract HTTP packets from the TCP packets.
  • the broadcast receiving apparatus may extract at least one of broadcast service signaling information, NRT data and media content.
  • the media content may be in MPEG-DASH (Dynamic Adaptive Streaming over HTTP) format.
  • the media content may be encapsulated into ISO base media file format (ISO BMFF) and transmitted through MPEG-DASH protocol.
• the broadcast receiving apparatus may extract MPEG-DASH segments from the HTTP packets.
  • the broadcast receiving apparatus may extract ISO BMFF files from the MPEG-DASH segments.
  • a broadcast receiving apparatus that receives broadcast services transmitted according to the protocol stack for providing broadcast services will be described with reference to FIG. 26 .
  • FIG. 26 is a block diagram of a broadcast transmitting apparatus for transmitting broadcast services, a content server for transmitting content related to broadcast services, a broadcast receiving apparatus for receiving broadcast services and a companion apparatus interoperating with the broadcast receiving apparatus according to an embodiment of the present invention.
  • the broadcast transmitting apparatus 10 transmits broadcast services. Specifically, the broadcast transmitting apparatus 10 transmits broadcast services including media content through a broadcast network using at least one of a satellite, ground waves and cable. More specifically, the broadcast transmitting apparatus 10 may include a controller (not shown) and a transmitting unit (not shown). The controller controls operations of the broadcast transmitting apparatus 10 . The transmitting unit transmits broadcast signals.
  • the broadcast receiving apparatus 100 receives broadcast services.
  • the broadcast receiving apparatus 100 may include a broadcast receiving unit 110 , an IP transceiver 130 , a controller 150 , a display unit 180 and a power supply 190 .
  • the broadcast receiving unit 110 receives broadcast signals through a broadcast network.
  • the broadcast receiving unit 110 may receive broadcast signals through broadcast network using at least one of a satellite, ground waves and cable. More specifically, the broadcast receiving unit 110 may include a tuner for receiving broadcast signals.
  • the broadcast receiving unit 110 may include a demodulator for demodulating broadcast signals to extract link layer data.
  • the broadcast receiving unit 110 may include one or more processors for respectively executing a plurality of functions of the broadcast receiving unit 110 , one or more circuits, and one or more hardware modules.
  • the controller may be a system-on-chip (SOC) into which various semiconductor components are integrated.
  • the SOC may be a semiconductor chip into which various components for multimedia such as graphics, audio, video and modem, processors, DRAMs and so on are integrated.
• the IP transceiver 130 may transmit and receive IP data. Specifically, the IP transceiver 130 may transmit a request to a content server 50 that provides content related to broadcast services. In addition, the IP transceiver 130 may receive a response to the request from the content server 50. The IP transceiver 130 may transmit data to the companion apparatus 300 and receive data from the companion apparatus 300. Specifically, the IP transceiver 130 may transmit a request to the companion apparatus 300 and receive a response to the request from the companion apparatus 300. In addition, the IP transceiver 130 may transmit content related to broadcast services to the companion apparatus 300.
  • the IP transceiver 130 may include one or more processors for respectively executing a plurality of functions of the IP transceiver 130 , one or more circuits, and one or more hardware modules.
  • the controller may be an SOC into which various semiconductor components are integrated.
  • the controller 150 may include a main processor 151 , a user input receiver 153 , a memory unit 155 , a storage 157 and a multimedia module 159 .
  • the main processor 151 controls the overall operation of the controller.
  • the user input receiver 153 receives user input.
  • the memory unit 155 temporarily stores data for the operation of the controller 150 .
  • the memory unit 155 may be a volatile memory.
  • the storage 157 stores data necessary for the operation of the controller 150 .
  • the storage 157 may be a nonvolatile memory.
  • the multimedia module 159 processes media content.
  • the controller 150 may include one or more processors for respectively executing a plurality of functions of the controller 150 , one or more circuits, and one or more hardware modules.
  • the controller may be an SOC into which various semiconductor components are integrated.
  • the display unit 180 displays images.
  • the power supply 190 supplies power necessary for operations of the broadcast receiving apparatus 100 .
  • the companion apparatus 300 interoperates with the broadcast receiving apparatus 100 . Specifically, the companion apparatus 300 provides information about broadcast services received by the broadcast receiving apparatus 100 by interoperating with the broadcast receiving apparatus 100 .
  • the companion apparatus 300 may include an IP transceiver 330 , a controller 350 , a display unit 380 and a power supply 390 .
• the IP transceiver 330 may transmit and receive IP data. Specifically, the IP transceiver 330 may transmit a request to the content server 50 that provides content related to broadcast services. In addition, the IP transceiver 330 may receive a response to the request from the content server 50. The IP transceiver 330 may transmit data to the broadcast receiving apparatus 100 and receive data from the broadcast receiving apparatus 100. Specifically, the IP transceiver 330 may transmit a request to the broadcast receiving apparatus 100 and receive a response to the request from the broadcast receiving apparatus 100. In addition, the IP transceiver 330 may receive content related to broadcast services from the broadcast receiving apparatus 100.
  • the IP transceiver 330 may include one or more processors for respectively executing a plurality of functions of the IP transceiver 330 , one or more circuits, and one or more hardware modules.
• the IP transceiver 330 may be an SOC into which various semiconductor components are integrated.
  • the controller 350 may include a main processor 351 , a user input receiver 353 , a memory unit 355 , a storage 357 and a multimedia module 359 .
• the main processor 351 controls the overall operation of the controller 350.
  • the user input receiver 353 receives user input.
  • the memory unit 355 temporarily stores data for the operation of the controller 350 .
  • the memory unit 355 may be a volatile memory.
  • the storage 357 stores data necessary for the operation of the controller 350 .
  • the storage 357 may be a nonvolatile memory.
  • the multimedia module 359 processes media content.
  • the controller 350 may include one or more processors for respectively executing a plurality of functions of the controller 350 , one or more circuits, and one or more hardware modules.
• the controller 350 may be an SOC into which various semiconductor components are integrated.
  • the display unit 380 displays images.
  • the power supply 390 supplies power necessary for operations of the companion apparatus 300 .
• ESG data may include the start times, end times, titles and summaries of content, parental ratings, genres and information on performers appearing in programs.
  • the ESG data may include at least one of provision information indicating provision for viewing content, information for interactivity service, purchase information related to content and information for accessing content.
  • the broadcast transmitting apparatus 10 may structure ESG data in units of information and transmit the structured information.
  • the broadcast transmitting apparatus 10 may classify the ESG data by types of information included in the ESG data and transmit the ESG data.
  • the broadcast transmitting apparatus 10 may classify the ESG data into ESG data including information indicating broadcast services, ESG data including information indicating one or more content included in a broadcast service, and ESG data including information indicating the schedule of at least one of a service and content and transmit the classified ESG data.
  • the unit of information may be referred to as a service guide fragment or a fragment.
  • the ESG data including information indicating broadcast services may be referred to as a service fragment.
  • the ESG data including information indicating one or more content included in a broadcast service may be referred to as a content fragment.
  • the ESG data including information indicating the schedule of at least one of a service and content may be referred to as a schedule fragment.
• the broadcast receiving apparatus 100 may process ESG data based on the unit of structured information. Specifically, the broadcast receiving apparatus 100 may process ESG data based on at least one of the service fragment, content fragment and schedule fragment. When the broadcast receiving apparatus 100 processes data based on a fragment in this manner, the broadcast receiving apparatus 100 may selectively process necessary data. Accordingly, the broadcast receiving apparatus 100 may selectively request necessary data through a communication network available for interactive communication. In addition, the broadcast receiving apparatus 100 may efficiently transmit the data that the companion apparatus 300 needs to the companion apparatus 300.
  • the service fragment will be described first with reference to FIGS. 27 to 31 .
  • the service in the service fragment refers to a broadcast service.
  • the service may be a set of content items constituting a broadcast service.
  • the service may be a set of broadcast programs transmitted through a logical transmission channel by one broadcaster.
  • the service fragment forms a center point referred to by other fragments included in an ESG.
  • the broadcast receiving apparatus 100 may display a broadcast service represented by the service fragment while displaying the ESG. More specifically, the broadcast receiving apparatus 100 may chronologically arrange and display content included in broadcast services for the broadcast services.
• the broadcast receiving apparatus 100 may display what the corresponding content contains, how it may be viewed and when it may be viewed.
  • Services may have various service types.
  • a service may be an interactive service.
  • a service may be a unidirectional service through a broadcast network.
  • a service may include an interactive service and a unidirectional service.
  • a service may include one or more components depending on service type.
• the service may include a component for service functionality, which is not directly related to content included in the service.
• a service may include purchase information for purchasing content included in the service or the service itself.
• a service may include subscription information for subscribing to content included in the service or to the service itself. The various service types that may be represented by service fragments will be described with reference to FIG. 27.
  • FIG. 27 shows values of a serviceType element included in a service fragment and types of services indicated by the values according to an embodiment of the present invention.
  • FIG. 28 shows an XML format of the serviceType element included in the service fragment.
  • the service fragment may include an element representing service types of the service fragment.
  • the broadcast receiving apparatus 100 may display broadcast services to a user based on the element representing service types of the service fragment. Specifically, the broadcast receiving apparatus 100 may display an ESG based on the element representing service types of the service fragment. For example, the broadcast receiving apparatus 100 may display service types as characters depending on the element representing service types of the service fragment. In addition, the broadcast receiving apparatus 100 may display service types as icons depending on the element representing service types of the service fragment. Furthermore, the broadcast receiving apparatus 100 may display service types as graphics depending on the element representing service types of the service fragment.
  • the element representing service types may indicate at least one of a basic TV service, a basic radio service, a rights issuer service, cachecast, a file download service, software management services, a notification service, a service guide service, terminal provisioning services, an auxiliary data service, a streaming on demand service, a file download on demand service, smartcard provisioning services, a linear service, an App-based service and a companion screen service.
  • the basic TV service represents a service including video.
  • the basic radio service represents a service including audio without video.
  • the rights issuer service indicates a service that issues the right of digital right management (DRM) managing the right to present content.
• the cachecast represents a non-real-time (NRT) service in which a broadcast service is downloaded in advance of its reproduction and then presented.
  • the cachecast may be called an NRT service.
  • the file download services refer to services that require file download for service reproduction.
  • the software management services are services for managing software. Specifically, the software management services may represent services for updating software of the broadcast receiving apparatus 100 .
  • the notification service indicates a service for signaling notification to the broadcast receiving apparatus 100 .
  • a broadcast service provider or a content provider may deliver a message to the broadcast receiving apparatus 100 through the notification service.
• the service guide service provides a service guide. Specifically, the service guide service may represent a service through which the broadcast receiving apparatus 100 receives ESG data for broadcast services.
• the terminal provisioning services represent services for provisioning a terminal to subscribe to a service or content. Specifically, the broadcast receiving apparatus 100 may update provisioning through the terminal provisioning services to present broadcast services.
  • the auxiliary data service provides auxiliary data related to a broadcast service.
• the streaming on demand service provides a streaming service at the request of a user.
  • the file download on demand service provides file download at the request of a user.
  • the smartcard provisioning services may be services for updating smartcard provisioning.
  • the broadcast receiving apparatus 100 may use a smartcard that descrambles scrambled content in order to present scrambled content.
  • the broadcast receiving apparatus 100 may update provisioning of the smartcard through the smartcard provisioning services.
  • the linear service may be a service including continuous components of which primary content is consumed depending on the schedule and time base set by a broadcaster.
  • the continuous components may be content components presented in continuous streams.
  • the App-based service is a non-linear service that provides user interfaces and functions based on applications.
  • the companion screen service is a service by which the broadcast receiving apparatus 100 receiving broadcast services and the companion apparatus 300 interoperating with the broadcast receiving apparatus 100 provide broadcast services together.
• broadcast services may be combinations of a plurality of services.
• for example, broadcast services may be combinations of a linear service and an App-based service that provides additional services through applications.
  • a service fragment may include a plurality of elements respectively representing different service types.
• in a specific embodiment, the value of ServiceType may indicate the type of the corresponding service as follows (a hypothetical XML sketch using two of these values follows this list):
• value 1: basic TV service; value 2: basic radio service; value 3: rights issuer service; value 4: cachecast service; value 5: file download services; value 6: software management services; value 7: notification service; value 8: service guide service; value 9: terminal provisioning services; value 10: auxiliary data service; value 11: streaming on demand service; value 12: file download on demand service; value 13: smartcard provisioning services; value 14: linear service; value 15: App-based service; value 16: companion screen service.
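As a concrete illustration of the mapping above, the following is a minimal, hypothetical sketch of a service fragment carrying two of these values to signal a combined service, in the spirit of FIG. 30; the element nesting and the fragment identifier are assumptions for illustration rather than the normative format:

    <Service id="bcast://lge.com/Service/1" version="1">
      <!-- 14: linear service -->
      <ServiceType>14</ServiceType>
      <!-- 15: App-based service providing additional services through applications -->
      <ServiceType>15</ServiceType>
    </Service>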
  • the service fragment may include an element representing the range of service types of the corresponding service.
  • the element representing the range of service types of the corresponding service may include a maximum value and a minimum value of the element representing service types.
  • the broadcast transmitting apparatus 10 may signal the range of broadcast services to the broadcast receiving apparatus 100 through the element representing the range of service types of the corresponding service.
  • the broadcast receiving apparatus 100 may determine types available for the corresponding service based on the element representing the range of service types of the corresponding service.
  • the element representing the range of service types of the corresponding service may be referred to as ServiceTypeRangeType.
  • the minimum value of the element representing service types may be referred to as a minInclusive value.
  • the maximum value of the element representing service types may be referred to as a maxInclusive value.
• in the illustrated embodiment, ServiceTypeRangeType has a minInclusive value of 0 and a maxInclusive value of 13. Accordingly, the broadcast receiving apparatus 100 may determine that services represented by the corresponding service fragment do not correspond to the linear service, the App-based service or the companion screen service.
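Such a range could be expressed, for example, as an XML Schema simple type along the following lines; this is a sketch in which the numeric base type is an assumption, and the values correspond to the illustrated embodiment:

    <xs:simpleType name="ServiceTypeRangeType">
      <xs:restriction base="xs:unsignedByte">
        <!-- minInclusive value of 0 -->
        <xs:minInclusive value="0"/>
        <!-- maxInclusive value of 13: the linear, App-based and companion
             screen services (values 14 to 16) fall outside the signaled range -->
        <xs:maxInclusive value="13"/>
      </xs:restriction>
    </xs:simpleType>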
  • the broadcast receiving apparatus 100 may display services based on service types represented by the service fragment. Specifically, the broadcast receiving apparatus 100 may display services based on the element representing service types included in the service fragment. In a specific embodiment, the broadcast receiving apparatus 100 may display service types based on service types represented by the service fragment through a menu that indicates a broadcast service guide.
  • the menu indicating the broadcast service guide may be a menu indicating a plurality of broadcast services and content respectively included in the plurality of broadcast services.
  • the broadcast receiving apparatus 100 may display service types based on service types represented by the service fragment through a menu that indicates information of corresponding broadcast services on a screen displaying broadcast services.
  • the broadcast receiving apparatus 100 may display service names, service provision schedules and service types represented by the service fragment in the form of a bar positioned at the lower or upper part of the screen while reproducing a broadcast service.
  • the broadcast receiving apparatus 100 may display service types based on service types represented by the service fragment through a service list that indicates broadcast services and information representing the broadcast services.
  • the broadcast receiving apparatus 100 may display service types represented by the service fragment along with the names of corresponding services and virtual channel numbers indicating the corresponding services while displaying the service list.
  • FIG. 29 illustrates XML data indicating the serviceType element and a user interface when a service represented by the service fragment is the linear service according to an embodiment of the present invention.
  • the service fragment indicates that the corresponding service is the linear service.
  • the broadcast receiving apparatus 100 may display a start time and an end time of the linear service while reproducing the service because main content included in the linear service has the start time and the end time designated by a broadcaster. Accordingly, the broadcast receiving apparatus 100 may notify a user of the start time and end time of the corresponding service based on the service fragment of ESG data without additional information.
  • FIG. 30 illustrates XML data indicating the serviceType element and a user interface when a service represented by the service fragment is the linear service and the App-based service according to an embodiment of the present invention.
  • the service fragment indicates that the corresponding service is the linear service and the App-based service.
  • the broadcast receiving apparatus 100 may display that there is an application included in the corresponding service while reproducing the service.
  • the broadcast receiving apparatus 100 may execute the corresponding application. Accordingly, the broadcast receiving apparatus 100 may display that there is the application related to the service based on the service fragment of ESG data without additional information.
  • FIG. 31 illustrates XML data indicating the serviceType element and a user interface when a service represented by the service fragment is the linear service and the companion screen service according to an embodiment of the present invention.
  • the service fragment indicates that the corresponding service is the linear service and the companion screen service.
  • the broadcast receiving apparatus 100 may display that the corresponding service can be provided through the companion apparatus 300 while reproducing the service. Accordingly, the broadcast receiving apparatus 100 may display that the corresponding service can be provided through the companion apparatus 300 based on the service fragment of ESG data without additional information.
  • a broadcast service may include one or more components constituting the broadcast service.
  • a component may constitute media content included in the broadcast service.
  • the component may provide a specific function or information related to the broadcast service.
  • an application that provides information related to the broadcast service may be a component of the broadcast service.
  • hybrid broadcast may include various components. Furthermore, components included in broadcast services may vary according to content included in the broadcast services.
  • the broadcast receiving apparatus 100 may selectively present various components. Accordingly, it is necessary to indicate components through ESG data such that a user may select a broadcast service and content included in the broadcast service by viewing component information.
• when ESG data does not indicate components of broadcast services, the broadcast receiving apparatus 100 needs to acquire information about the components through broadcast service signaling information that signals a broadcast service that is currently broadcast. Accordingly, it is necessary for the broadcast receiving apparatus 100 to efficiently acquire information about components included in broadcast services and to obtain information about components to be broadcast as well as currently broadcast components.
  • the broadcast receiving apparatus 100 needs to acquire a correct time when a component is broadcast. Further, the broadcast receiving apparatus 100 needs to acquire information about components that will be broadcast.
  • ESG data includes component fragments indicating components included in broadcast services, which will be described with reference to the attached drawings.
  • FIG. 32 illustrates an XML format of the component fragment according to an embodiment of the present invention.
  • ESG data may include the component fragment.
  • the component fragment includes information about components corresponding to part of a broadcast service or content.
  • the component fragment may include an identifier element for identifying the component fragment.
  • the component fragment may include a version element indicating whether a component is changed.
  • the component fragment may include an element indicating a valid period of the component fragment.
  • the element indicating the valid period of the component fragment may include a start time and an end time of the valid period.
  • the component fragment may include a component type element indicating component type. Since one component may include a plurality of properties, one component fragment may include a plurality of component type elements.
  • the component fragment may include a component data element indicating data types included in a component.
  • the component fragment may include an extension element reserved for future extension and specific applications.
  • the identifier element may be “id” and the version element may be “version.”
  • the element indicating the start time of the valid period of the component fragment may be “validFrom” and the end time of the valid period may be “validTo.”
• the component type element may be "ComponentType."
  • the component data element may be “ComponentData.”
  • the extension element may be “PrivateExt”.
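Putting the element names above together, a component fragment could take a shape along the following lines; the identifier, date-time values and the use of attributes rather than child elements are assumptions for compactness, and the normative format is the one shown in FIG. 32:

    <Component id="bcast://lge.com/Component/1" version="1"
               validFrom="2015-04-01T00:00:00Z" validTo="2015-04-30T23:59:59Z">
      <!-- 1: continuous component (see FIG. 33 for the full set of types) -->
      <ComponentType>1</ComponentType>
      <!-- data types included in the component (see FIG. 35) -->
      <ComponentData/>
      <!-- reserved for future extension and specific applications -->
      <PrivateExt/>
    </Component>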
• Component types that can be represented by the component type element will be described with reference to FIG. 33.
  • FIG. 33 shows component types that can be represented by the component fragment according to an embodiment of the present invention.
  • the component fragment may indicate various types of components. Specifically, the component fragment may indicate component types through the component type element included therein. Accordingly, the broadcast receiving apparatus 100 may recognize the type of a corresponding component.
  • the component fragment may indicate a continuous component.
  • the continuous component is a component presented in a continuous stream.
  • the continuous component may be one of audio, video and closed captioning.
  • the component type element may have a value of 1 when the component fragment indicates a continuous component.
  • the component fragment may indicate an elementary component.
  • the elementary component is a continuous component that is a single encoding.
  • the elementary component may be an audio component. Specifically, the elementary component may be a single encoding of a sound sequence.
  • the elementary component may be a video component. Specifically, the elementary component may be a single encoding of a picture sequence.
  • the elementary component may be a closed caption track.
  • the component type element may have a value of 2.
  • the component fragment may indicate a composite component.
  • the composite component is a collection of a plurality of continuous components necessary to present one scene.
• the composite component is a collection of continuous components which have the same content type and represent the same scene, and which are to be combined to produce a presentation.
  • the composite component is a collection of a plurality of media components combined to represent one scene.
  • the composite component may be music, dialog and special effects necessary for complete audio.
  • the composite component may be right and left 3D views necessary to present 3D pictures.
  • the component type element may have a value of 3.
  • the component fragment may indicate a PickOne component.
  • the PickOne component is a collection of a plurality of alternative continuous components which represent one scene.
  • “PickOne” represents that one of a plurality of alternative continuous components may be selected and presented.
  • the PickOne component is a collection of a plurality of continuous components which have the same media type and represent the same scene, and one of which is selected to produce a presentation.
  • the PickOne component may be a collection of a plurality of media components encoded from the same content with different qualities.
  • the PickOne component may be a set of audio components encoded from the same sound sequence with different bitrates.
  • the PickOne component may be a set of video components encoded from the same picture sequence with different bitrates.
  • the PickOne component may be a regular closed caption track and an “easy reader” closed caption track for the same dialog.
  • the component type element may have a value of 4.
  • the component fragment may indicate a complex component.
  • the complex component indicates either a composite component or a PickOne component.
  • the component type element may have a value of 5.
  • the component fragment may indicate a presentable component.
• the presentable component refers to a continuous component that may actually be presented by the broadcast receiving apparatus 100.
  • the presentable component may be an elementary component.
  • the presentable component may be a complex component.
  • the component type element may have a value of 6.
  • the component fragment may indicate that a component is a non-real-time (NRT) file.
• an NRT file is a file delivered in non-real time.
  • the NRT file may refer to a file which is previously downloaded by the broadcast receiving apparatus 100 before being executed and which is executed at an execution time.
  • the component type element may have a value of 7.
  • the component fragment may indicate that a component is an NRT content item.
• An NRT content item is a collection of NRT files which a service provider intends to be consumed as an integrated whole.
  • the component type element may have a value of 8.
  • the component fragment may indicate that a component is an application.
  • An application may be a collection of documents constituting an enhanced or interactive service. Specifically, documents may include at least one of HTML, JavaScript, CSS, XML and multimedia files. An application may be regarded as an NRT content item.
  • the component type element may have a value of 9.
  • the component fragment may indicate that a component is an ATSC 3.0 application.
• An ATSC 3.0 application represents an application executed in environments conforming to the ATSC 3.0 specification.
  • the component type element may have a value of 10.
  • the component fragment may indicate that a component is an on demand component.
  • An on demand component represents a component that is delivered on demand.
  • the component type element may have a value of 11.
  • the component fragment may indicate that a component is a notification stream.
  • a notification stream delivers notifications to synchronize actions of applications with an underlying linear time base.
  • the component type element may have a value of 12.
  • the component fragment may indicate that a component is an App-based enhancement.
  • An App-based enhancement may include one or more notification streams.
  • An App-based enhancement may include one or more applications.
  • An App-based enhancement may include an NRT content item.
  • the NRT content item may be executed by an application.
  • An App-based enhancement may include an on demand component.
  • the on demand component may be managed by an application.
  • FIG. 34 shows an XML format of a ComponentRangeType element included in the component fragment according to an embodiment of the present invention.
  • the component fragment may include an element indicating the range of component types that a corresponding component may have.
  • the element indicating the range of component types that a corresponding component may have may include a minimum value and a maximum value of values that the element indicating component types may have.
• the broadcast transmitting apparatus 10 may notify the broadcast receiving apparatus 100 of the range of component types through the element indicating the range of component types that a corresponding component may have.
  • the broadcast receiving apparatus 100 may determine component types of the corresponding component based on the element indicating the range of component types that the corresponding component may have.
  • the element indicating the range of component types that the corresponding component may have may be referred to as ComponentRangeType.
• the minimum value of the element indicating component types may be referred to as a minInclusive value and the maximum value thereof may be referred to as a maxInclusive value.
• in the illustrated embodiment, ComponentRangeType has a minInclusive value of 0 and a maxInclusive value of 13. Accordingly, the broadcast receiving apparatus 100 may determine that the component indicated by the corresponding component fragment corresponds to one of the aforementioned component types.
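By analogy with ServiceTypeRangeType, the component type range could be sketched as an XML Schema simple type; the base type is an assumption for illustration:

    <xs:simpleType name="ComponentRangeType">
      <xs:restriction base="xs:unsignedByte">
        <xs:minInclusive value="0"/>
        <xs:maxInclusive value="13"/>
      </xs:restriction>
    </xs:simpleType>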
• the component type element indicating component types may not represent content types included in a component.
• if the component type element also represented content types, too many values would need to be defined as values that the component type element may have. Accordingly, it is necessary to define a component data element indicating content types included in a component. This will be described with reference to FIG. 35.
  • FIG. 35 shows an XML format of a ComponentData element included in the component fragment according to an embodiment of the present invention.
  • the component fragment may include the component data element indicating content types included in a component.
• the component data element may represent that a component is a video component including video.
• the component data element may represent that a component is an audio component including audio.
• the component data element may represent that a component is a closed captioning component including closed captioning.
  • the component data element may include a lower element depending on content type included in a component.
  • an element indicating that a component is a video component may be VideoComponent.
  • an element indicating that a component is an audio component may be AudioComponent.
  • an element indicating that a component is a closed captioning component may be CCComponent.
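Under these names, the component data element could be modeled, for example, as a choice among the three content types, assuming lower-element types named after the figures that follow (VideoDataType, AudioDataType, CCDataType); treating it as an exclusive choice is an assumption for illustration:

    <xs:complexType name="ComponentData">
      <xs:choice>
        <xs:element name="VideoComponent" type="VideoDataType"/>
        <xs:element name="AudioComponent" type="AudioDataType"/>
        <xs:element name="CCComponent" type="CCDataType"/>
      </xs:choice>
    </xs:complexType>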
  • the lower element included in the component data element will be described in detail with reference to the attached drawings.
  • FIG. 36 shows an XML format of a VideoDataType element included in the component fragment according to an embodiment of the present invention.
  • the component data element may include a video role element specifying the role of the corresponding video.
  • the value of the video role element may be an integer.
  • the video role element may represent a default video when the component is a presentable component.
  • the video role element may have a value of 1.
  • the video role element may represent an alternative camera view when the component is a presentable component.
  • the video role element may have a value of 2.
  • the video role element may represent an alternative video component when the component is a presentable component.
  • the video role element may have a value of 3.
  • the video role element may represent a sign language inset when the component is a presentable component.
  • the video role element may have a value of 4.
  • the video role element may represent a follow subject video when the component is a presentable component.
  • the video role element may have a value of 5.
  • the video role element may represent a base layer for scalable video encoding.
  • the video role element may have a value of 6.
  • the video role element may represent an enhancement layer for scalable video encoding when the component is a composite component.
  • the video role element may have a value of 7.
  • the video role element may represent a 3D video left view when the component is a composite component.
  • the video role element may have a value of 8.
  • the video role element may represent a 3D video right view when the component is a composite component.
  • the video role element may have a value of 9.
  • the video role element may represent 3D video depth information when the component is a composite component.
  • the video role element may have a value of 10.
  • the video role element may represent that a media component is a video at a specific position of a picture divided into a plurality of regions when the component is a composite component.
  • the video role element may have a value of 11.
  • the video role element may represent follow-subject metadata when the component is a composite component.
  • the video role element may have a value of 12.
  • the follow-subject metadata may include at least one of the name, position and size of the corresponding subject.
  • the follow-subject metadata may represent a main video component region on which the subject is focused.
  • the video role element specifying the role of a video may be referred to as VideoRole.
  • the component data element may include an element specifying the range of values that the video role element may have.
  • the role of a video component may depend on component type. Accordingly, the element specifying the range of values that the video role element may have may include an element specifying the range of values that the video role element may have when the video component is a presentable video.
  • the element specifying the range of values that the video role element may have may include an element specifying the range of values that the video role element may have when the video component is a composite video.
  • the element specifying the range of values that the video role element may have may include an element specifying the range of values that the video role element may have when the video component is a composite video or a presentable video.
  • the element specifying the range of values that the video role element may have may be VideoRoleRangeType.
  • the component data element may include a target user element representing a target user of the corresponding component.
  • the target user element may be TargetUserProfile.
  • the component data element may include a target device element indicating a target device of the corresponding component.
  • the target device element may represent at least one of a primary apparatus that receives broadcast service, a companion apparatus and inclusion of both the primary apparatus and the companion apparatus in the companion screen services.
  • the target device element may be TargetDevice.
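As an illustration of the elements above, a video component carrying a base layer for scalable video encoding could be described along the following lines; the nesting and the placeholder contents are assumptions for illustration:

    <VideoComponent>
      <!-- 6: base layer for scalable video encoding -->
      <VideoRole>6</VideoRole>
      <!-- profile of the target user of the component -->
      <TargetUserProfile/>
      <!-- e.g. the primary apparatus that receives the broadcast service -->
      <TargetDevice/>
    </VideoComponent>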
• FIG. 37 shows an XML format of an AudioDataType element included in the component fragment according to an embodiment of the present invention.
  • the component data element may include an audio role element specifying the role of an audio.
  • the value of the audio role element may be an integer.
  • the audio role element may indicate that an audio component is complete main.
  • the audio role element may have a value of 1.
  • the audio role element may indicate that an audio component is music.
  • the audio role element may have a value of 2.
  • the audio role element may indicate that an audio component is a dialog.
  • the audio role element may have a value of 3.
  • the audio role element may indicate that an audio component is effects.
  • the audio role element may have a value of 4.
• the audio role element may indicate that an audio component is for the visually impaired.
  • the audio role element may have a value of 5.
• the audio role element may indicate that an audio component is for the hearing impaired.
  • the audio role element may have a value of 6.
  • the audio role element may indicate that an audio component is a commentary.
  • the audio role element may have a value of 7.
  • the component data element may include an element specifying the range of values that the audio role element may have.
  • the element specifying the range of values that the audio role element may have may be AudioRoleRangeType.
  • the component data element may include a target user element indicating a target user of the corresponding component.
  • the target user element may be TargetUserProfile.
  • the component data element may include a target device element indicating a target device of the corresponding component.
  • the target device element may represent at least one of a primary apparatus that receives broadcast service, a companion apparatus and inclusion of both the primary apparatus and the companion apparatus in the companion screen services.
  • the target device element may be TargetDevice.
  • the component data element may include an element indicating that a component is associated with a presentable video component.
  • a component associated to a video component may refer to a component presented along with the video component.
  • a component associated to a video component may refer to a component synchronized with the video component and presented along with the video component.
  • This element may include an identifier for identifying a component fragment indicating an associated presentable video component. In a specific embodiment, this element may be associatedTo.
  • the component data element may include an element indicating the number of channels included in an audio component.
  • this element may be NumberOfAudioChannels.
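For example, a complete main audio component associated with a presentable video component could be sketched as follows; the channel count and the referenced identifier are illustrative assumptions:

    <AudioComponent>
      <!-- 1: complete main -->
      <AudioRole>1</AudioRole>
      <!-- identifier of the component fragment of the associated presentable video component -->
      <associatedTo>bcast://lge.com/Component/1</associatedTo>
      <!-- number of channels included in the audio component -->
      <NumberOfAudioChannels>2</NumberOfAudioChannels>
    </AudioComponent>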
  • FIG. 38 shows an XML format of a CCDataType element included in the component fragment according to an embodiment of the present invention.
  • the component data element may include a closed captioning role element indicating the role of a closed captioning.
  • the value of the closed captioning role element may be an integer.
  • the closed captioning role element may represent that a closed captioning component is a normal closed captioning.
  • the closed captioning role element may have a value of 1.
  • the closed captioning role element may represent that a closed captioning component is an easy-reader closed captioning for kindergarteners and elementary school students.
  • the closed captioning role element may have a value of 2.
  • the component data element may include an element specifying the range of values that the closed captioning role element may have.
  • the element specifying the range of values that the closed captioning role element may have may be CCRoleRangeType.
  • the component data element may include a target user element indicating a target user of the corresponding component.
  • the target user element may be TargetUserProfile.
  • the component data element may include a target device element indicating a target device of the corresponding component.
  • the target device element may represent at least one of a primary apparatus that receives broadcast service, a companion apparatus and inclusion of both the primary apparatus and the companion apparatus in the companion screen services.
  • the target device element may be TargetDevice.
  • the component data element may include an element indicating that a component is associated with a presentable video component. Specifically, whether a component is associated to a video component may represent whether the component is presented along with the video component.
  • a component associated to a video component may refer to a component synchronized with the video component and presented along with the video component. This element may include an identifier for identifying a component fragment indicating an associated presentable video component. In a specific embodiment, this element may be associatedTo.
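By analogy with the audio case, a normal closed captioning component tied to a presentable video component could be sketched as follows, assuming the role element is named CCRole in line with CCRoleRangeType:

    <CCComponent>
      <!-- 1: normal closed captioning -->
      <CCRole>1</CCRole>
      <!-- identifier of the component fragment of the associated presentable video component -->
      <associatedTo>bcast://lge.com/Component/1</associatedTo>
    </CCComponent>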
  • the broadcast receiving apparatus 100 may determine the role of a component based on the aforementioned component fragment. In addition, the broadcast receiving apparatus 100 may display the role of a component based on the component data element. In a specific embodiment, the broadcast receiving apparatus 100 may display the roles of components included in content in a service guide menu. The broadcast receiving apparatus 100 may display the roles of component fragments based on the component fragments in a bar-shaped menu positioned at the lower or upper part of the screen. In another specific embodiment, the broadcast receiving apparatus 100 may display the roles of components represented by service fragments through a service list indicating information about broadcast services based on the service fragments. For example, the broadcast receiving apparatus 100 may display the role of a currently broadcast component in the corresponding service along with the name of the corresponding service and a virtual channel number indicating the corresponding service while displaying the service list.
  • the broadcast receiving apparatus 100 may determine at least one of a target device and a target user profile of each component based on the component data element.
  • the broadcast receiving apparatus 100 may display the roles of components based on the component data element.
  • the broadcast receiving apparatus 100 may display information about a component in the service guide menu based on at least one of a target device and a target user profile of the component.
• in a specific embodiment, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may display the corresponding component differently from other components. For example, when the broadcast receiving apparatus 100 does not support 3D pictures, the broadcast receiving apparatus 100 may display a video component for the 3D pictures in gray in an ESG.
• in another specific embodiment, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may not display the corresponding component. In a specific embodiment, the broadcast receiving apparatus 100 may display at least one of an icon, text and graphic form indicating a target user profile of the corresponding component in the service guide menu. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may display the corresponding component differently from other components. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may not display the corresponding component in the service guide menu.
• the broadcast receiving apparatus 100 may display information about a component in a bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented based on at least one of a target device and a target user profile of the component. Specifically, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented, differently from other components. For example, when the broadcast receiving apparatus 100 does not support 3D pictures, the broadcast receiving apparatus 100 may display a video component for the 3D pictures in gray in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented.
• in another specific embodiment, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may not display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented. In a specific embodiment, the broadcast receiving apparatus 100 may display at least one of an icon, text and graphic form indicating a target user profile of the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented.
• in another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented, differently from other components. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may not display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented.
  • the broadcast receiving apparatus 100 may display information about a component in a broadcast service list based on at least one of a target device and a target user profile of the component. Specifically, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may display the corresponding component in the broadcast service list, differently from other components. For example, when the broadcast receiving apparatus 100 does not support 3D pictures, the broadcast receiving apparatus 100 may display a video component for the 3D pictures in gray in the broadcast service list. In another specific embodiment, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may not display the corresponding component in the broadcast service list.
  • the broadcast receiving apparatus 100 may display at least one of an icon, text and graphic form indicating a target user profile of the corresponding component in the broadcast service list. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may display the corresponding component in the broadcast service list, differently from other components. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may not display the corresponding component in the broadcast service list.
  • FIG. 39 shows an embodiment in which the component fragment according to an embodiment of the present invention indicates a composite video component.
  • the component fragment in the embodiment shown in FIG. 39 represents that a composite component having an identifier of bcast://lge.com/Component/1 includes an elementary component having an identifier of bcast://lge.com/Component/2 and an elementary component having an identifier of bcast://lge.com/Component/3.
  • the component data element of the component fragment represents that the elementary component having the identifier of bcast://lge.com/Component/2 is a base layer for scalable video encoding and the elementary component having the identifier of bcast://lge.com/Component/3 is an enhancement layer for scalable video encoding.
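The relationship described for FIG. 39 could be rendered as three component fragments along the following lines; how the composite fragment enumerates its constituents is not reproduced here, and the nesting is an illustrative assumption:

    <!-- composite component comprising the two elementary components below -->
    <Component id="bcast://lge.com/Component/1">
      <!-- 3: composite component -->
      <ComponentType>3</ComponentType>
    </Component>

    <Component id="bcast://lge.com/Component/2">
      <!-- 2: elementary component -->
      <ComponentType>2</ComponentType>
      <ComponentData>
        <VideoComponent>
          <!-- 6: base layer for scalable video encoding -->
          <VideoRole>6</VideoRole>
        </VideoComponent>
      </ComponentData>
    </Component>

    <Component id="bcast://lge.com/Component/3">
      <ComponentType>2</ComponentType>
      <ComponentData>
        <VideoComponent>
          <!-- 7: enhancement layer for scalable video encoding -->
          <VideoRole>7</VideoRole>
        </VideoComponent>
      </ComponentData>
    </Component>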
  • FIG. 40 shows another embodiment in which the component fragment indicates a composite video component according to an embodiment of the present invention.
• the component fragment represents that a composite component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a PickOne component having an identifier of bcast://lge.com/Component/3.
• the component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is a 3D video left view and the PickOne component having the identifier of bcast://lge.com/Component/3 is a 3D video right view.
  • FIG. 41 shows another embodiment in which the component fragment according to an embodiment of the present invention indicates a PickOne audio component.
  • the component fragment represents that a PickOne component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a composite component having an identifier of bcast://lge.com/Component/3, which are alternative.
• the component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is a complete main audio.
  • the component data element of the component fragment represents that the composite component having the identifier of bcast://lge.com/Component/3 includes a PickOne component having an identifier of bcast://lge.com/Component/4 and a PickOne component having an identifier of bcast://lge.com/Component/5.
  • the component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/4 is music.
  • the component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/5 is a dialog.
  • Embodiments in which ESG data includes component fragments, the broadcast transmitting apparatus 10 transmits component information and the broadcast receiving apparatus 100 receives the component information have been described.
• components may be organically correlated with one another, and the broadcast receiving apparatus 100 needs to present components in consideration of this correlation. Accordingly, there is a need for a method of representing relationships among components. This will be described below with reference to the attached drawings.
  • FIG. 42 shows an XML format of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
• a content fragment includes an extension element for extensibility.
  • the broadcast transmitting apparatus 10 may insert a component element indicating component information into the extension element.
  • the content fragment may include the component element in the form of a sequence.
  • information included in the component element may be the same as information included in the aforementioned component fragment.
  • the component element may include at least one of the element representing the role of a video, the target user profile element indicating the profile of a target user of the component and the target device element indicating a target device of the component.
  • the component element may include at least one of the element representing the role of an audio, the target user profile element indicating the profile of a target user of the component and the target device element indicating a target device of the component.
  • the component element may include at least one of the element representing the role of a closed captioning, the target user profile element indicating the profile of a target user of the component and the target device element indicating a target device of the component.
  • FIGS. 43 and 44 show XML formats of embodiments of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • the broadcast receiving apparatus 100 may acquire information about a video component, an audio component and a closed captioning component based on a content fragment. That is, the broadcast receiving apparatus 100 may obtain the information about the video component, audio component and closed captioning component, required to present content indicated by the content fragment. Accordingly, the broadcast receiving apparatus 100 may recognize that the video component, audio component and closed captioning component need to be presented together without additional information.
  • the broadcast receiving apparatus 100 may acquire information about a video component, an audio component and a closed captioning component based on a content fragment. Particularly, the broadcast receiving apparatus 100 may recognize that corresponding content includes a component containing a 3D video left view and a component containing a 3D video right view based on the content fragment.
  • the broadcast receiving apparatus 100 may be aware of all components included in the content without an element indicating correlation among the components.
  • Previous broadcast receiving apparatuses may malfunction when a new fragment such as a component fragment is added because they do not support the component fragment. This problem may be solved when a content fragment includes component information as shown in FIGS. 42 to 44 .
  • a component fragment indicating an audio component and a component fragment indicating a closed captioning component may refer to a fragment indicating a related video component. This will be described with reference to FIGS. 45 and 46 .
  • FIG. 45 shows an XML format of component fragments when a component fragment indicating an audio component and a component fragment indicating a closed captioning component refer to a fragment indicating a related video component according to an embodiment of the present invention.
  • the component fragment indicating the audio component and the component fragment indicating the closed captioning component may include an association element representing the fragment indicating the related video component.
• the related video component may indicate a video component presented along with the audio component or the closed captioning component.
• the related video component may indicate a video component presented along with the audio component or the closed captioning component in synchronization therewith.
  • the association element may have an identifier for identifying the component fragment indicating the related video component.
  • the association element may be associatedTo.
  • FIG. 46 shows a relationship among component fragments when a component fragment indicating an audio component and a component fragment indicating a closed captioning component refer to a fragment indicating a related video component according to an embodiment of the present invention.
  • a component fragment of an audio component having an identifier of bcast://lge.com/Component/2 refers to a component fragment of a video component having an identifier of bcast://lge.com/Component/1.
  • a component fragment of a closed captioning component having an identifier of bcast://lge.com/Component/3 refers to the component fragment of the video component having the identifier of bcast://lge.com/Component/1.
  • the broadcast receiving apparatus 100 may acquire information about the video component associated to the audio component based on the component fragment of the audio component.
  • the broadcast receiving apparatus 100 may acquire information about the video component associated to the closed captioning component based on the component fragment of the closed captioning component.
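In the arrangement of FIG. 46, the two referring fragments could be sketched as follows; placing associatedTo inside the component data element is an assumption for illustration:

    <!-- audio component fragment referring to the related video component -->
    <Component id="bcast://lge.com/Component/2">
      <ComponentData>
        <AudioComponent>
          <associatedTo>bcast://lge.com/Component/1</associatedTo>
        </AudioComponent>
      </ComponentData>
    </Component>

    <!-- closed captioning component fragment referring to the same video component -->
    <Component id="bcast://lge.com/Component/3">
      <ComponentData>
        <CCComponent>
          <associatedTo>bcast://lge.com/Component/1</associatedTo>
        </CCComponent>
      </ComponentData>
    </Component>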
  • FIG. 47 shows an XML format of component fragments when a component fragment indicating a video component refers to a component fragment indicating a related audio component and a component fragment indicating a related closed captioning component according to an embodiment of the present invention.
  • the component fragment indicating the video component may include association elements that represent the component fragment indicating the related audio component and the component fragment indicating the related closed captioning component.
  • the related audio component and the related closed captioning component may respectively represent an audio component and a closed captioning component which are presented along with the video component.
  • the related audio component and the related closed captioning component may respectively represent an audio component and a closed captioning component which are presented along with the video component in synchronization therewith.
  • the association elements may have identifiers for identifying the component fragments indicating the audio component and the closed captioning component associated to the video component. In a specific embodiment, the association elements may be associatedTo.
  • FIG. 48 shows a relationship among component fragments when a component fragment indicating a video component refers to a component fragment indicating a related audio component and a component fragment indicating a related closed captioning component according to an embodiment of the present invention.
  • a component fragment of a video component having an identifier of bcast://lge.com/Component/1 refers to a component fragment of an audio component having an identifier of bcast://lge.com/Component/2 and a component fragment of a closed captioning component having an identifier of bcast://lge.com/Component/3.
  • the broadcast receiving apparatus 100 may acquire information about the audio component and closed captioning component associated to the video component based on the component fragment of the video component.
  • a reference relationship among fragments is necessary for the broadcast receiving apparatus 100 to generate a service guide based on fragments included in ESG data. This will be described with reference to FIGS. 49 to 63 .
  • FIG. 49 shows a reference relationship among fragments according to an embodiment of the present invention.
  • a content fragment representing content may refer to a fragment indicating a service including the content.
  • a component fragment indicating a component may refer to a content fragment representing content including the component.
  • a component fragment indicating a component may refer to a service fragment representing a service including the component.
  • a schedule fragment may refer to a service fragment representing a service corresponding to the schedule, a content fragment representing content corresponding to the schedule and a component fragment representing a component corresponding to the schedule.
  • a component fragment needs to include additional elements in addition to those described above. This will be described with reference to FIG. 50 .
  • FIG. 50 shows an XML format of a component fragment according to an embodiment of the present invention when the component fragment refers to a higher component fragment, a content fragment and a service fragment.
  • a component fragment indicating a component may include a reference service element that indicates a service fragment of a service including the component.
  • the reference service element may have an identifier for identifying a service fragment indicating a service referred to.
  • the reference service element may be ServiceReference.
  • a component fragment indicating a component may include a reference content element that indicates a content fragment of content including the component.
  • the reference content element may have an identifier for identifying a content fragment indicating content referred to.
  • the reference content element may be ContentReference.
  • the broadcast receiving apparatus 100 may acquire information about a higher component, content and a service including a component indicated by a component fragment based on the component fragment.
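  • A minimal sketch of such a component fragment follows; the ServiceReference and ContentReference element names are taken from the description above, while the idRef attribute name is an assumption for illustration:

      <!-- component fragment referring up to the service and content that include it -->
      <Component id="bcast://lge.com/Component/1">
        <ServiceReference idRef="bcast://lge.com/Service/1"/>
        <ContentReference idRef="bcast://lge.com/Content/1"/>
      </Component>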
  • a schedule fragment may include an element for referring to the component fragment. This will be described with reference to FIG. 51 .
  • FIG. 51 shows an XML format of a schedule fragment according to an embodiment of the present invention when the schedule fragment refers to a component fragment, a content fragment and a service fragment.
  • a schedule fragment indicating a schedule may include a reference component element that indicates a component fragment of a component corresponding to the schedule.
  • the reference component element may have an identifier for identifying a component fragment referred to by the schedule fragment.
  • the reference component element may be ComponentReference.
  • the broadcast receiving apparatus 100 may acquire information about a component, content and a service corresponding to the schedule indicated by the schedule fragment based on the schedule fragment.
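  • A minimal sketch of such a schedule fragment follows; the schedule identifier, the idRef attribute name, and the reuse of the ServiceReference and ContentReference element names by the schedule fragment are assumptions for illustration:

      <!-- schedule fragment referring to the corresponding service, content and component -->
      <Schedule id="bcast://lge.com/Schedule/1">
        <ServiceReference idRef="bcast://lge.com/Service/1"/>
        <ContentReference idRef="bcast://lge.com/Content/1"/>
        <ComponentReference idRef="bcast://lge.com/Component/1"/>
      </Schedule>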
  • FIG. 52 shows a reference relationship among a service fragment, a content fragment and component fragments representing a presentable video component, a presentable audio component and a presentable closed captioning component according to an embodiment of the present invention.
  • a service may include content and the content may include a presentable video component, a presentable audio component and a presentable closed captioning component.
  • FIG. 52 shows such relationship among a service fragment, a content fragment and a plurality of component fragments.
  • the component fragments respectively representing the presentable audio component, presentable closed captioning component and presentable video component refer to the content fragment representing content including the presentable closed captioning component, presentable video component and presentable audio component.
  • the component fragment representing the presentable closed captioning component and the component fragment representing the presentable audio component are associated to the component fragment representing the presentable video component.
  • the content fragment representing the content refers to the service fragment indicating a service including the content.
  • the broadcast receiving apparatus 100 may acquire information about correlation among the components and content including the components based on the component fragments. In addition, the broadcast receiving apparatus 100 may acquire information about a service including the content based on the content fragment representing the content.
  • FIG. 53 shows a reference relationship among a component fragment representing a composite component and component fragments representing lower components according to an embodiment of the present invention.
  • a composite component may include a plurality of components constituting one scene.
  • the component fragments in the embodiment of FIG. 53 show such relationship.
  • a component fragment representing a video component of a second enhancement layer for scalable video encoding refers to a component fragment representing a video component of a first enhancement layer.
  • the component fragment representing the video component of the first enhancement layer refers to a component fragment representing a video component of a base layer.
  • the component fragment representing the video component of the base layer refers to a component fragment representing a composite video component.
  • the broadcast receiving apparatus 100 may acquire information about a relationship among components constituting the composite component based on the plurality of component fragments representing the plurality of components for scalable video coding. Specifically, the broadcast receiving apparatus 100 may recognize that the video component of the first enhancement layer is necessary to present the video component of the second enhancement layer based on the component fragment representing the video component of the second enhancement layer. In addition, the broadcast receiving apparatus 100 may recognize that the video component of the base layer is necessary to present the video component of the first enhancement layer based on the component fragment representing the video component of the first enhancement layer.
  • FIG. 54 shows a reference relationship among a component fragment representing an App-based enhancement component and component fragments representing lower components according to an embodiment of the present invention.
  • the App-based enhancement component may include an NRT content item component.
  • the NRT content item component may include an NRT file component.
  • the App-based enhancement component may include an application component.
  • the App-based enhancement component may include an on demand component.
  • the component fragments in the embodiment of FIG. 54 show such relationship. Specifically, in the embodiment shown in FIG. 54 , a component fragment representing an NRT file component refers to a component fragment representing an NRT content item component. In addition, the component fragment representing the NRT content item component refers to a component fragment representing the App-based enhancement component. Furthermore, a component fragment representing an application component refers to the component fragment representing the App-based enhancement component.
  • a component fragment representing an on demand component refers to the component fragment representing the App-based enhancement component.
  • the broadcast receiving apparatus 100 may acquire information about a relationship between the App-based enhancement component and components included in the App-based enhancement component. Specifically, the broadcast receiving apparatus 100 may recognize that the App-based enhancement component includes the NRT content item. In addition, the broadcast receiving apparatus 100 may recognize the NRT file component necessary to execute the NRT content item.
  • a service may include various types of content in hybrid broadcast.
  • a service may include at least one of programs, on demand content and NRT content in hybrid broadcast. Accordingly, content fragments need to represent such relationship while referring to service fragments. This will be described with reference to the attached drawings.
  • FIG. 55 shows an XML format of a content fragment according to another embodiment of the present invention when the content fragment refers to a service.
  • the content fragment may include a relationship element indicating a relationship with a service fragment referred to.
  • the relationship element may indicate that the corresponding content is a program of the service represented by the service fragment referred to.
  • the relationship element may have a value of 1.
  • the relationship element may indicate that the corresponding content is a content item of the service represented by the service fragment referred to.
  • the relationship element may have a value of 2.
  • the relationship element may indicate that the corresponding content is on demand content of the service represented by the service fragment referred to.
  • the on demand content refers to content presented at the request of a user.
  • the relationship element may have a value of 3.
  • the relationship element may be referred to as “relationship.”
  • the broadcast receiving apparatus 100 may recognize a relationship between content and a service referred to by the content based on the relationship element.
  • FIG. 56 shows a reference relationship among content fragments and a service fragment according to another embodiment of the present invention.
  • a content fragment having an identifier of bcast://lge.com/Content/1 refers to a service fragment having an identifier of bcast://lge.com/Service/1 and has a relationship element value of 1. Accordingly, the broadcast receiving apparatus 100 may recognize that content indicated by the content fragment having the identifier of bcast://lge.com/Content/1 is a program of a service indicated by the service fragment having the identifier of bcast://lge.com/Service/1 based on the content fragment.
  • a content fragment having an identifier of bcast://lge.com/Content/2 refers to the service fragment having the identifier of bcast://lge.com/Service/1 and has a relationship element value of 2. Accordingly, the broadcast receiving apparatus 100 may recognize that content indicated by the content fragment having the identifier of bcast://lge.com/Content/2 is a content item of the service indicated by the service fragment having the identifier of bcast://lge.com/Service/1 based on the content fragment.
  • a content fragment having an identifier of bcast://lge.com/Content/3 refers to the service fragment having the identifier of bcast://lge.com/Service/1 and has a relationship element value of 3. Accordingly, the broadcast receiving apparatus 100 may recognize that content indicated by the content fragment having the identifier of bcast://lge.com/Content/3 is on demand content of the service indicated by the service fragment having the identifier of bcast://lge.com/Service/1 based on the content fragment.
  • the broadcast receiving apparatus 100 may recognize a relationship between content indicated by a content fragment and a service including the content based on the content fragment. Accordingly, the broadcast receiving apparatus 100 may recognize a relationship between a service and content based on ESG data without additional service signaling information.
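  • A minimal sketch of the three content fragments of FIG. 56 follows; carrying the relationship value as an attribute of the service reference, as shown here, is an assumption for illustration:

      <Content id="bcast://lge.com/Content/1">
        <ServiceReference idRef="bcast://lge.com/Service/1" relationship="1"/> <!-- program -->
      </Content>
      <Content id="bcast://lge.com/Content/2">
        <ServiceReference idRef="bcast://lge.com/Service/1" relationship="2"/> <!-- content item -->
      </Content>
      <Content id="bcast://lge.com/Content/3">
        <ServiceReference idRef="bcast://lge.com/Service/1" relationship="3"/> <!-- on demand content -->
      </Content>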
  • in the reference relationship described above, the broadcast receiving apparatus 100 can recognize the relationship between a higher fragment and a lower fragment only after processing the lower fragment first. Accordingly, when the broadcast receiving apparatus 100 intends to display only some services, it needs to check not only the fragments indicating the content and components included in those services but also the other fragments. This is inefficient, considering that the broadcast receiving apparatus 100 generally displays content on a per-service basis. Therefore, embodiments in which a higher fragment refers to a lower fragment will be described with reference to the attached drawings.
  • FIG. 57 illustrates a reference relationship among fragments according to another embodiment of the present invention.
  • a fragment representing a service may refer to a content fragment indicating content.
  • a content fragment representing content including a component may refer to a component fragment representing the component.
  • a component fragment representing a component may refer to a service fragment representing a service including the component.
  • a schedule fragment may refer to a service fragment representing a service corresponding to the schedule indicated thereby, a content fragment representing content corresponding to the schedule and a component fragment representing a component corresponding to the schedule.
  • the service fragment, content fragment and component fragment need to include additional elements in addition to those described above. This will be described with reference to FIGS. 58 to 63 .
  • FIG. 58 shows an XML format of a service fragment according to another embodiment of the present invention.
  • the service fragment may include a content reference element that represents a content fragment indicating content included in a service indicated by the service fragment.
  • the content reference element may have an identifier for identifying the content fragment representing the content included in the service.
  • the content reference element may be ContentReference.
  • the service fragment may include a component reference element that represents a component fragment indicating a component included in the service indicated by the service fragment.
  • the component reference element may have an identifier for identifying the component fragment representing the component included in the service.
  • the component reference element may be ComponentReference.
  • the broadcast receiving apparatus 100 may acquire information about content included in the service represented by the service fragment based on the service fragment. In addition, the broadcast receiving apparatus 100 may acquire information about a component included in the service indicated by the service fragment based on the service fragment.
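  • A minimal sketch of such a service fragment follows; the idRef attribute name is an assumption for illustration:

      <!-- service fragment referring down to the content and component it includes -->
      <Service id="bcast://lge.com/Service/1">
        <ContentReference idRef="bcast://lge.com/Content/1"/>
        <ComponentReference idRef="bcast://lge.com/Component/1"/>
      </Service>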
  • FIG. 59 shows an XML format of a content fragment according to another embodiment of the present invention.
  • the content fragment may include a component reference element that represents a component fragment indicating a component included in content indicated by the content fragment.
  • the component reference element may have an identifier for identifying the component fragment representing the component included in the content.
  • the component reference element may be ComponentReference.
  • the broadcast receiving apparatus 100 may acquire information about a component included in the content represented by the content fragment based on the content fragment.
  • FIG. 60 shows an XML format of a component fragment according to another embodiment of the present invention.
  • the component fragment may include a component reference element that represents a component fragment indicating a component included in a component indicated by the component fragment.
  • the component reference element may have an identifier for identifying the component fragment representing the component included in the component.
  • the component reference element may be ComponentReference.
  • the broadcast receiving apparatus 100 may acquire information about a component included in the component represented by the component fragment based on the component fragment.
  • FIG. 61 shows a reference relationship among a service fragment, a content fragment and component fragments according to another embodiment of the present invention.
  • a service may include content and the content may include a presentable video component, a presentable audio component and a presentable closed captioning component.
  • FIG. 61 shows such relationship among a service fragment, a content fragment and a plurality of component fragments.
  • the service fragment refers to the content fragment representing content included in a service indicated by the service fragment.
  • the content fragment refers to a component fragment indicating a presentable video component included in the content represented by the content fragment. While the content fragment refers to only the component fragment indicating the presentable video component in FIG. 61 , the content fragment may refer to a component fragment indicating a presentable audio component and a component fragment indicating a presentable closed captioning component.
  • the figure shows that the component fragment indicating the presentable closed captioning component and the component fragment indicating the presentable audio component are associated to the component fragment indicating the presentable video component.
  • the broadcast receiving apparatus 100 may acquire information about the content included in the service represented by the service fragment based on the service fragment. In addition, the broadcast receiving apparatus 100 may acquire information about the component included in the content based on the component fragment.
  • FIG. 62 shows a reference relationship among a component fragment indicating a composite component and component fragments representing lower components.
  • a composite component may include a plurality of components constituting one scene.
  • the component fragments in the embodiment of FIG. 62 show such relationship.
  • a component fragment representing a composite component refers to a component fragment representing a video component of a base layer for scalable video encoding.
  • the component fragment representing the video component of the base layer for scalable video encoding refers to a component fragment representing a video component of a first enhancement layer.
  • the component fragment representing the video component of the first enhancement layer refers to a component fragment representing a video component of a second enhancement layer.
  • the broadcast receiving apparatus 100 may acquire information about a relationship among components constituting the composite component based on the plurality of component fragments representing the plurality of components for scalable video coding. Specifically, the broadcast receiving apparatus 100 may recognize that the video component of the base layer and the video component of the first enhancement layer may be presented together based on the component fragment indicating the video component of the base layer. In addition, the broadcast receiving apparatus 100 may recognize that the video component of the first enhancement layer and the video component of the second enhancement layer may be presented together based on the component fragment indicating the video component of the first enhancement layer.
  • FIG. 63 shows a reference relationship among a component fragment representing an App-based enhancement component and component fragments representing lower components according to another embodiment of the present invention.
  • the App-based enhancement component may include an NRT content item component.
  • the NRT content item component may include an NRT file component.
  • the App-based enhancement component may include an application component.
  • the App-based enhancement component may include an on demand component.
  • the component fragments in the embodiment of FIG. 63 show such relationship. Specifically, in the embodiment shown in FIG. 63 , a component fragment representing an App-based enhancement component refers to a component fragment representing an NRT content item component.
  • the component fragment representing the NRT content item component refers to a component fragment representing an NRT file component.
  • the component fragment representing the App-based enhancement component refers to a component fragment representing an application component.
  • the component fragment representing the App-based enhancement component refers to a component fragment representing an on demand component.
  • the broadcast receiving apparatus 100 may acquire information about a relationship between the App-based enhancement component and components included in the App-based enhancement component based on the component fragment representing the App-based enhancement component. Specifically, the broadcast receiving apparatus 100 may recognize that the App-based enhancement component includes the NRT content item. In addition, the broadcast receiving apparatus 100 may recognize the NRT file component necessary to execute the NRT content item.
  • a service fragment may include the component reference element representing a component fragment corresponding to a component included in the service represented by the service fragment.
  • a content fragment may include the component reference element representing a component fragment corresponding to a component included in the content represented by the content fragment.
  • a component fragment may include the component reference element representing a component fragment corresponding to a component included in the component represented by the component fragment.
  • FIGS. 64 to 66 show a syntax of a component fragment according to another embodiment of the present invention.
  • the component fragment according to another embodiment of the present invention includes information about a component that is part of a broadcast service or content.
  • the component fragment may include an identifier attribute identifying the component fragment.
  • the identifier attribute may be “id.”
  • the component fragment may include a version attribute indicating whether the component has changed.
  • the version attribute may be “version.”
  • the component fragment may include a valid period attribute indicating a valid period of the component fragment.
  • the attribute indicating the valid period of the component fragment may include a start time and an end time of the valid period.
  • the attribute indicating the start time of the valid period may be “validFrom” and the attribute indicating the end time of the valid period may be “validTo.”
  • the component fragment may include a service reference element indicating a service fragment of a service including the component.
  • the service reference element may have, as an identifier attribute, an identifier for identifying the service fragment representing the service referred to.
  • the service reference element may be ReferenceService.
  • the identifier attribute may be idRef.
  • the component fragment may include a content reference element indicating a content fragment of content including the component.
  • the content reference element may have, as an identifier attribute, an identifier for identifying the content fragment representing the content referred to.
  • the content reference element may be ReferenceContent.
  • the identifier attribute may be idRef.
  • the component fragment may include a component reference element indicating a component fragment of a higher component including the component.
  • the component reference element may have, as an identifier attribute, an identifier for identifying the component fragment representing the component referred to.
  • the component reference element may be ReferenceComponent.
  • the identifier attribute may be idRef.
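  • Taken together, a minimal sketch of this component fragment follows; the element names and the idRef attribute are taken from the description above, while the timestamp format in validFrom and validTo is an assumption for illustration:

      <Component id="bcast://lge.com/Component/4" version="1"
                 validFrom="2015-01-01T00:00:00Z" validTo="2015-12-31T23:59:59Z">
        <ReferenceService idRef="bcast://lge.com/Service/1"/>
        <ReferenceContent idRef="bcast://lge.com/Content/1"/>
        <!-- ReferenceComponent points at the higher component fragment that includes this one -->
        <ReferenceComponent idRef="bcast://lge.com/Component/1"/>
      </Component>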
  • the component fragment may include a component type element indicating a component type. Since one component may simultaneously include multiple properties, one component fragment may include a plurality of component type elements.
  • the component type element may represent various types of components.
  • the component type element may represent a continuous component.
  • the continuous component is a component presented in a continuous stream.
  • the continuous component may be one of audio, video and closed captioning.
  • the component type element may have a value of 1 when the component type element indicates a continuous component.
  • the component type element may represent an elementary component.
  • the component type element may have a value of 2 when the component type element indicates an elementary component.
  • the component type element may represent a composite component.
  • the component type element may have a value of 3 when the component type element indicates a composite component.
  • the component type element may represent a PickOne component.
  • the component type element may have a value of 4 when the component type element indicates a PickOne component.
  • the component type element may represent a complex component.
  • the component type element may have a value of 5 when the component type element indicates a complex component.
  • the component type element may represent a presentable video component.
  • the component type element may have a value of 6 when the component type element indicates a presentable video component.
  • the component type element may represent a presentable audio component.
  • the component type element may have a value of 7 when the component type element indicates a presentable audio component.
  • the component type element may represent a presentable closed captioning component.
  • the component type element may have a value of 8 when the component type element indicates a presentable closed captioning component.
  • the component type element may represent that a component is an NRT file.
  • the component type element may have a value of 9 when the component type element indicates that a component is an NRT file.
  • the component type element may represent that a component is NRT content.
  • the component type element may have a value of 10 when the component type element indicates that a component is NRT content.
  • the component type element may represent that a component is an application.
  • An application may be a set of documents constituting an additional service or an interactive service.
  • a document may include at least one of HTML, JavaScript, CSS, XML and multimedia files.
  • an application may be regarded as an NRT content item.
  • the component type element may have a value of 11 when the component type element indicates that a component is an application.
  • the component type element may represent that a component is an ATSC 3.0 application.
  • An ATSC 3.0 application may refer to an application executed in environments conforming to the ATSC 3.0 specification.
  • the component type element may have a value of 12 when the component type element indicates that a component is an ATSC 3.0 application.
  • the component type element may indicate that a component is an on demand component.
  • An on demand component represents a component that is delivered on demand.
  • the component type element may have a value of 13.
  • the component type element may indicate that a component is a notification stream.
  • a notification stream delivers notifications to synchronize actions of applications with an underlying linear time base.
  • the component type element may have a value of 14.
  • the component type element may indicate that a component is an App-based enhancement.
  • the component type element may have a value of 15.
  • the component type element may be referred to as ComponentType.
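  • Since one component fragment may carry several component type elements, as described above, a component that is both a continuous component and a presentable video component might be sketched as follows (the values follow the enumeration above):

      <Component id="bcast://lge.com/Component/1">
        <ComponentType>1</ComponentType> <!-- continuous component -->
        <ComponentType>6</ComponentType> <!-- presentable video component -->
      </Component>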
  • the component fragment may include a component role element specifying the role of the component.
  • the value of the component role element may be an integer. Since one component may simultaneously execute various functions, one component fragment may include a plurality of component role elements.
  • the component role element may represent a default video when the component is a presentable video component.
  • the component role element may have a value of 1.
  • the component role element may represent an alternative camera view when the component is a presentable video component.
  • the component role element may have a value of 2.
  • the component role element may represent an alternative video component when the component is a presentable video component.
  • the component role element may have a value of 3.
  • the component role element may represent a sign language inset when the component is a presentable video component.
  • the component role element may have a value of 4.
  • the component role element may represent a follow subject video when the component is a presentable video component.
  • the component role element may have a value of 5.
  • the component role element may represent a base layer for scalable video encoding.
  • the component role element may have a value of 6.
  • the component role element may represent an enhancement layer for scalable video encoding when the component is a composite video component.
  • the component role element may have a value of 7.
  • the component role element may represent a 3D video left view when the component is a composite video component.
  • the component role element may have a value of 8.
  • the component role element may represent a 3D video right view when the component is a composite video component.
  • the component role element may have a value of 9.
  • the component role element may represent 3D video depth information when the component is a composite video component.
  • the component role element may have a value of 10.
  • the component role element may represent that a media component is the video of a specific position in a picture divided into a plurality of regions when the component is a composite video component.
  • the component role element may have a value of 11.
  • the component role element may represent follow-subject metadata when the component is a composite video component.
  • the component role element may have a value of 12.
  • the follow-subject metadata may include at least one of the name, position and size of the corresponding subject.
  • the follow-subject metadata may represent a main video component region on which the subject is focused.
  • the component role element may represent that the component is complete main.
  • the component role element may have a value of 13.
  • the component role element may represent that the component is music.
  • the component role element may have a value of 14.
  • the component role element may represent that the component is a dialog.
  • the component role element may have a value of 15.
  • the component role element may represent that the component is effects.
  • the component role element may have a value of 16.
  • the component role element may represent that the component is intended for the visually impaired.
  • the component role element may have a value of 17.
  • the component role element may represent that the component is intended for the hearing impaired.
  • the component role element may have a value of 18.
  • the component role element may represent that the component is a commentary.
  • the component role element may have a value of 19.
  • the component role element may represent that the component is a normal closed captioning.
  • the component role element may have a value of 20.
  • the component role element may represent that the component is an easy-reader closed captioning for kindergarteners and elementary school students.
  • the component role element may have a value of 21.
  • the component fragment may include an extension element that may be reserved for future extension.
  • the extension element may be referred to as PrivateExt.
  • the component fragment may include a proprietary element for use of a specific application.
  • the proprietary element may be referred to as ProprietaryElements.
  • FIG. 67 shows an XML format of the component fragment according to another embodiment of the present invention.
  • the component fragment in the embodiment shown in FIG. 67 includes the elements and attributes described with reference to FIGS. 64 to 66 .
  • FIG. 68 shows an XML format of ComponentTypeRangeType included in the component fragment according to another embodiment of the present invention.
  • the component type element included in the component fragment may include an attribute specifying the range of values that the component type element may have.
  • the attribute specifying the range of values that the component type element may have may be referred to as ComponentTypeRangeType.
  • FIG. 69 shows an XML format of ComponentRoleRangeType included in the component fragment according to another embodiment of the present invention.
  • the component role element included in the component fragment may include an attribute specifying the range of values that the component role element may have.
  • the attribute specifying the range of values that the component role element may have may be referred to as ComponentRoleRangeType.
  • FIG. 70 shows a relationship among the component fragment according to another embodiment and a composite video component using scalable video encoding and components included in the composite video component.
  • the component fragment in the embodiment shown in FIG. 70 represents that a composite component having an identifier of bcast://lge.com/Component/1 includes an elementary component having an identifier of bcast://lge.com/Component/2 and an elementary component having an identifier of bcast://lge.com/Component/3.
  • the component role element of the component fragment represents that the elementary component having the identifier of bcast://lge.com/Component/2 is a base layer for scalable video encoding and the elementary component having the identifier of bcast://lge.com/Component/3 is an enhancement layer for scalable video encoding.
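  • A minimal sketch of the fragments of FIG. 70 follows; placing the role in the lower fragment, as assumed here, is one possible reading of the description above:

      <!-- base layer elementary component, referring up to the composite component -->
      <Component id="bcast://lge.com/Component/2">
        <ReferenceComponent idRef="bcast://lge.com/Component/1"/>
        <ComponentType>2</ComponentType> <!-- elementary component -->
        <ComponentRole>6</ComponentRole> <!-- base layer for scalable video encoding -->
      </Component>
      <!-- enhancement layer elementary component -->
      <Component id="bcast://lge.com/Component/3">
        <ReferenceComponent idRef="bcast://lge.com/Component/1"/>
        <ComponentType>2</ComponentType>
        <ComponentRole>7</ComponentRole> <!-- enhancement layer for scalable video encoding -->
      </Component>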
  • FIG. 71 shows a relationship among the component fragment according to another embodiment and a composite video component including a 3D video and components included in the composite video component.
  • the component fragment in the embodiment shown in FIG. 71 represents that a composite component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a PickOne component having an identifier of bcast://lge.com/Component/3.
  • the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is a 3D video left view and the PickOne component having the identifier of bcast://lge.com/Component/3 is a 3D video right view.
  • FIG. 72 shows a relationship among the component fragment according to another embodiment and a PickOne audio component and components included in the PickOne audio component.
  • the component fragment in the embodiment shown in FIG. 72 represents that a PickOne component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a composite component having an identifier of bcast://lge.com/Component/3, which are alternatives to each other.
  • the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is complete main.
  • the component fragment represents that the composite component having the identifier of bcast://lge.com/Component/3 includes a PickOne component having the identifier of bcast://lge.com/Component/4 and a PickOne component having the identifier of bcast://lge.com/Component/5.
  • the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/4 is music.
  • the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/5 is a dialog.
  • ESG data may include information about a component as lower elements of the content fragment. This will be described with reference to the attached drawings.
  • FIGS. 73 to 75 show a syntax of a component element according to another embodiment of the present invention.
  • a content fragment includes an extension element for extensibility.
  • the content fragment may include a component element that represents information about a component included in content as a lower element of the extension element.
  • the component element may include a component type element representing a component type.
  • the component type element may represent a presentable video component.
  • the component type element may have a value of 1.
  • the component type element may represent a presentable audio component.
  • the component type element may have a value of 2.
  • the component type element may represent a presentable closed captioning component.
  • the component type element may have a value of 3.
  • the component type element may represent an App-based enhancement component.
  • the component type element may have a value of 4.
  • the component type element may be referred to as ComponentType.
  • the component type element may include an element specifying the range of values that the component type element may have.
  • the element specifying the range of values that the component type element may have may be referred to as ComponentTypeRangeType.
  • the number of component types represented by the component type element may be less than the number of component types indicated by the aforementioned component fragment because properties of the content including the components are already indicated by the content fragment and thus there is no need to describe those properties again in the component element.
  • the component element may include a component role element representing the role of a component.
  • the component role element may represent a default video when the component is a presentable video component.
  • the default video may refer to a primary video.
  • the component role element may have a value of 1.
  • the component role element may represent an alternative camera view when the component is a presentable video component.
  • the component role element may have a value of 2.
  • the component role element may represent an alternative video component when the component is a presentable video component.
  • the component role element may have a value of 3.
  • the component role element may represent a sign language inset when the component is a presentable video component.
  • the component role element may have a value of 4.
  • the component role element may represent a follow subject video when the component is a presentable video component.
  • the component role element may have a value of 5.
  • the component role element may represent that the component is complete main.
  • the component role element may have a value of 6.
  • the component role element may represent that the component is music.
  • the component role element may have a value of 7.
  • the component role element may represent that the component is a dialog.
  • the component role element may have a value of 8.
  • the component role element may represent that the component is effects.
  • the component role element may have a value of 9.
  • the component role element may represent that the component is intended for the visually impaired.
  • the component role element may have a value of 10.
  • the component role element may represent that the component is intended for the hearing impaired.
  • the component role element may have a value of 11.
  • the component role element may represent that the component is a commentary.
  • the component role element may have a value of 12.
  • the component role element may represent that the component is a normal closed captioning.
  • the component role element may have a value of 13.
  • the component role element may represent that the component is an easy-reader closed captioning for kindergarteners and elementary school students.
  • the component role element may have a value of 14.
  • the component role element may represent that the component is an application.
  • the component role element may have a value of 15.
  • the component role element may represent that the component is an NRT content item.
  • the component role element may have a value of 16.
  • the component role element may represent that the component is an on demand component.
  • the component role element may have a value of 17.
  • the component role element may represent that the component is a notification stream component.
  • the component role element may have a value of 18.
  • a notification stream delivers notifications to synchronize actions of applications with an underlying linear time base.
  • the component role element may represent that the component is a start-over component.
  • the start-over component refers to an application component that allows a user to view content from the beginning even after the content has started to be broadcast.
  • the component role element may have a value of 19.
  • the component role element may represent that the component is a companion screen component that may be consumed in the companion apparatus 300 interoperating with the broadcast receiving apparatus 100 .
  • the component role element may have a value of 20.
  • the component role element may be referred to as ComponentRole.
  • the component role element may include an element specifying the range of values that the component role element may have.
  • the element specifying the range of values that the component role element may have may be referred to as ComponentRoleRangeType.
  • the component element may include a start time element representing a display start time of a component.
  • the start time element may be referred to as StartTime.
  • the component element may include an end time element representing a display end time of a component.
  • the end time element may be referred to as EndTime.
  • the component element may include a language element indicating a component description language.
  • the language element may be referred to as Language.
  • the component element may include a session description language element indicating a session description language when a component is delivered according to a session based transport protocol.
  • the session description language element may be referred to as languageSDP.
  • the component element may include an element indicating a component display duration.
  • the duration element may be referred to as Length.
  • the component element may include an element indicating a parental rating of a component.
  • the parental rating element may be referred to as ParentalRating.
  • the component element may include a capability element indicating the capability of the broadcast receiving apparatus 100 necessary to present components.
  • the capability element may represent that a broadband connection is necessary for component presentation.
  • the capability element may have a value of 1.
  • the capability element may represent that a standard definition (SD) video presentation capability is needed.
  • the capability element may be 2.
  • the capability element may represent that a high definition (HD) video presentation capability is needed.
  • the capability element may be 3.
  • the capability element may represent that a video presentation capability of ultra-high definition (UHD) of 4K or higher is needed.
  • the capability element may be 4.
  • the capability element may represent that a video presentation capability corresponding to definition of 8K or higher is needed.
  • the capability element may be 5.
  • the capability element may represent that a capability of presenting 3D video is needed.
  • the capability element may be 6.
  • the capability element may represent that a capability of presenting high dynamic range (HDR) video is needed.
  • the capability element may be 7.
  • the capability element may represent that a capability of presenting wide color gamut (WCG) video is needed.
  • the capability element may be 8.
  • the capability element may represent that 2.0 channel audio needs to be presented.
  • the capability element may be 9.
  • the capability element may represent that 2.1 channel audio needs to be presented.
  • the capability element may be 10.
  • the capability element may represent that 5.1 channel audio needs to be presented.
  • the capability element may be 11.
  • the capability element may represent that 6.1 channel audio needs to be presented.
  • the capability element may be 12.
  • the capability element may represent that 7.1 channel audio needs to be presented.
  • the capability element may be 13.
  • the capability element may represent that 21.1 channel audio needs to be presented.
  • the capability element may be 14.
  • the capability element may represent that 3D audio needs to be presented.
  • the capability element may be 15.
  • the capability element may represent that a dialog level adjustment is needed.
  • the capability element may be 16.
  • the capability element may represent that a magic remote control input needs to be received for component presentation.
  • the capability element may be 17.
  • the capability element may represent that a touchscreen input needs to be received for component presentation.
  • the capability element may be 18.
  • the capability element may represent that a mouse input needs to be received for component presentation.
  • the capability element may be 19.
  • the capability element may represent that a keyboard input needs to be received for component presentation.
  • the capability element may be 20.
  • the capability element may represent that application rendering is needed.
  • the capability element may be 21.
  • the capability element may be referred to as DeviceCapability.
  • the capability element may include an element specifying the range of values that the capability element may have.
  • the element specifying the range of values that the capability element may have may be referred to as DeviceCapabilityRangeType.
  • the component element may include a target device element indicating a target device of a component.
  • the target device element may represent a component provided through the screen of the broadcast receiving apparatus 100 that is a primary apparatus receiving broadcast services.
  • the target device element may be 1.
  • the target device element may represent a component provided through the screen of the companion apparatus 300 interoperating with the broadcast receiving apparatus 100 that is a primary apparatus receiving broadcast services.
  • the target device element may be 2.
  • the target device element may represent a component provided through part of the screen of the broadcast receiving apparatus 100 that is a primary apparatus receiving broadcast services.
  • the target device element may be 3.
  • the target device element may be referred to as TargetDevice.
  • the target device element may include an element specifying the range of values that the target device element may have.
  • the element specifying the range of values that the target device element may have may be referred to as TargetDeviceRangeType.
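  • Gathering the elements above, a minimal sketch of a component element carried under the extension element of a content fragment follows; reusing the PrivateExt name for the content fragment's extension element, the element ordering, the timestamp format and the language code are assumptions for illustration:

      <PrivateExt>
        <Component>
          <ComponentType>1</ComponentType>       <!-- presentable video component -->
          <ComponentRole>1</ComponentRole>       <!-- default video -->
          <StartTime>2015-01-01T20:00:00Z</StartTime>
          <EndTime>2015-01-01T21:00:00Z</EndTime>
          <Language>eng</Language>
          <DeviceCapability>3</DeviceCapability> <!-- HD video presentation capability needed -->
          <TargetDevice>1</TargetDevice>         <!-- screen of the primary apparatus -->
        </Component>
      </PrivateExt>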
  • FIGS. 76 to 80 show an XML format of the aforementioned component element.
  • the broadcast receiving apparatus 100 may display component information based on the component element. Specifically, the broadcast receiving apparatus 100 may display component information included in content based on the capability element included in the component element. This will be described in detail with reference to the attached drawings.
  • FIGS. 81 to 83 illustrate another embodiment of the present invention, in which the broadcast receiving apparatus displays components included in content based on the capability element included in the component element.
  • the broadcast receiving apparatus 100 may display component information included in content based on the capability element included in the component element. Specifically, based on the capability element, the broadcast receiving apparatus 100 may display the components that it can present and the components that it cannot present such that the two are distinguished from each other.
  • the broadcast receiving apparatus 100 may display a component that may be presented by the broadcast receiving apparatus 100 and a component that may not be presented by the broadcast receiving apparatus 100 in different colors based on the capability element included in the component element.
  • the broadcast receiving apparatus 100 may display a component that may be presented by the broadcast receiving apparatus 100 on a white background and display a component that may not be presented by the broadcast receiving apparatus 100 on a gray background based on the capability element included in the component element.
  • the broadcast receiving apparatus 100 may display a component that may be presented by the broadcast receiving apparatus 100 and a component that may not be presented by the broadcast receiving apparatus 100 through different icons based on the capability element included in the component element.
  • the broadcast receiving apparatus 100 may display a component that may be presented by the broadcast receiving apparatus 100 and a component that may not be presented by the broadcast receiving apparatus 100 as different texts based on the capability element included in the component element. In another specific embodiment, the broadcast receiving apparatus 100 may display a component that may be presented by the broadcast receiving apparatus 100 and a component that may not be presented by the broadcast receiving apparatus 100 using different graphical symbols based on the capability element included in the component element.
  • the broadcast receiving apparatus 100 may display component information in a service guide menu, as described above. In addition, the broadcast receiving apparatus 100 may display component information in a service list menu. Furthermore, the broadcast receiving apparatus 100 may display component information in a bar-shaped menu positioned at the lower or upper part of a screen on which content is presented.
  • content includes a component that requires HD video presentation and a component that requires 5.1 channel audio presentation.
  • the broadcast receiving apparatus 100 may display the component information in white, as shown in FIG. 81( b ) .
  • the broadcast receiving apparatus 100 may display the audio component information that requires 5.1 channel presentation grayed out, as shown in FIG. 81( c ) .
  • content includes a component that requires UHD video presentation, an English audio component and a Spanish audio component that requires a broadband connection.
  • the broadcast receiving apparatus 100 may display the Spanish audio component grayed out, as shown in FIG. 82( b ) .
  • content includes a component that requires UHD video presentation, an audio component that requires 5.1 channel presentation and a video component that requires a wide color gamut.
  • the broadcast receiving apparatus 100 may display the video component that requires the wide color gamut grayed out, as shown in FIG. 83( c ) .
  • a user may select content depending on whether the broadcast receiving apparatus 100 can present the content. Specifically, the user may recognize in advance that the broadcast receiving apparatus 100 cannot present particular content. Accordingly, the user is prevented from waiting for content that the broadcast receiving apparatus 100 cannot present.
  • This function may enhance user convenience considering that broadcast content types are diversified and high device capabilities are required in hybrid broadcast.
  • FIGS. 84 and 85 show a syntax of a component type according to another embodiment of the present invention.
  • the component element may include information indicating a component type.
  • the component element may include the information indicating a component type in the form of an element.
  • the component element may not include the information representing a component type in the form of an element.
  • the component element may include the information representing a component type as a component type attribute.
  • the component type attribute may represent a presentable video component.
  • the component type attribute may have a value of 1.
  • the component type attribute may represent a presentable audio component.
  • the component type attribute may be 2.
  • the component type attribute may represent a presentable closed captioning component.
  • the component type attribute may be 3.
  • the component type attribute may represent an App-based enhancement component.
  • the component type attribute may be 4.
  • the component role element may have an integer value.
  • in another embodiment, the component role element may have a string value indicating the role of a component, which makes it convenient to add new roles in the future.
  • the broadcast receiving apparatus 100 may display the string value of the component role element.
  • the component role element may represent that the component is a default video.
  • the default video may refer to a primary video.
  • the component role element may have a value of “primary video.”
  • the component role element may represent that the component is an alternative camera view.
  • the component role element may have a value of “Alternative camera view.”
  • the component role element may represent that the component is an alternative video component.
  • the component role element may have a value of “Other alternative video component.”
  • the component role element may represent that the component is a sign language inset.
  • the component role element may have a value of “Sign language inset.”
  • the component role element may represent that the component is a follow subject video.
  • the component role element may have a value of “Follow subject video.”
  • the component role element may represent that the component is complete main.
  • the component role element may have a value of “Complete main.”
  • the component role element may represent that the component is music.
  • the component role element may have a value of “Music.”
  • the component role element may represent that the component is a dialog.
  • the component role element may have a value of “Dialog.”
  • the component role element may represent that the component is effects.
  • the component role element may have a value of “Effects.”
  • the component role element may represent that the component is intended for the visually impaired.
  • the component role element may have a value of “Visually impaired.”
  • the component role element may represent that the component is intended for the hearing impaired.
  • the component role element may have a value of “Hearing impaired.”
  • the component role element may represent that the component is a commentary.
  • the component role element may have a value of “Commentary.”
  • the component role element may represent that the component is a normal closed captioning.
  • the component role element may have a value of “Normal.”
  • the component role element may represent that the component is an easy-reader closed captioning for kindergarteners and elementary school students.
  • the component role element may have a value of “Easy reader.”
  • the component role element may represent that the component is an application.
  • the component role element may have a value of “Application.”
  • the component role element may represent that the component is an NRT content item.
  • the component role element may have a value of “NRT content item.”
  • the component role element may represent that the component is an on demand component.
  • the component role element may have a value of “On demand.”
  • the component role element may represent that the component is a notification stream component.
  • the component role element may have a value of “Notification Stream.”
  • a notification stream delivers notifications to synchronize actions of applications with an underlying linear time base.
  • the component role element may represent that the component is a start-over component.
  • the start-over component refers to an application component that provides a function of viewing, at the request of a user, content from the beginning even after the content has started to be broadcast.
  • the component role element may have a value of “Start-over.”
  • the component role element may represent that the component is a companion screen component that may be consumed in the companion apparatus 300 interoperating with the broadcast receiving apparatus 100 .
  • the component role element may have a value of “Companion-Screen.”
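  • As a minimal sketch of the string-valued role element described above (all element names here are assumptions, not the normative syntax):

    <!-- Hypothetical sketch: a string-valued component role element. -->
    <Component componentType="1">
        <ComponentRole>Sign language inset</ComponentRole>
    </Component>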
  • FIG. 86 shows an XML format of the aforementioned component element.
  • the aforementioned component element includes information about components without discrimination of component types.
  • when the component element includes component information classified depending on component types, redundant use of lower elements and attributes may be reduced. This will be described with reference to the attached drawings.
  • FIGS. 87 and 88 show a syntax of a component element according to another embodiment of the present invention.
  • the component element may include lower elements indicating component information, which are classified according to component types.
  • the component element may include component role elements indicating component roles, which are classified depending on component types.
  • the component element may include a presentable video component element that indicates the role of a presentable video component.
  • the presentable video component element may represent that the presentable video component is a default video component.
  • the default video may refer to a primary video.
  • the presentable video component element may have a value of “primary video.”
  • the presentable video component element may represent that the presentable video component is an alternative camera view.
  • the presentable video component element may have a value of “Alternative camera view.”
  • the presentable video component element may represent that the presentable video component is an alternative video component.
  • the presentable video component element may have a value of “Other alternative video component.”
  • the presentable video component element may represent that the presentable video component is a sign language inset.
  • the presentable video component element may have a value of “Sign language inset.”
  • the presentable video component element may represent that the presentable video component is a follow subject video.
  • the presentable video component element may have a value of “Follow subject video.”
  • the component element may include a presentable audio component element that indicates the role of a presentable audio component.
  • the presentable audio component element may represent that the presentable audio component is complete main.
  • the presentable audio component element may have a value of “Complete main.”
  • the presentable audio component element may represent that the presentable audio component is music.
  • the presentable audio component element may have a value of “Music.”
  • the presentable audio component element may represent that the presentable audio component is a dialog.
  • the presentable audio component element may have a value of “Dialog.”
  • the presentable audio component element may represent that the presentable audio component is effects.
  • the presentable audio component element may have a value of “Effects.”
  • the presentable audio component element may represent that the presentable audio component is intended for the visually impaired.
  • the presentable audio component element may have a value of “Visually impaired.”
  • the presentable audio component element may represent that the presentable audio component is intended for the hearing impaired.
  • the presentable audio component element may have a value of “Hearing impaired.”
  • the presentable audio component element may represent that the presentable audio component is a commentary.
  • the presentable audio component element may have a value of “Commentary.”
  • the component element may include a presentable closed captioning component element that indicates the role of a presentable closed captioning component.
  • the presentable closed captioning component element may represent that the presentable closed captioning component is a normal closed captioning.
  • the presentable closed captioning component element may have a value of “Normal.”
  • the presentable closed captioning component element may represent that the presentable closed captioning component is an easy-reader closed captioning for kindergarteners and elementary school students.
  • the presentable closed captioning component element may have a value of “Easy reader.”
  • the component element may include a presentable App component element that indicates the role of a presentable App-based enhancement component.
  • the presentable App component element may represent that the presentable App-based enhancement component is an application.
  • the presentable App component element may have a value of “Application.”
  • the presentable App component element may represent that the presentable App-based enhancement component is an NRT content item.
  • the presentable App component element may have a value of “NRT content item.”
  • the presentable App component element may represent that the presentable App-based enhancement component is an on demand component.
  • the presentable App component element may have a value of “On demand.”
  • the presentable App component element may represent that the presentable App-based enhancement component is a notification stream component.
  • the presentable App component element may have a value of “Notification Stream.”
  • the presentable App component element may represent that the presentable App-based enhancement component is a start-over component.
  • the start-over component refers to an application component that provides a function of viewing, at the request of a user, content from the beginning even after the content has started to be broadcast.
  • the presentable App component element may have a value of “Start-over.”
  • the presentable App component element may represent that the presentable App-based enhancement component is a companion screen component that may be consumed in the companion apparatus 300 interoperating with the broadcast receiving apparatus 100 .
  • the presentable App component element may have a value of “Companion-Screen.”
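  • Purely as an illustration of the type-classified structure described above, the role elements might appear as follows; all element names are assumptions made for readability.

    <!-- Hypothetical sketch: role elements classified by component type. -->
    <Component>
        <PresentableVideoComponent>Alternative camera view</PresentableVideoComponent>
    </Component>
    <Component>
        <PresentableAudioComponent>Complete main</PresentableAudioComponent>
    </Component>
    <Component>
        <PresentableCCComponent>Easy reader</PresentableCCComponent>
    </Component>
    <Component>
        <PresentableAppComponent>On demand</PresentableAppComponent>
    </Component>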
  • FIG. 89 shows an XML format of the aforementioned component element, the presentable video component element, presentable audio component element, presentable closed captioning component element and presentable App component element included in the component element.
  • the component element may include information indicating component types as attributes.
  • the component element may include an element value indicating the role of a component as a string, thereby facilitating future addition of roles of components.
  • the broadcast receiving apparatus 100 may display an element value indicating the role of a component to a user.
  • Component information included in ESG data may contain information indicating a device capability necessary to present components.
  • the aforementioned component fragment may include, as an element, information indicating a device capability necessary to present a component.
  • the aforementioned component element may include, as an element, information indicating a device capability necessary to present a component. This will be described with reference to the attached drawings.
  • FIG. 90 shows a syntax of a capability element according to another embodiment of the present invention.
  • the capability element may include a capability code that is a value indicating a capability.
  • the value indicating a capability may be referred to as CapabilityCodes. Meanings indicated by the capability code will be described with reference to FIG. 91 .
  • FIG. 91 shows values of the capability code element included in the capability element according to another embodiment of the present invention.
  • the capability code may represent that broadband connection is required for component presentation. Specifically, the capability code may represent that download through a broadband is required for component presentation. In this case, the capability code may have a value of 0x02.
  • the capability code may represent a device capability necessary for video rendering. Specifically, the capability code may represent that SD video presentation is required for component presentation. In this case, the capability code may have a value of 0x80. Specifically, the capability code may represent that HD video presentation is required for component presentation. In this case, the capability code may have a value of 0x81. Specifically, the capability code may represent that UHD video presentation is required for component presentation. In this case, the capability code may have a value of 0x82. Specifically, the capability code may represent that presentation of E-UHD video of 8K or higher is required for component presentation. In this case, the capability code may have a value of 0x83. Specifically, the capability code may represent that presentation of 3D video is required for component presentation.
  • In this case, the capability code may have a value of 0x90. Specifically, the capability code may represent that presentation of high dynamic range video is required for component presentation. In this case, the capability code may have a value of 0x91. Specifically, the capability code may represent that presentation of wide color gamut video is required for component presentation. In this case, the capability code may have a value of 0x92.
  • the capability code may represent a device capability required for audio rendering. Specifically, the capability code may represent that presentation of stereo (2-channel) audio is required for component presentation. In this case, the capability code may have a value of 0xA0. Specifically, the capability code may represent that presentation of 5.1 channel audio is required for component presentation. In this case, the capability code may have a value of 0xA1. Specifically, the capability code may represent that presentation of 3D audio is required for component presentation. In this case, the capability code may have a value of 0xA2. Specifically, the capability code may represent that dialog level adjustment is required for component presentation. In this case, the capability code may have a value of 0xB1.
  • the capability code may represent a device capability required for application rendering. Specifically, the capability code may represent that a personal video recorder (PVR) function is required for component presentation. Here, the capability code may have a value of 0xC0. Specifically, the capability code may represent that a download function is required for component presentation. Specifically, the download function may refer to downloading to a persistent storage. Here, the capability code may have a value of 0xC1. Specifically, the capability code may represent that a DRM processing function is required for component presentation. Here, the capability code may have a value of 0xC2. Specifically, the capability code may represent that a conditional access (CA) processing function is required for component presentation. Here, the capability code may have a value of 0xC3.
  • the capability element will be described with reference to FIG. 90 .
  • the capability element may include a capability string element indicating a string that represents a capability required for component presentation.
  • the broadcast receiving apparatus 100 may display a capability required for component presentation based on the capability string element. Specifically, the broadcast receiving apparatus 100 may display a string indicated by the capability string element.
  • the capability string element may be referred to as CapabilityString.
  • the capability element may include a category element that indicates the category of the capability code.
  • the category element may be referred to as “category”. This will be described with reference to FIG. 92 .
  • FIG. 92 shows values that may be represented by the category element of the capability element according to another embodiment of the present invention.
  • the category element may represent a download protocol required for component presentation.
  • the category element may have a value of 0x01.
  • the category element may represent a forward error correction (FEC) algorithm required for component presentation.
  • the category element may have a value of 0x02.
  • the category element may represent a wrapper/Archive format required for component presentation.
  • the category element may have a value of 0x03.
  • the category element may represent a compression algorithm required for component presentation.
  • the category element may have a value of 0x04.
  • the category element may represent a media type required for component presentation.
  • the category element may have a value of 0x05.
  • the category element may represent a rendering capability required for component presentation.
  • the category element may have a value of 0x06.
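  • Putting the above together, a capability element might look like the following hypothetical sketch; the layout and spellings are assumptions, with the code and category values taken from the lists above.

    <!-- Hypothetical sketch: a capability element requiring wide color gamut -->
    <!-- video rendering (capability code 0x92, rendering category 0x06). -->
    <Capability>
        <CapabilityCodes>0x92</CapabilityCodes>
        <CapabilityString>Wide color gamut video rendering required</CapabilityString>
        <category>0x06</category>
    </Capability>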
  • In hybrid broadcast, various services including various types of components may be delivered.
  • broadcast receiving apparatuses 100 have different presentation capabilities. Accordingly, the broadcast receiving apparatus 100 may display whether each component may be presented based on the capability element, as described above. Accordingly, the user may select content based on components included in services.
  • Broadcasters or content providers may sell content on a component-by-component basis. Specifically, broadcasters or content providers may separately sell some components included in a service. Specifically, broadcasters or content providers may sell components on a pay-per-view (PPV) basis. For example, broadcasters or content providers may provide base layer components of scalable video coding without charging and provide charged enhancement layer components. In addition, broadcasters or content providers may provide multi-view content while charging for components of some views. Furthermore, broadcasters or content providers may provide charged UHD components while providing HD components without charging. Broadcasters or content providers may provide charged stereo audio components. Broadcasters or content providers may provide audition-related content while charging for a vote application with respect to the audition-related content. To this end, the broadcast transmitting apparatus 10 needs to transmit charging information per component. In addition, the broadcast receiving apparatus 100 needs to display the charging information per component and provide an interface through which each component may be purchased. This will be described with reference to FIGS. 93 and 94 .
  • FIGS. 93 and 94 illustrate a user interface providing payment per component according to an embodiment of the present invention.
  • the broadcast transmitting apparatus 10 may include charging information on each component in the aforementioned component information and deliver the component information including the charging information. Specifically, the broadcast transmitting apparatus 10 may include charging information on each component in the aforementioned component element and deliver the component element. In addition, the broadcast transmitting apparatus 10 may include charging information on each component in the aforementioned component fragment and deliver the component fragment.
  • the broadcast receiving apparatus 100 may acquire the charging information on each component based on the component information. Specifically, the broadcast receiving apparatus 100 may acquire the charging information on each component based on the component fragment. In addition, the broadcast receiving apparatus 100 may acquire the charging information on each component based on the component element. The broadcast receiving apparatus 100 may display the charging information on each component in the service guide menu. Specifically, the broadcast receiving apparatus 100 may represent that a corresponding component needs to be purchased to be presented in the service guide menu. Furthermore, the broadcast receiving apparatus 100 may display purchase conditions of the corresponding component in the service guide menu. The purchase conditions may include a price. The purchase conditions may include a presentable period. In the embodiment shown in FIG. 93 , the broadcast receiving apparatus 100 displays the fact that a component including an alternative view of broadcast content needs to be purchased to be presented.
  • the broadcast receiving apparatus 100 may display the fact that a component included in content requires payment to be presented while presenting the content.
  • the broadcast receiving apparatus 100 may provide a user interface through which the component is purchased.
  • the broadcast receiving apparatus 100 may display, in the form of a message box, the fact that the component included in the content requires payment to be presented while presenting the content.
  • the broadcast receiving apparatus 100 may perform a procedure for purchasing the component based on user input.
  • the broadcast receiving apparatus 100 displays the fact that alternative views of a baseball game may be viewed upon payment. When a user input for the alternative views is received, the broadcast receiving apparatus 100 presents a component including the alternative views.
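  • The text does not define a concrete syntax for charging information, so the following sketch is purely hypothetical: a pay-per-view alternative view component whose purchase conditions include a price and a presentable period, as stated above.

    <!-- Hypothetical sketch: pay-per-view purchase conditions on a component. -->
    <!-- The Charging element, its attributes and the values are all assumed. -->
    <Component componentType="1">
        <ComponentRole>Alternative camera view</ComponentRole>
        <Charging price="1.99" currency="USD"
                  presentableFrom="2015-04-27T19:00:00Z"
                  presentableTo="2015-04-27T22:00:00Z"/>
    </Component>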
  • FIG. 95 illustrates an operation of the broadcast transmitting apparatus 10 according to an embodiment of the present invention.
  • the broadcast transmitting apparatus 10 acquires information about components (S 101 ).
  • the broadcast transmitting apparatus 10 may acquire the information about the components through the controller. Specifically, the broadcast transmitting apparatus 10 may acquire at least one of component types, device capabilities required for presentation of the components, roles of components, relationships with other components, information about services including the components, information about content including the components, information on target devices of the components, information on target users of the components, information on valid periods of the components, display start time of the components, display end time of the components and parental ratings of the components.
  • the broadcast transmitting apparatus 10 generates ESG data based on the information about the components (S 103 ).
  • the broadcast transmitting apparatus 10 may generate the ESG data based on the information about the component through the controller.
  • the broadcast transmitting apparatus 10 may generate the aforementioned component fragment based on the component.
  • the broadcast transmitting apparatus 10 may generate the aforementioned component element based on the components.
  • the broadcast transmitting apparatus 10 transmits a broadcast signal based on the information about the components (S 105 ).
  • the broadcast transmitting apparatus 10 may transmit the broadcast signal based on the information about the components through the transmitting unit.
  • the broadcast transmitting apparatus 10 may transmit a broadcast signal including the ESG data.
  • the broadcast transmitting apparatus 10 may transmit a broadcast signal including ESG data signaling information for ESG data reception.
  • the content server 50 may separately transmit the ESG data through a broadband.
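  • For illustration, the ESG data generated in step S 103 might carry a component element inside a content fragment along the following lines; all names are assumptions combining the structures described above, not the normative syntax.

    <!-- Hypothetical sketch: ESG data in which a content fragment carries -->
    <!-- a component element combining the structures described above. -->
    <Content id="content-001">
        <Component componentType="1">
            <ComponentRole>Follow subject video</ComponentRole>
            <Capability>
                <CapabilityCodes>0x81</CapabilityCodes>
                <CapabilityString>HD video rendering required</CapabilityString>
                <category>0x06</category>
            </Capability>
        </Component>
    </Content>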
  • FIG. 96 illustrates an operation of the broadcast receiving apparatus 100 according to an embodiment of the present invention.
  • the broadcast receiving apparatus 100 receives a broadcast signal (S 201 ).
  • the broadcast receiving apparatus 100 may receive the broadcast signal through the broadcast receiving unit 110 .
  • the broadcast receiving apparatus 100 acquires ESG data based on the broadcast signal (S 203 ).
  • the broadcast receiving apparatus 100 may acquire the ESG data based on the broadcast signal through the controller 150 .
  • the broadcast receiving apparatus 100 may acquire the ESG data from the broadcast signal.
  • the broadcast receiving apparatus 100 may extract the ESG data signaling information for ESG data reception from the broadcast signal.
  • the broadcast receiving apparatus 100 may acquire the ESG data from the content server 50 based on the ESG data signaling information.
  • the broadcast receiving apparatus 100 acquires information about components based on the ESG data (S 205 ).
  • the broadcast receiving apparatus 100 may acquire the information about the components based on the ESG data through the controller 150 .
  • the information about the components may include at least one of component types, device capabilities required for presentation of the components, roles of components, relationships with other components, information about services including the components, information about content including the components, information on target devices of the components, information on target users of the components, information on valid periods of the components, display start time of the components, display end time of the components and parental ratings of the components.
  • the broadcast receiving apparatus 100 displays the information about the components (S 207 ).
  • the broadcast receiving apparatus 100 may display the information about the components through the controller 150 .
  • the broadcast receiving apparatus 100 may display the information about the components in the service guide menu.
  • the broadcast receiving apparatus 100 may display the roles of components included in content in the service guide menu.
  • the broadcast receiving apparatus 100 may display charging information of the components included in the content in the service guide menu.
  • the broadcast receiving apparatus 100 may display the information about the components on a content presentation screen.
  • the broadcast receiving apparatus 100 may display the roles of the components in a message box positioned at the lower or upper part of the content presentation screen.
  • the broadcast receiving apparatus 100 may display the information about the components in a service list.
  • the broadcast receiving apparatus 100 receives a user input for a component (S 209 ).
  • the broadcast receiving apparatus 100 may receive a user input for a component through the controller 150 .
  • the broadcast receiving apparatus 100 may receive a user input for reservation viewing of a component.
  • the broadcast receiving apparatus 100 may receive a user input for reservation recording of a component.
  • the broadcast receiving apparatus 100 may receive a user input for viewing a component.
  • the broadcast receiving apparatus 100 may receive a user input for recording a component.
  • the broadcast receiving apparatus 100 presents a component based on the user input (S 211 ).
  • the broadcast receiving apparatus 100 may present a component based on the user input through the controller 150 .
  • the broadcast receiving apparatus 100 may present a component corresponding to the user input.
  • the broadcast receiving apparatus 100 may present the component corresponding to the user input at a reservation viewing time.
  • the broadcast receiving apparatus 100 may immediately present the component corresponding to the user input.
  • the broadcast receiving apparatus 100 may immediately record the component corresponding to the user input.
  • the broadcast receiving apparatus 100 may record the component corresponding to the user input at a reservation recording time. Specific embodiments of the operation of the broadcast receiving apparatus 100 will be described with reference to the attached drawings.
  • FIG. 97 illustrates a content presentation screen of the broadcast receiving apparatus according to an embodiment of the present invention.
  • a service or content of hybrid broadcast may include a plurality of components.
  • the plurality of components may be simultaneously presented.
  • FIG. 97 shows simultaneous presentation of a component including a movie, a component including a sign language inset displayed on part of the screen, and a component including a follow subject video.
  • FIG. 98 shows a service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • the broadcast receiving apparatus 100 may display information about components. Specifically, the broadcast receiving apparatus 100 may display information about components included in a service. Specifically, the broadcast receiving apparatus 100 may display information about components included in content. In a specific embodiment, the broadcast receiving apparatus 100 may display information about components in the service guide menu. For example, the broadcast receiving apparatus 100 may display the roles of components included in content, as shown in FIG. 98( b ) . The broadcast receiving apparatus 100 may display information about components in the service guide menu based on user input. For example, the broadcast receiving apparatus 100 may display the service guide menu without displaying component information, as shown in FIG. 98( a ) .
  • the broadcast receiving apparatus 100 may display information about components in the service guide menu as shown in FIG. 98( b ) .
  • the broadcast receiving apparatus 100 may display components based on content data type.
  • the broadcast receiving apparatus 100 may display information about components including data of a type selected by a user. For example, when a user input for selecting a video component is received, the broadcast receiving apparatus 100 may display the roles of video components in the service guide menu.
  • the broadcast receiving apparatus 100 may display the roles of audio components in the service guide menu.
  • the broadcast receiving apparatus 100 may display the roles of closed captioning components in the service guide menu.
  • the broadcast receiving apparatus 100 may display the roles of App-based enhancement components in the service guide menu. Accordingly, the user may select content based on components included in content or services.
  • FIGS. 99 to 105 illustrate reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • the broadcast receiving apparatus 100 may display information about video components included in content through the service guide menu.
  • the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a component including a sign language inset through the service guide menu.
  • the broadcast receiving apparatus 100 may display setting of reservation viewing in the service guide menu.
  • the broadcast receiving apparatus 100 may present the corresponding content at a reservation viewing time.
  • the broadcast receiving apparatus 100 may cause the companion apparatus 300 to present the component for which the user sets reservation viewing. Specifically, the broadcast receiving apparatus 100 may deliver information about the component for which reservation viewing is set to the companion apparatus 300 . Here, the companion apparatus 300 may present the component based on the information about the component for which reservation viewing is set.
  • the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including an alternative view component through the service guide menu.
  • the broadcast receiving apparatus 100 may display setting of reservation viewing in the service guide menu.
  • the broadcast receiving apparatus 100 may deliver information about the component to the companion apparatus 300 .
  • the companion apparatus 300 may present the corresponding content at a reservation viewing time.
  • the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a follow subject video component through the service guide menu.
  • the broadcast receiving apparatus 100 may display information about audio components included in content in the service guide menu.
  • the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a music component through the service guide menu. As shown in FIG. 103 , the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a component including a dialog level adjustment function through the service guide menu. As shown in FIG. 104 , the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a component for the visually impaired through the service guide menu. The broadcast receiving apparatus 100 may display setting of reservation viewing through the service guide menu. In addition, the broadcast receiving apparatus 100 may present corresponding content at a reservation viewing time.
  • the broadcast receiving apparatus 100 may display information about closed captioning components included in content in the service guide menu.
  • the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a closed captioning component for kindergarteners and elementary school students through the service guide menu.
  • the broadcast receiving apparatus 100 may display setting of reservation viewing in the service guide menu.
  • the broadcast receiving apparatus 100 may present the corresponding content at a reservation viewing time.
  • the broadcast receiving apparatus 100 may display information about App-based enhancement components included in content in the service guide menu.
  • the broadcast receiving apparatus 100 may provide information about various components provided by hybrid broadcast based on ESG data. Accordingly, the broadcast receiving apparatus 100 may enhance convenience of selection of services or content by a user. Particularly, the broadcast receiving apparatus 100 may provide information about content or services scheduled to be broadcast in the future as well as currently presented services or content such that the user may easily select content or services.

Abstract

A broadcast receiving apparatus receiving a broadcast signal is disclosed. The broadcast receiving apparatus includes a broadcast receiving unit configured to receive a broadcast signal and a controller configured to receive electronic service guide (ESG) data including information about a broadcast service guide based on the broadcast signal and to acquire information about a component included in at least one of a broadcast service and broadcast content based on the ESG data.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is the National Stage filing under 35 U.S.C. 371 of International Application No. PCT/KR2015/004162, filed on Apr. 27, 2015, which claims the benefit of U.S. Provisional Application No. 61/984,854, filed on Apr. 27, 2014, U.S. Provisional Application No. 61/991,624, filed on May 12, 2014, U.S. Provisional Application No. 62/000,515, filed on May 19, 2014, and U.S. Provisional Application No. 62/003,039, filed on May 27, 2014, the contents of which are all hereby incorporated by reference herein in their entirety.
  • TECHNICAL FIELD
  • The present invention relates to a broadcast transmitting apparatus, a method of operating the broadcast transmitting apparatus, a broadcast receiving apparatus and a method of operating the broadcast receiving apparatus.
  • BACKGROUND ART
  • With the development of digital broadcast environments and communication environments, hybrid broadcast using a broadband as well as broadcast networks is in the spotlight. Furthermore, such hybrid broadcast provides applications or broadcast services interoperating with terminal devices such as a smartphone and a tablet.
  • However, the MPEG-2 (Moving Picture Experts Group-2) transport stream (TS) used to transmit conventional broadcast streams was established before hybrid broadcast became active. Accordingly, MPEG-2 TS does not consider hybrid broadcast and thus its use for hybrid broadcast is restricted. Specifically, MPEG-2 TS does not provide sufficient extensibility, and thus it is inefficient to transmit data used in a broadband through MPEG-2 TS. For example, to transmit IP packet data used in the broadband through MPEG-2 TS, IP packets need to be encapsulated into MPEG-2 TS. Furthermore, when MPEG-2 TS is used, a broadcast receiving apparatus needs to process both MPEG-2 TS packets and IP packets in order to support hybrid broadcast. Therefore, there is a need for a new broadcast transmission format having extendibility and efficiency for hybrid broadcast.
  • Additionally, an electronic service guide (ESG) displays broadcast services and programs provided by the broadcast services. The ESG may also be referred to as an electronic program guide. Specifically, the ESG displays the start time, end time, title, content summary, recommended rating, genre, appearance information and so on of a program. In the case of hybrid broadcast, however, broadcast services can be provided along with applications. Furthermore, hybrid broadcast services include various media components, and a broadcast receiving apparatus can selectively reproduce media components. In addition, the broadcast receiving apparatus can not only provide programs according to a predetermined schedule but also provide programs at the request of a user. Accordingly, the broadcast receiving apparatus needs to provide an ESG capable of effectively delivering the content of hybrid broadcast services that provide complicated and various contents. To this end, a broadcast transmitting apparatus needs to transmit ESG data in a new format. Furthermore, the broadcast receiving apparatus needs to receive and display ESG data in the new format.
  • DISCLOSURE Technical Problem
  • An object of the present invention is to provide a broadcast transmitting apparatus for transmitting ESG data for hybrid broadcast, a method of operating the broadcast transmitting apparatus, a broadcast receiving apparatus for receiving ESG data for hybrid broadcast and a method of operating the broadcast receiving apparatus.
  • Technical Solution
  • A broadcast receiving apparatus according to an embodiment of the present invention includes: a broadcast receiving unit configured to receive a broadcast signal; and a controller configured to receive electronic service guide (ESG) data including information about a broadcast service guide based on the broadcast signal and to acquire information about a component included in at least one of a broadcast service and broadcast content based on the ESG data.
  • The information about the component may include device capability information indicating a device capability required to present the component.
  • The controller may display the information about the component based on the device capability information.
  • The controller may discriminately display information about a component presentable by the broadcast receiving apparatus and information about a component unpresentable by the broadcast receiving apparatus.
  • The information about the component may include reference information indicating inclusion relationships between the component and other components, between the component and the broadcast content and between the component and the broadcast service.
  • The information about the component may include association information indicating an associated component.
  • The associated component may represent a component presented along with the component.
  • The ESG data may be divided into fragments corresponding to information units and include a service fragment including information about the broadcast service and a content fragment including information about content included in the broadcast service.
  • The information about the content may be a content fragment included in the ESG data.
  • The information about the component may be a component element included in the content fragment.
  • The information about the component may include charging information about the component.
  • The controller may display the information about the component in a service guide menu.
  • The controller may display the role of the component in the service guide menu.
  • The controller may display the charging information about the component in the service guide menu.
  • The controller may display the information about the component based on types of data included in the component.
  • The controller may display the information about the component including data of a type selected by a user.
  • A method of operating a broadcast receiving apparatus according to an embodiment of the present invention includes: receiving a broadcast signal; receiving ESG data including information about a broadcast service guide based on the broadcast signal; and acquiring information about a component included in at least one of a broadcast service and broadcast content based on the ESG data.
  • A broadcast transmitting apparatus according to an embodiment of the present invention includes: a controller configured to acquire information about a component included in at least one of a broadcast service and broadcast content and to generate ESG data including information about a broadcast service guide based on the information about the component; and a transmitting unit configured to transmit a broadcast signal based on the ESG data.
  • Advantageous Effects
  • An embodiment of the present invention provides a broadcast transmitting apparatus for transmitting ESG data for hybrid broadcast, a method of operating the broadcast transmitting apparatus, a broadcast receiving apparatus for receiving ESG data for hybrid broadcast and a method of operating the broadcast receiving apparatus.
  • DESCRIPTION OF DRAWINGS
  • FIG. 1 illustrates a structure of an apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention.
  • FIG. 2 illustrates an input formatting block according to one embodiment of the present invention.
  • FIG. 3 illustrates an input formatting block according to another embodiment of the present invention.
  • FIG. 4 illustrates a BICM block according to an embodiment of the present invention.
  • FIG. 5 illustrates a BICM block according to another embodiment of the present invention.
  • FIG. 6 illustrates a frame building block according to one embodiment of the present invention.
  • FIG. 7 illustrates an orthogonal frequency division multiplexing (OFDM) generation block according to an embodiment of the present invention.
  • FIG. 8 illustrates a structure of an apparatus for receiving broadcast signals for future broadcast services according to an embodiment of the present invention.
  • FIG. 9 illustrates a frame structure according to an embodiment of the present invention.
  • FIG. 10 illustrates a signaling hierarchy structure of the frame according to an embodiment of the present invention.
  • FIG. 11 illustrates preamble signaling data according to an embodiment of the present invention.
  • FIG. 12 illustrates PLS1 data according to an embodiment of the present invention.
  • FIG. 13 illustrates PLS2 data according to an embodiment of the present invention.
  • FIG. 14 illustrates PLS2 data according to another embodiment of the present invention.
  • FIG. 15 illustrates a logical structure of a frame according to an embodiment of the present invention.
  • FIG. 16 illustrates PLS mapping according to an embodiment of the present invention.
  • FIG. 17 illustrates EAC mapping according to an embodiment of the present invention.
  • FIG. 18 illustrates FIC mapping according to an embodiment of the present invention.
  • FIG. 19 illustrates a type of DP according to an embodiment of the present invention.
  • FIG. 20 illustrates a time interleaving according to an embodiment of the present invention.
  • FIG. 21 illustrates the basic operation of a twisted row-column block interleaver according to an embodiment of the present invention.
  • FIG. 22 illustrates an operation of a twisted row-column block interleaver according to another embodiment of the present invention.
  • FIG. 23 illustrates a diagonal-wise reading pattern of a twisted row-column block interleaver according to an embodiment of the present invention.
  • FIG. 24 illustrates interleaved XFECBLOCKs from each interleaving array according to an embodiment of the present invention.
  • FIG. 25 illustrates a protocol stack for broadcast service provision according to an embodiment of the present invention.
  • FIG. 26 is a block diagram of a broadcast transmitting apparatus for transmitting broadcast services, a content server for transmitting content associated with broadcast services, a broadcast receiving apparatus for receiving broadcast services and a companion apparatus interoperating with the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 27 illustrates values of serviceType element included in a service fragment and service types indicated by the values according to an embodiment of the present invention.
  • FIG. 28 illustrates an XML format of the serviceType element included in the service fragment according to an embodiment of the present invention.
  • FIG. 29 illustrates XML data indicating the serviceType element and a user interface when the service indicated by the service fragment is a linear service according to an embodiment of the present invention.
  • FIG. 30 illustrates XML data indicating the serviceType element and a user interface when the service indicated by the service fragment is a linear application-based service according to an embodiment of the present invention.
  • FIG. 31 illustrates XML data indicating the serviceType element and a user interface when the service indicated by the service fragment is a linear companion screen service according to an embodiment of the present invention.
  • FIG. 32 illustrates an XML format of a component fragment according to an embodiment of the present invention.
  • FIG. 33 illustrates component types that can be indicated by the component fragment according to an embodiment of the present invention.
  • FIG. 34 illustrates an XML format of a ComponentRangeType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 35 illustrates an XML format of a ComponentData element included in the component fragment according to an embodiment of the present invention.
  • FIG. 36 illustrates an XML format of a VideoDataType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 37 illustrates an XML format of an AudioDataType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 38 illustrates an XML format of a CCDataType element included in the component fragment according to an embodiment of the present invention.
  • FIG. 39 illustrates an embodiment in which the component fragment according to an embodiment of the present invention represents a composite video component.
  • FIG. 40 illustrates another embodiment in which the component fragment according to an embodiment of the present invention represents a composite video component.
  • FIG. 41 illustrates another embodiment in which the component fragment according to an embodiment of the present invention represents a PickOne audio component.
  • FIG. 42 illustrates an XML format of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • FIG. 43 illustrates an XML format of an embodiment of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • FIG. 44 illustrates an XML format of an embodiment of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • FIG. 45 illustrates an XML format of component fragments when a component fragment representing an audio component and a component fragment representing a closed captioning component refer to a fragment representing an associated video component according to an embodiment of the present invention.
  • FIG. 46 illustrates a relationship among component fragments when a component fragment representing an audio component and a component fragment representing a closed captioning component refer to a fragment representing an associated video component according to an embodiment of the present invention.
  • FIG. 47 illustrates an XML format of component fragments when a component fragment representing a video component refers to a component fragment representing an audio component and a component fragment representing an associated closed captioning component according to an embodiment of the present invention.
  • FIG. 48 illustrates a relationship among component fragments when a component fragment representing a video component refers to a component fragment representing an audio component and a component fragment representing an associated closed captioning component according to an embodiment of the present invention.
  • FIG. 49 illustrates a reference relationship among fragments according to an embodiment of the present invention.
  • FIG. 50 illustrates an XML format of a component fragment when the component fragment refers to a higher component fragment, a content fragment and a service fragment.
  • FIG. 51 illustrates an XML format of a schedule fragment when the schedule fragment refers to a component fragment, a content fragment and a service fragment according to an embodiment of the present invention.
  • FIG. 52 illustrates a reference relationship among a service fragment, a content fragment and a component fragment representing a presentable video component, a presentable audio component and a presentable closed captioning component according to an embodiment of the present invention.
  • FIG. 53 illustrates a reference relationship between a component fragment representing a composite component and a component fragment representing a lower component according to an embodiment of the present invention.
  • FIG. 54 illustrates a reference relationship between a component fragment representing an App-based enhancement component and a component fragment representing a lower component according to an embodiment of the present invention.
  • FIG. 55 illustrates an XML format of a content fragment when the content fragment refers to a service according to another embodiment of the present invention.
  • FIG. 56 illustrates a reference relationship among content fragments and a service fragment according to another embodiment of the present invention.
  • FIG. 57 illustrates a reference relationship among fragments according to another embodiment of the present invention.
  • FIG. 58 illustrates an XML format of a service fragment according to another embodiment of the present invention.
  • FIG. 59 illustrates an XML format of a content fragment according to another embodiment of the present invention.
  • FIG. 60 illustrates an XML format of a component fragment according to another embodiment of the present invention.
  • FIG. 61 illustrates a reference relationship among a service fragment, a content fragment and a component fragment according to another embodiment of the present invention.
  • FIG. 62 illustrates a reference relationship between a component fragment representing a composite component and a lower component according to another embodiment of the present invention.
  • FIG. 63 illustrates a reference relationship between a component fragment representing an App-based enhancement component and a component fragment representing a lower component according to another embodiment of the present invention.
  • FIG. 64 illustrates a syntax of a component fragment according to another embodiment of the present invention.
  • FIG. 65 illustrates a syntax of a component fragment according to another embodiment of the present invention.
  • FIG. 66 illustrates a syntax of a component fragment according to another embodiment of the present invention.
  • FIG. 67 illustrates an XML format of a component fragment according to another embodiment of the present invention.
  • FIG. 68 illustrates an XML format of ComponentRangeType included in a component fragment according to another embodiment of the present invention.
  • FIG. 69 illustrates an XML format of ComponentRoleRangeType included in the component fragment according to another embodiment of the present invention.
  • FIG. 70 illustrates a relationship between the component fragment according to another embodiment of the present invention and a composite video component using scalable video encoding and components included in the composite video component.
  • FIG. 71 illustrates a relationship between the component fragment according to another embodiment of the present invention and a composite video component including 3D video and components included in the composite video component.
  • FIG. 72 illustrates a relationship between the component fragment according to another embodiment of the present invention and a PickOne audio component and components included in the PickOne audio component.
  • FIG. 73 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 74 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 75 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 76 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 77 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 78 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 79 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 80 illustrates an XML format of a component element according to another embodiment of the present invention.
  • FIG. 81 illustrates display of a component included in content by a broadcast receiving apparatus according to a capability element included in a component element according to another embodiment of the present invention.
  • FIG. 82 illustrates display of a component included in content by the broadcast receiving apparatus according to the capability element included in the component element according to another embodiment of the present invention.
  • FIG. 83 illustrates display of a component included in content by the broadcast receiving apparatus according to the capability element included in the component element according to another embodiment of the present invention.
  • FIG. 84 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 85 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 86 illustrates an XML format of the component element according to another embodiment of the present invention.
  • FIG. 87 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 88 illustrates a syntax of a component element according to another embodiment of the present invention.
  • FIG. 89 illustrates an XML format of the component element according to another embodiment of the present invention.
  • FIG. 90 illustrates a syntax of a capability element according to another embodiment of the present invention.
  • FIG. 91 illustrates values of a capability code element included in the capability element according to another embodiment of the present invention.
  • FIG. 92 illustrates values of a category element included in the capability element according to another embodiment of the present invention.
  • FIG. 93 illustrates a user interface providing payment per component according to an embodiment of the present invention.
  • FIG. 94 illustrates a user interface providing payment per component according to an embodiment of the present invention.
  • FIG. 95 illustrates an operation of a broadcast transmitting apparatus according to an embodiment of the present invention.
  • FIG. 96 illustrates an operation of a broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 97 illustrates a content presentation screen of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 98 illustrates a service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 99 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 100 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 101 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 102 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 103 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 104 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • FIG. 105 illustrates reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • BEST MODE
  • Hereinafter, embodiments of the present invention will be described in more detail with reference to the accompanying drawings, in order to allow those skilled in the art to easily practice the present invention. The present invention may be realized in different forms and is not limited to the embodiments described herein. Moreover, detailed descriptions related to well-known functions or configurations will be omitted in order not to unnecessarily obscure the subject matter of the present invention. Like reference numerals refer to like elements throughout.
  • In addition, when a part “includes” certain components, this means that the part may further include other components and does not exclude them unless specifically stated otherwise.
  • The present invention provides a broadcast signal transmitting/receiving apparatus and method. According to an embodiment of the present invention, the future broadcast services include a terrestrial broadcast service, a mobile broadcast service, and a UHDTV service. The present invention may process broadcast signals for the future broadcast services through non-MIMO (Multiple Input Multiple Output) or MIMO according to one embodiment. A non-MIMO scheme according to an embodiment of the present invention may include a MISO (Multiple Input Single Output) scheme, a SISO (Single Input Single Output) scheme, etc.
  • While MISO or MIMO is described below using two antennas for convenience of description, the present invention is applicable to systems using two or more antennas. The present invention may define three physical layer (PHY) profiles (base, handheld and advanced profiles), each optimized to minimize receiver complexity while attaining the performance required for a particular use case. The PHY profiles are subsets of all configurations that a corresponding receiver should implement.
  • The three PHY profiles share most of the functional blocks but differ slightly in specific blocks and/or parameters. Additional PHY profiles can be defined in the future. For the system evolution, future profiles can also be multiplexed with the existing profiles in a single RF channel through a future extension frame (FEF). The details of each PHY profile are described below.
  • 1. Base Profile
  • The base profile represents a main use case for fixed receiving devices that are usually connected to a roof-top antenna. The base profile also includes portable devices that could be transported to a place but belong to a relatively stationary reception category. Use of the base profile could be extended to handheld devices or even vehicular by some improved implementations, but those use cases are not expected for the base profile receiver operation.
  • The target SNR range of reception is from approximately 10 to 20 dB, which includes the 15 dB SNR reception capability of the existing broadcast system (e.g. ATSC A/53). The receiver complexity and power consumption are not as critical as in the battery-operated handheld devices, which will use the handheld profile. Key system parameters for the base profile are listed in below table 1.
  • TABLE 1
    LDPC codeword length              16K, 64K bits
    Constellation size                4~10 bpcu (bits per channel use)
    Time de-interleaving memory size  ≤ 2^19 data cells
    Pilot patterns                    Pilot pattern for fixed reception
    FFT size                          16K, 32K points
  • 2. Handheld Profile
  • The handheld profile is designed for use in handheld and vehicular devices that operate with battery power. The devices can be moving with pedestrian or vehicle speed. The power consumption as well as the receiver complexity is very important for the implementation of the devices of the handheld profile. The target SNR range of the handheld profile is approximately 0 to 10 dB, but can be configured to reach below 0 dB when intended for deeper indoor reception.
  • In addition to low SNR capability, resilience to the Doppler Effect caused by receiver mobility is the most important performance attribute of the handheld profile. Key system parameters for the handheld profile are listed in the below table 2.
  • TABLE 2
    LDPC codeword length              16K bits
    Constellation size                2~8 bpcu
    Time de-interleaving memory size  ≤ 2^18 data cells
    Pilot patterns                    Pilot patterns for mobile and indoor reception
    FFT size                          8K, 16K points
  • 3. Advanced Profile
  • The advanced profile provides highest channel capacity at the cost of more implementation complexity. This profile requires using MIMO transmission and reception, and UHDTV service is a target use case for which this profile is specifically designed. The increased capacity can also be used to allow an increased number of services in a given bandwidth, e.g., multiple SDTV or HDTV services.
  • The target SNR range of the advanced profile is approximately 20 to 30 dB. MIMO transmission may initially use existing elliptically-polarized transmission equipment, with extension to full-power cross-polarized transmission in the future. Key system parameters for the advanced profile are listed in below table 3.
  • TABLE 3
    LDPC codeword length              16K, 64K bits
    Constellation size                8~12 bpcu
    Time de-interleaving memory size  ≤ 2^19 data cells
    Pilot patterns                    Pilot pattern for fixed reception
    FFT size                          16K, 32K points
  • In this case, the base profile can be used as a profile for both the terrestrial broadcast service and the mobile broadcast service. That is, the base profile can be used to define a concept of a profile which includes the mobile profile. Also, the advanced profile can be divided into an advanced profile for a base profile with MIMO and an advanced profile for a handheld profile with MIMO. Moreover, the three profiles can be changed according to the intention of the designer.
  • The following terms and definitions may apply to the present invention. The following terms and definitions can be changed according to design.
  • auxiliary stream: sequence of cells carrying data of as yet undefined modulation and coding, which may be used for future extensions or as required by broadcasters or network operators
  • base data pipe: data pipe that carries service signaling data
  • baseband frame (or BBFRAME): set of Kbch bits which form the input to one FEC encoding process (BCH and LDPC encoding)
  • cell: modulation value that is carried by one carrier of the OFDM transmission
  • coded block: LDPC-encoded block of PLS1 data or one of the LDPC-encoded blocks of PLS2 data
  • data pipe: logical channel in the physical layer that carries service data or related metadata, which may carry one or multiple service(s) or service component(s).
  • data pipe unit: a basic unit for allocating data cells to a DP in a frame.
  • data symbol: OFDM symbol in a frame which is not a preamble symbol (the frame signaling symbol and the frame edge symbol are included in the data symbols)
  • DP_ID: this 8 bit field identifies uniquely a DP within the system identified by the SYSTEM_ID
  • dummy cell: cell carrying a pseudorandom value used to fill the remaining capacity not used for PLS signaling, DPs or auxiliary streams
  • emergency alert channel: part of a frame that carries EAS information data
  • frame: physical layer time slot that starts with a preamble and ends with a frame edge symbol
  • frame repetition unit: a set of frames belonging to the same or different physical layer profiles, including a FEF, which is repeated eight times in a super-frame
  • fast information channel: a logical channel in a frame that carries the mapping information between a service and the corresponding base DP
  • FECBLOCK: set of LDPC-encoded bits of a DP data
  • FFT size: nominal FFT size used for a particular mode, equal to the active symbol period Ts expressed in cycles of the elementary period T
  • frame signaling symbol: OFDM symbol with higher pilot density used at the start of a frame in certain combinations of FFT size, guard interval and scattered pilot pattern, which carries a part of the PLS data
  • frame edge symbol: OFDM symbol with higher pilot density used at the end of a frame in certain combinations of FFT size, guard interval and scattered pilot pattern
  • frame-group: the set of all the frames having the same PHY profile type in a super-frame.
  • future extension frame: physical layer time slot within the super-frame that could be used for future extension, which starts with a preamble
  • Futurecast UTB system: proposed physical layer broadcasting system, of which the input is one or more MPEG2-TS or IP or general stream(s) and of which the output is an RF signal
  • input stream: A stream of data for an ensemble of services delivered to the end users by the system.
  • normal data symbol: data symbol excluding the frame signaling symbol and the frame edge symbol
  • PHY profile: subset of all configurations that a corresponding receiver should implement
  • PLS: physical layer signaling data consisting of PLS1 and PLS2
  • PLS1: a first set of PLS data carried in the FSS symbols having a fixed size, coding and modulation, which carries basic information about the system as well as the parameters needed to decode the PLS2
  • NOTE: PLS1 data remains constant for the duration of a frame-group.
  • PLS2: a second set of PLS data transmitted in the FSS symbol, which carries more detailed PLS data about the system and the DPs
  • PLS2 dynamic data: PLS2 data that may dynamically change frame-by-frame
  • PLS2 static data: PLS2 data that remains static for the duration of a frame-group
  • preamble signaling data: signaling data carried by the preamble symbol and used to identify the basic mode of the system
  • preamble symbol: fixed-length pilot symbol that carries basic PLS data and is located in the beginning of a frame
  • NOTE: The preamble symbol is mainly used for fast initial band scan to detect the system signal, its timing, frequency offset, and FFT size.
  • reserved for future use: not defined by the present document but may be defined in future
  • superframe: set of eight frame repetition units
  • time interleaving block (TI block): set of cells within which time interleaving is carried out, corresponding to one use of the time interleaver memory
  • TI group: unit over which dynamic capacity allocation for a particular DP is carried out, made up of an integer, dynamically varying number of XFECBLOCKs.
  • NOTE: The TI group may be mapped directly to one frame or may be mapped to multiple frames. It may contain one or more TI blocks.
  • Type 1 DP: DP of a frame where all DPs are mapped into the frame in TDM fashion
  • Type 2 DP: DP of a frame where all DPs are mapped into the frame in FDM fashion
  • XFECBLOCK: set of Ncells cells carrying all the bits of one LDPC FECBLOCK
  • FIG. 1 illustrates a structure of an apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention.
  • The apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention can include an input formatting block 1000, a BICM (Bit interleaved coding & modulation) block 1010, a frame structure block 1020, an OFDM (Orthogonal Frequency Division Multiplexing) generation block 1030 and a signaling generation block 1040. A description will be given of the operation of each module of the apparatus for transmitting broadcast signals.
  • IP stream/packets and MPEG2-TS are the main input formats, other stream types are handled as General Streams. In addition to these data inputs, Management Information is input to control the scheduling and allocation of the corresponding bandwidth for each input stream. One or multiple TS stream(s), IP stream(s) and/or General Stream(s) inputs are simultaneously allowed.
  • The input formatting block 1000 can demultiplex each input stream into one or multiple data pipe(s), to each of which an independent coding and modulation is applied. The data pipe (DP) is the basic unit for robustness control, thereby affecting quality-of-service (QoS). One or multiple service(s) or service component(s) can be carried by a single DP. Details of operations of the input formatting block 1000 will be described later.
  • The data pipe is a logical channel in the physical layer that carries service data or related metadata, which may carry one or multiple service(s) or service component(s).
  • Also, the data pipe unit is a basic unit for allocating data cells to a DP in a frame.
  • In the BICM block 1010, parity data is added for error correction and the encoded bit streams are mapped to complex-value constellation symbols. The symbols are interleaved across a specific interleaving depth that is used for the corresponding DP. For the advanced profile, MIMO encoding is performed in the BICM block 1010 and the additional data path is added at the output for MIMO transmission. Details of operations of the BICM block 1010 will be described later.
  • The Frame Building block 1020 can map the data cells of the input DPs into the OFDM symbols within a frame. After mapping, the frequency interleaving is used for frequency-domain diversity, especially to combat frequency-selective fading channels. Details of operations of the Frame Building block 1020 will be described later.
  • After inserting a preamble at the beginning of each frame, the OFDM Generation block 1030 can apply conventional OFDM modulation having a cyclic prefix as guard interval. For antenna space diversity, a distributed MISO scheme is applied across the transmitters. In addition, a Peak-to-Average Power Reduction (PAPR) scheme is performed in the time domain. For flexible network planning, this proposal provides a set of various FFT sizes, guard interval lengths and corresponding pilot patterns. Details of operations of the OFDM Generation block 1030 will be described later.
  • The Signaling Generation block 1040 can create physical layer signaling information used for the operation of each functional block. This signaling information is also transmitted so that the services of interest are properly recovered at the receiver side. Details of operations of the Signaling Generation block 1040 will be described later.
  • FIGS. 2, 3 and 4 illustrate the input formatting block 1000 according to embodiments of the present invention. A description will be given of each figure.
  • FIG. 2 illustrates an input formatting block according to one embodiment of the present invention. FIG. 2 shows an input formatting module when the input signal is a single input stream.
  • The input formatting block illustrated in FIG. 2 corresponds to an embodiment of the input formatting block 1000 described with reference to FIG. 1.
  • The input to the physical layer may be composed of one or multiple data streams. Each data stream is carried by one DP. The mode adaptation modules slice the incoming data stream into data fields of the baseband frame (BBF). The system supports three types of input data streams: MPEG2-TS, Internet protocol (IP) and Generic stream (GS). MPEG2-TS is characterized by fixed length (188 byte) packets with the first byte being a sync-byte (0x47). An IP stream is composed of variable length IP datagram packets, as signaled within IP packet headers. The system supports both IPv4 and IPv6 for the IP stream. GS may be composed of variable length packets or constant length packets, signaled within encapsulation packet headers.
  • (a) shows a mode adaptation block 2000 and a stream adaptation 2010 for a single DP and (b) shows a PLS generation block 2020 and a PLS scrambler 2030 for generating and processing PLS data. A description will be given of the operation of each block.
  • The Input Stream Splitter splits the input TS, IP, GS streams into multiple service or service component (audio, video, etc.) streams. The mode adaptation block 2000 is comprised of a CRC Encoder, a BB (baseband) Frame Slicer, and a BB Frame Header Insertion block.
  • The CRC Encoder provides three kinds of CRC encoding for error detection at the user packet (UP) level, i.e., CRC-8, CRC-16, and CRC-32. The computed CRC bytes are appended after the UP. CRC-8 is used for TS stream and CRC-32 for IP stream. If the GS stream doesn't provide the CRC encoding, the proposed CRC encoding should be applied.
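  • For illustration, a minimal sketch of the UP-level CRC computation is given below. The bitwise algorithm is standard; the CRC-8 generator polynomial (0xD5) is an assumption for the example and is not taken from the present document.

```python
def crc8(payload: bytes, poly: int = 0xD5) -> int:
    """Bitwise MSB-first CRC-8 over one user packet (UP).
    The polynomial 0xD5 is an illustrative assumption."""
    crc = 0x00
    for byte in payload:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

# The computed CRC byte is appended after the UP:
up = bytes.fromhex("470011aa")
encoded_up = up + bytes([crc8(up)])
```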
  • BB Frame Slicer maps the input into an internal logical-bit format. The first received bit is defined to be the MSB. The BB Frame Slicer allocates a number of input bits equal to the available data field capacity. To allocate a number of input bits equal to the BBF payload, the UP packet stream is sliced to fit the data field of BBF.
  • The BB Frame Header Insertion block can insert a fixed-length BBF header of 2 bytes in front of the BB frame. The BBF header is composed of STUFFI (1 bit), SYNCD (13 bits), and RFU (2 bits). In addition to the fixed 2-byte BBF header, the BBF can have an extension field (1 or 3 bytes) at the end of the 2-byte BBF header.
  • The stream adaptation 2010 is comprised of a stuffing insertion block and a BB scrambler. The stuffing insertion block can insert a stuffing field into the payload of a BB frame. If the input data to the stream adaptation is sufficient to fill a BB frame, STUFFI is set to ‘0’ and the BBF has no stuffing field. Otherwise, STUFFI is set to ‘1’ and the stuffing field is inserted immediately after the BBF header. The stuffing field comprises a two-byte stuffing field header and a variable amount of stuffing data.
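  • A minimal sketch of this stuffing decision follows. The exact stuffing-field header layout is an assumption here (2 bytes carrying the stuffing length), and the sketch assumes a shortfall of at least 2 bytes.

```python
def build_bbf_data_field(data: bytes, field_len: int) -> tuple[int, bytes]:
    """Return (STUFFI, data field). If the input fills the BB frame,
    STUFFI = 0 and no stuffing field is present; otherwise STUFFI = 1 and
    a stuffing field (hypothetical 2-byte header + filler bytes) is placed
    immediately after the BBF header, i.e. at the start of the data field."""
    if len(data) >= field_len:
        return 0, data[:field_len]
    pad = field_len - len(data)                       # assumed >= 2 here
    stuffing = pad.to_bytes(2, "big") + b"\x00" * (pad - 2)
    return 1, stuffing + data
```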
  • The BB scrambler scrambles complete BBF for energy dispersal. The scrambling sequence is synchronous with the BBF. The scrambling sequence is generated by the feed-back shift register.
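  • The scrambler can be sketched as below. The generator polynomial (1 + x^14 + x^15) and the initial register loading are assumptions borrowed from DVB-style BB scrambling, for illustration only; the text above states only that a feed-back shift register is used and that the sequence is synchronous with the BBF.

```python
def bb_scramble(bbf: bytes) -> bytes:
    """XOR the complete BBF with a PRBS from a 15-stage feedback shift
    register, re-initialized for every BBF (sequence synchronous with the
    BBF). Polynomial and seed are illustrative assumptions."""
    reg = [1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0]
    out = bytearray()
    for byte in bbf:
        prbs_byte = 0
        for _ in range(8):
            fb = reg[13] ^ reg[14]            # feedback tap
            prbs_byte = (prbs_byte << 1) | fb
            reg = [fb] + reg[:-1]
        out.append(byte ^ prbs_byte)
    return bytes(out)
```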
  • The PLS generation block 2020 can generate physical layer signaling (PLS) data. The PLS provides the receiver with a means to access physical layer DPs. The PLS data consists of PLS1 data and PLS2 data.
  • The PLS1 data is a first set of PLS data carried in the FSS symbols in the frame having a fixed size, coding and modulation, which carries basic information about the system as well as the parameters needed to decode the PLS2 data. The PLS1 data provides basic transmission parameters including parameters required to enable the reception and decoding of the PLS2 data. Also, the PLS1 data remains constant for the duration of a frame-group.
  • The PLS2 data is a second set of PLS data transmitted in the FSS symbol, which carries more detailed PLS data about the system and the DPs. The PLS2 contains parameters that provide sufficient information for the receiver to decode the desired DP. The PLS2 signaling further consists of two types of parameters, PLS2 Static data (PLS2-STAT data) and PLS2 dynamic data (PLS2-DYN data). The PLS2 Static data is PLS2 data that remains static for the duration of a frame-group and the PLS2 dynamic data is PLS2 data that may dynamically change frame-by-frame.
  • Details of the PLS data will be described later.
  • The PLS scrambler 2030 can scramble the generated PLS data for energy dispersal.
  • The above-described blocks may be omitted or replaced by blocks having similar or identical functions.
  • FIG. 3 illustrates an input formatting block according to another embodiment of the present invention.
  • The input formatting block illustrated in FIG. 3 corresponds to an embodiment of the input formatting block 1000 described with reference to FIG. 1.
  • FIG. 3 shows a mode adaptation block of the input formatting block when the input signal corresponds to multiple input streams.
  • The mode adaptation block of the input formatting block for processing the multiple input streams can independently process the multiple input streams.
  • Referring to FIG. 3, the mode adaptation block for respectively processing the multiple input streams can include an input stream splitter 3000, an input stream synchronizer 3010, a compensating delay block 3020, a null packet deletion block 3030, a head compression block 3040, a CRC encoder 3050, a BB frame slicer 3060 and a BB header insertion block 3070. Description will be given of each block of the mode adaptation block.
  • Operations of the CRC encoder 3050, BB frame slicer 3060 and BB header insertion block 3070 correspond to those of the CRC encoder, BB frame slicer and BB header insertion block described with reference to FIG. 2 and thus description thereof is omitted.
  • The input stream splitter 3000 can split the input TS, IP, GS streams into multiple service or service component (audio, video, etc.) streams.
  • The input stream synchronizer 3010 may be referred to as ISSY. The ISSY can provide suitable means to guarantee Constant Bit Rate (CBR) and constant end-to-end transmission delay for any input data format. The ISSY is always used for the case of multiple DPs carrying TS, and optionally used for multiple DPs carrying GS streams.
  • The compensating delay block 3020 can delay the split TS packet stream following the insertion of ISSY information to allow a TS packet recombining mechanism without requiring additional memory in the receiver.
  • The null packet deletion block 3030 is used only for the TS input stream case. Some TS input streams or split TS streams may have a large number of null packets present in order to accommodate VBR (variable bit-rate) services in a CBR TS stream. In this case, in order to avoid unnecessary transmission overhead, null packets can be identified and not transmitted. In the receiver, removed null packets can be re-inserted in the exact place where they were originally by reference to a deleted null-packet (DNP) counter that is inserted in the transmission, thus guaranteeing constant bit-rate and avoiding the need for time-stamp (PCR) updating.
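  • The deletion/re-insertion mechanism can be sketched as follows. How the DNP counter is physically carried is an assumption here (prepended to each transmitted packet); the principle is that each counter records how many nulls preceded the packet.

```python
NULL_PID = 0x1FFF  # PID of an MPEG2-TS null packet

def delete_null_packets(ts_packets):
    """Transmit side: drop null packets, carrying a deleted-null-packet
    (DNP) count with each transmitted packet (carriage format assumed)."""
    out, dnp = [], 0
    for pkt in ts_packets:
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # 13-bit PID
        if pid == NULL_PID:
            dnp += 1                             # counted, not transmitted
        else:
            out.append((dnp, pkt))
            dnp = 0
    return out

def reinsert_null_packets(received, null_pkt):
    """Receive side: restore nulls in their original places, keeping the
    bit-rate constant without PCR re-stamping."""
    restored = []
    for dnp, pkt in received:
        restored.extend([null_pkt] * dnp)
        restored.append(pkt)
    return restored
```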
  • The head compression block 3040 can provide packet header compression to increase transmission efficiency for TS or IP input streams. Because the receiver can have a priori information on certain parts of the header, this known information can be deleted in the transmitting unit.
  • For Transport Stream, the receiver has a-priori information about the sync-byte configuration (0x47) and the packet length (188 Byte). If the input TS stream carries content that has only one PID, i.e., for only one service component (video, audio, etc.) or service sub-component (SVC base layer, SVC enhancement layer, MVC base view or MVC dependent views), TS packet header compression can be applied (optionally) to the Transport Stream. IP packet header compression is used optionally if the input stream is an IP stream.
  • The above-described blocks may be omitted or replaced by blocks having similar or identical functions.
  • FIG. 4 illustrates a BICM block according to an embodiment of the present invention.
  • The BICM block illustrated in FIG. 4 corresponds to an embodiment of the BICM block 1010 described with reference to FIG. 1.
  • As described above, the apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention can provide a terrestrial broadcast service, mobile broadcast service, UHDTV service, etc.
  • Since QoS (quality of service) depends on characteristics of a service provided by the apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention, data corresponding to respective services needs to be processed through different schemes. Accordingly, the BICM block according to an embodiment of the present invention can independently process DPs input thereto by independently applying SISO, MISO and MIMO schemes to the data pipes respectively corresponding to data paths. Consequently, the apparatus for transmitting broadcast signals for future broadcast services according to an embodiment of the present invention can control QoS for each service or service component transmitted through each DP.
  • (a) shows the BICM block shared by the base profile and the handheld profile and (b) shows the BICM block of the advanced profile.
  • The BICM block shared by the base profile and the handheld profile and the BICM block of the advanced profile can include plural processing blocks for processing each DP.
  • A description will be given of each processing block of the BICM block for the base profile and the handheld profile and the BICM block for the advanced profile.
  • A processing block 5000 of the BICM block for the base profile and the handheld profile can include a Data FEC encoder 5010, a bit interleaver 5020, a constellation mapper 5030, an SSD (Signal Space Diversity) encoding block 5040 and a time interleaver 5050.
  • The Data FEC encoder 5010 can perform FEC encoding on the input BBF to generate a FECBLOCK using outer coding (BCH) and inner coding (LDPC). The outer coding (BCH) is an optional coding method. Details of operations of the Data FEC encoder 5010 will be described later.
  • The bit interleaver 5020 can interleave outputs of the Data FEC encoder 5010 to achieve optimized performance with combination of the LDPC codes and modulation scheme while providing an efficiently implementable structure. Details of operations of the bit interleaver 5020 will be described later.
  • The constellation mapper 5030 can modulate each cell word from the bit interleaver 5020 in the base and the handheld profiles, or each cell word from the Cell-word demultiplexer 5010-1 in the advanced profile, using either QPSK, QAM-16, non-uniform QAM (NUQ-64, NUQ-256, NUQ-1024) or non-uniform constellation (NUC-16, NUC-64, NUC-256, NUC-1024) to give a power-normalized constellation point, e_l. This constellation mapping is applied only for DPs. Observe that QAM-16 and NUQs are square shaped, while NUCs have arbitrary shapes. When each constellation is rotated by any multiple of 90 degrees, the rotated constellation exactly overlaps with its original one. This “rotation-sense” symmetric property makes the capacities and the average powers of the real and imaginary components equal to each other. Both NUQs and NUCs are defined specifically for each code rate, and the particular one used is signaled by the parameter DP_MOD field in the PLS2 data.
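  • As a minimal illustration of power-normalized mapping, a QPSK mapper is sketched below (Gray labeling is assumed); the NUQ/NUC constellation tables themselves are defined per code rate and are not reproduced here.

```python
import numpy as np

def qpsk_map(bits: np.ndarray) -> np.ndarray:
    """Map bit pairs to unit-average-power QPSK points e_l.
    Gray labeling is an illustrative assumption."""
    pairs = bits.reshape(-1, 2)
    i = 1 - 2 * pairs[:, 0]            # bit 0 -> +1, bit 1 -> -1
    q = 1 - 2 * pairs[:, 1]
    return (i + 1j * q) / np.sqrt(2)   # normalize average power to 1
```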
  • The time interleaver 5050 operates at the DP level. The parameters of time interleaving (TI) may be set differently for each DP. Details of operations of the time interleaver 5050 will be described later.
  • A processing block 5000-1 of the BICM block for the advanced profile can include the Data FEC encoder, bit interleaver, constellation mapper, and time interleaver. However, the processing block 5000-1 is distinguished from the processing block 5000 in that it further includes a cell-word demultiplexer 5010-1 and a MIMO encoding block 5020-1.
  • Also, the operations of the Data FEC encoder, bit interleaver, constellation mapper, and time interleaver in the processing block 5000-1 correspond to those of the Data FEC encoder 5010, bit interleaver 5020, constellation mapper 5030, and time interleaver 5050 described above, and thus description thereof is omitted.
  • The cell-word demultiplexer 5010-1 is used for the DP of the advanced profile to divide the single cell-word stream into dual cell-word streams for MIMO processing. Details of operations of the cell-word demultiplexer 5010-1 will be described later.
  • The MIMO encoding block 5020-1 can process the output of the cell-word demultiplexer 5010-1 using a MIMO encoding scheme. The MIMO encoding scheme is optimized for broadcast signal transmission. MIMO technology is a promising way to obtain a capacity increase, but it depends on channel characteristics. Especially for broadcasting, the strong LOS component of the channel or a difference in the received signal power between two antennas caused by different signal propagation characteristics makes it difficult to get capacity gain from MIMO. The proposed MIMO encoding scheme overcomes this problem using rotation-based pre-coding and phase randomization of one of the MIMO output signals.
  • MIMO encoding is intended for a 2×2 MIMO system requiring at least two antennas at both the transmitter and the receiver. Two MIMO encoding modes are defined in this proposal; full-rate spatial multiplexing (FR-SM) and full-rate full-diversity spatial multiplexing (FRFD-SM). The FR-SM encoding provides capacity increase with relatively small complexity increase at the receiver side while the FRFD-SM encoding provides capacity increase and additional diversity gain with a great complexity increase at the receiver side. The proposed MIMO encoding scheme has no restriction on the antenna polarity configuration.
  • MIMO processing is required for the advanced profile frame, which means all DPs in the advanced profile frame are processed by the MIMO encoder. MIMO processing is applied at the DP level. Pairs of the Constellation Mapper outputs, NUQ (e_{1,i} and e_{2,i}), are fed to the input of the MIMO Encoder. The paired MIMO Encoder output (g_{1,i} and g_{2,i}) is transmitted by the same carrier k and OFDM symbol l of their respective TX antennas.
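  • A sketch of the rotation-based pre-coding with phase randomization of one output is given below; the rotation angle and the phase-hopping sequence are illustrative assumptions, not values specified by the present document.

```python
import numpy as np

def fr_sm_encode(e1: np.ndarray, e2: np.ndarray, theta: float = np.pi / 8):
    """Pair-wise MIMO pre-coding sketch: rotate the constellation pair
    (e_{1,i}, e_{2,i}) and phase-randomize the second output (g_{2,i})
    to decorrelate the two transmit paths. theta and the hopping period
    are assumptions."""
    c, s = np.cos(theta), np.sin(theta)
    g1 = c * e1 - s * e2
    g2 = s * e1 + c * e2
    hop = np.exp(1j * 2 * np.pi * np.arange(len(g2)) / 9)  # assumed sequence
    return g1, g2 * hop
```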
  • The above-described blocks may be omitted or replaced by blocks having similar or identical functions.
  • FIG. 5 illustrates a BICM block according to another embodiment of the present invention.
  • The BICM block illustrated in FIG. 5 corresponds to an embodiment of the BICM block 1010 described with reference to FIG. 1.
  • FIG. 5 illustrates a BICM block for protection of physical layer signaling (PLS), emergency alert channel (EAC) and fast information channel (FIC). EAC is a part of a frame that carries EAS information data and FIC is a logical channel in a frame that carries the mapping information between a service and the corresponding base DP. Details of the EAC and FIC will be described later.
  • Referring to FIG. 5, the BICM block for protection of PLS, EAC and FIC can include a PLS FEC encoder 6000, a bit interleaver 6010 and a constellation mapper 6020.
  • Also, the PLS FEC encoder 6000 can include a scrambler, BCH encoding/zero insertion block, LDPC encoding block and LDPC parity puncturing block. Description will be given of each block of the BICM block.
  • The PLS FEC encoder 6000 can encode the scrambled PLS 1/2 data, EAC and FIC section.
  • The scrambler can scramble PLS1 data and PLS2 data before BCH encoding and shortened and punctured LDPC encoding.
  • The BCH encoding/zero insertion block can perform outer encoding on the scrambled PLS 1/2 data using the shortened BCH code for PLS protection and insert zero bits after the BCH encoding. For PLS1 data only, the output bits of the zero insertion may be permuted before LDPC encoding.
  • The LDPC encoding block can encode the output of the BCH encoding/zero insertion block using LDPC code. To generate a complete coded block, Cldpc, parity bits, Pldpc are encoded systematically from each zero-inserted PLS information block, Ildpc and appended after it.

  • $C_{\mathrm{ldpc}} = [I_{\mathrm{ldpc}}\ P_{\mathrm{ldpc}}] = [i_0, i_1, \ldots, i_{K_{\mathrm{ldpc}}-1}, p_0, p_1, \ldots, p_{N_{\mathrm{ldpc}}-K_{\mathrm{ldpc}}-1}]$  [Math Figure 1]
  • The LDPC code parameters for PLS1 and PLS2 are as shown in table 4 below.
  • TABLE 4
    Signaling Type   Ksig    Kbch  Nbch_parity  Kldpc (=Nbch)  Nldpc  Nldpc_parity  Code rate  Qldpc
    PLS1             342     1020  60           1080           4320   3240          1/4        36
    PLS2             <1021   1020  60           1080           4320   3240          1/4        36
                     >1020   2100  60           2160           7200   5040          3/10       56
  • The LDPC parity puncturing block can perform puncturing on the PLS1 data and PLS2 data.
  • When shortening is applied to the PLS1 data protection, some LDPC parity bits are punctured after LDPC encoding. Also, for the PLS2 data protection, the LDPC parity bits of PLS2 are punctured after LDPC encoding. These punctured bits are not transmitted.
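  • The shortening/puncturing chain can be summarized by the sketch below; `encode_parity` stands in for the systematic LDPC parity computation of Math Figure 1 and is a placeholder, as are the exact shortening and puncturing amounts.

```python
def pls_fec_encode(info_bits, k_ldpc, encode_parity, n_punct):
    """Shortened and punctured PLS protection (sketch):
    zero bits pad the BCH-encoded information up to K_ldpc, parity P_ldpc
    is computed systematically and appended (C_ldpc = [I_ldpc P_ldpc]),
    then the shortening zeros are removed and n_punct parity bits are
    punctured, i.e. not transmitted."""
    shortened = info_bits + [0] * (k_ldpc - len(info_bits))  # zero insertion
    parity = encode_parity(shortened)                        # P_ldpc
    tx = info_bits + parity[: len(parity) - n_punct]         # shorten + puncture
    return tx
```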
  • The bit interleaver 6010 can interleave each shortened and punctured PLS1 data and PLS2 data.
  • The constellation mapper 6020 can map the bit interleaved PLS1 data and PLS2 data onto constellations.
  • The above-described blocks may be omitted or replaced by blocks having similar or identical functions.
  • FIG. 6 illustrates a frame building block according to one embodiment of the present invention.
  • The frame building block illustrated in FIG. 6 corresponds to an embodiment of the frame building block 1020 described with reference to FIG. 1.
  • Referring to FIG. 6, the frame building block can include a delay compensation block 7000, a cell mapper 7010 and a frequency interleaver 7020. Description will be given of each block of the frame building block.
  • The delay compensation block 7000 can adjust the timing between the data pipes and the corresponding PLS data to ensure that they are co-timed at the transmitter end. The PLS data is delayed by the same amount as the data pipes, to account for the delays of the data pipes caused by the Input Formatting block and the BICM block. The delay of the BICM block is mainly due to the time interleaver. In-band signaling data carries information of the next TI group, so it is carried one frame ahead of the DPs to be signaled. The delay compensation block delays in-band signaling data accordingly.
  • The cell mapper 7010 can map PLS, EAC, FIC, DPs, auxiliary streams and dummy cells into the active carriers of the OFDM symbols in the frame. The basic function of the cell mapper 7010 is to map data cells produced by the TIs for each of the DPs, PLS cells, and EAC/FIC cells, if any, into arrays of active OFDM cells corresponding to each of the OFDM symbols within a frame. Service signaling data (such as PSI (program specific information)/SI) can be separately gathered and sent by a data pipe. The Cell Mapper operates according to the dynamic information produced by the scheduler and the configuration of the frame structure. Details of the frame will be described later.
  • The frequency interleaver 7020 can randomly interleave data cells received from the cell mapper 7010 to provide frequency diversity. Also, the frequency interleaver 7020 can operate on every OFDM symbol pair comprised of two sequential OFDM symbols using a different interleaving-seed order to get maximum interleaving gain in a single frame. Details of operations of the frequency interleaver 7020 will be described later.
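  • The pair-wise operation can be sketched as follows; `perms` is a hypothetical table of interleaving-address permutations, one per symbol pair, standing in for the address generator defined by the system.

```python
import numpy as np

def frequency_interleave(symbols, perms):
    """Interleave data cells of every pair of sequential OFDM symbols,
    using a different (assumed) permutation per pair for maximum
    interleaving gain within a frame. 'symbols' is a list of equal-length
    numpy arrays of cells."""
    out = []
    for p in range(0, len(symbols) - 1, 2):
        perm = perms[(p // 2) % len(perms)]
        out.append(symbols[p][perm])       # first symbol of the pair
        out.append(symbols[p + 1][perm])   # second symbol of the pair
    return out
```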
  • The above-described blocks may be omitted or replaced by blocks having similar or identical functions.
  • FIG. 7 illustrates an OFDM generation block according to an embodiment of the present invention.
  • The OFDM generation block illustrated in FIG. 7 corresponds to an embodiment of the OFDM generation block 1030 described with reference to FIG. 1.
  • The OFDM generation block modulates the OFDM carriers by the cells produced by the Frame Building block, inserts the pilots, and produces the time domain signal for transmission. Also, this block subsequently inserts guard intervals, and applies PAPR (Peak-to-Average Power Ratio) reduction processing to produce the final RF signal. Referring to FIG. 7, the OFDM generation block can include a pilot and reserved tone insertion block 8000, a 2D-eSFN encoding block 8010, an IFFT (Inverse Fast Fourier Transform) block 8020, a PAPR reduction block 8030, a guard interval insertion block 8040, a preamble insertion block 8050, an other-system insertion block 8060 and a DAC block 8070. Description will be given of each block of the OFDM generation block.
  • The other system insertion block 8060 can multiplex signals of a plurality of broadcast transmission/reception systems in the time domain such that data of two or more different broadcast transmission/reception systems providing broadcast services can be simultaneously transmitted in the same RF signal bandwidth. In this case, the two or more different broadcast transmission/reception systems refer to systems providing different broadcast services. The different broadcast services may refer to a terrestrial broadcast service, mobile broadcast service, etc.
  • FIG. 8 illustrates a structure of an apparatus for receiving broadcast signals for future broadcast services according to an embodiment of the present invention.
  • The apparatus for receiving broadcast signals for future broadcast services according to an embodiment of the present invention can correspond to the apparatus for transmitting broadcast signals for future broadcast services, described with reference to FIG. 1.
  • The apparatus for receiving broadcast signals for future broadcast services according to an embodiment of the present invention can include a synchronization & demodulation module 9000, a frame parsing module 9010, a demapping & decoding module 9020, an output processor 9030 and a signaling decoding module 9040. A description will be given of operation of each module of the apparatus for receiving broadcast signals.
  • The synchronization & demodulation module 9000 can receive input signals through m Rx antennas, perform signal detection and synchronization with respect to a system corresponding to the apparatus for receiving broadcast signals and carry out demodulation corresponding to a reverse procedure of the procedure performed by the apparatus for transmitting broadcast signals.
  • The frame parsing module 9010 can parse input signal frames and extract data through which a service selected by a user is transmitted. If the apparatus for transmitting broadcast signals performs interleaving, the frame parsing module 9010 can carry out deinterleaving corresponding to a reverse procedure of interleaving. In this case, the positions of a signal and data that need to be extracted can be obtained by decoding data output from the signaling decoding module 9040 to restore scheduling information generated by the apparatus for transmitting broadcast signals.
  • The demapping & decoding module 9020 can convert the input signals into bit domain data and then deinterleave the same as necessary. The demapping & decoding module 9020 can perform demapping for mapping applied for transmission efficiency and correct an error generated on a transmission channel through decoding. In this case, the demapping & decoding module 9020 can obtain transmission parameters necessary for demapping and decoding by decoding the data output from the signaling decoding module 9040.
  • The output processor 9030 can perform reverse procedures of various compression/signal processing procedures which are applied by the apparatus for transmitting broadcast signals to improve transmission efficiency. In this case, the output processor 9030 can acquire necessary control information from data output from the signaling decoding module 9040. The output of the output processor 9030 corresponds to a signal input to the apparatus for transmitting broadcast signals and may be MPEG-TSs, IP streams (v4 or v6) and generic streams.
  • The signaling decoding module 9040 can obtain PLS information from the signal demodulated by the synchronization & demodulation module 9000. As described above, the frame parsing module 9010, demapping & decoding module 9020 and output processor 9030 can execute functions thereof using the data output from the signaling decoding module 9040.
  • FIG. 9 illustrates a frame structure according to an embodiment of the present invention.
  • FIG. 9 shows an example configuration of the frame types and FRUs in a super-frame. (a) shows a super frame according to an embodiment of the present invention, (b) shows FRU (Frame Repetition Unit) according to an embodiment of the present invention, (c) shows frames of variable PHY profiles in the FRU and (d) shows a structure of a frame.
  • A super-frame may be composed of eight FRUs. The FRU is a basic multiplexing unit for TDM of the frames, and is repeated eight times in a super-frame.
  • Each frame in the FRU belongs to one of the PHY profiles, (base, handheld, advanced) or FEF. The maximum allowed number of the frames in the FRU is four and a given PHY profile can appear any number of times from zero times to four times in the FRU (e.g., base, base, handheld, advanced). PHY profile definitions can be extended using reserved values of the PHY_PROFILE in the preamble, if required.
  • The FEF part is inserted at the end of the FRU, if included. When the FEF is included in the FRU, the minimum number of FEFs is 8 in a super-frame. It is not recommended that FEF parts be adjacent to each other.
  • One frame is further divided into a number of OFDM symbols and a preamble. As shown in (d), the frame comprises a preamble, one or more frame signaling symbols (FSS), normal data symbols and a frame edge symbol (FES).
  • The preamble is a special symbol that enables fast Futurecast UTB system signal detection and provides a set of basic transmission parameters for efficient transmission and reception of the signal. The detailed description of the preamble will be given later.
  • The main purpose of the FSS(s) is to carry the PLS data. For fast synchronization and channel estimation, and hence fast decoding of PLS data, the FSS has a denser pilot pattern than the normal data symbol. The FES has exactly the same pilots as the FSS, which enables frequency-only interpolation within the FES and temporal interpolation, without extrapolation, for symbols immediately preceding the FES.
  • FIG. 10 illustrates a signaling hierarchy structure of the frame according to an embodiment of the present invention.
  • FIG. 10 illustrates the signaling hierarchy structure, which is split into three main parts: the preamble signaling data 11000, the PLS1 data 11010 and the PLS2 data 11020. The purpose of the preamble, which is carried by the preamble symbol in every frame, is to indicate the transmission type and basic transmission parameters of that frame. The PLS1 enables the receiver to access and decode the PLS2 data, which contains the parameters to access the DP of interest. The PLS2 is carried in every frame and split into two main parts: PLS2-STAT data and PLS2-DYN data. The static and dynamic portion of PLS2 data is followed by padding, if necessary.
  • FIG. 11 illustrates preamble signaling data according to an embodiment of the present invention.
  • Preamble signaling data carries 21 bits of information that are needed to enable the receiver to access PLS data and trace DPs within the frame structure. Details of the preamble signaling data are as follows (a parsing sketch is given after the field list):
  • PHY_PROFILE: This 3-bit field indicates the PHY profile type of the current frame. The mapping of different PHY profile types is given in below table 5.
  • TABLE 5
    Value PHY Profile
    000 Base profile
    001 Handheld profile
    010 Advanced profile
    011~110 Reserved
    111 FEF
  • FFT_SIZE: This 2 bit field indicates the FFT size of the current frame within a frame-group, as described in below table 6.
  • TABLE 6
    Value FFT size
    00 8K FFT
    01 16K FFT
    10 32K FFT
    11 Reserved
  • GI_FRACTION: This 3 bit field indicates the guard interval fraction value in the current super-frame, as described in below table 7.
  • TABLE 7
    Value GI_FRACTION
    000 1/5
    001 1/10
    010 1/20
    011 1/40
    100 1/80
    101 1/160
    110~111 Reserved
  • EAC_FLAG: This 1-bit field indicates whether the EAC is provided in the current frame. If this field is set to ‘1’, emergency alert service (EAS) is provided in the current frame. If this field is set to ‘0’, EAS is not carried in the current frame. This field can be switched dynamically within a super-frame.
  • PILOT_MODE: This 1-bit field indicates whether the pilot mode is mobile mode or fixed mode for the current frame in the current frame-group. If this field is set to ‘0’, mobile pilot mode is used. If the field is set to ‘1’, the fixed pilot mode is used.
  • PAPR_FLAG: This 1-bit field indicates whether PAPR reduction is used for the current frame in the current frame-group. If this field is set to value ‘1’, tone reservation is used for PAPR reduction. If this field is set to ‘0’, PAPR reduction is not used.
  • FRU_CONFIGURE: This 3-bit field indicates the PHY profile type configurations of the frame repetition units (FRU) that are present in the current super-frame. All profile types conveyed in the current super-frame are identified in this field in all preambles in the current super-frame. The 3-bit field has a different definition for each profile, as shown in below table 8.
  • TABLE 8
    FRU_CONFIGURE   PHY_PROFILE = ‘000’ (base)   PHY_PROFILE = ‘001’ (handheld)   PHY_PROFILE = ‘010’ (advanced)   PHY_PROFILE = ‘111’ (FEF)
    000             Only base profile present    Only handheld profile present    Only advanced profile present    Only FEF present
    1XX             Handheld profile present     Base profile present             Base profile present             Base profile present
    X1X             Advanced profile present     Advanced profile present         Handheld profile present         Handheld profile present
    XX1             FEF present                  FEF present                      FEF present                      Advanced profile present
  • RESERVED: This 7-bit field is reserved for future use.
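  • Taken together, the fields above occupy 3+2+3+1+1+1+3+7 = 21 bits. A minimal parsing sketch is given below, assuming the fields are packed MSB-first in the order listed (the packing order is an assumption for illustration).

```python
def parse_preamble(bits21: int) -> dict:
    """Split the 21 preamble signaling bits into the fields listed above,
    assuming MSB-first packing in list order."""
    layout = [("PHY_PROFILE", 3), ("FFT_SIZE", 2), ("GI_FRACTION", 3),
              ("EAC_FLAG", 1), ("PILOT_MODE", 1), ("PAPR_FLAG", 1),
              ("FRU_CONFIGURE", 3), ("RESERVED", 7)]
    fields, shift = {}, 21
    for name, width in layout:
        shift -= width
        fields[name] = (bits21 >> shift) & ((1 << width) - 1)
    return fields
```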
  • FIG. 12 illustrates PLS1 data according to an embodiment of the present invention.
  • PLS1 data provides basic transmission parameters including parameters required to enable the reception and decoding of the PLS2. As mentioned above, the PLS1 data remains unchanged for the entire duration of one frame-group. The detailed definitions of the signaling fields of the PLS1 data are as follows:
  • PREAMBLE_DATA: This 20-bit field is a copy of the preamble signaling data excluding the EAC_FLAG.
  • NUM_FRAME_FRU: This 2-bit field indicates the number of the frames per FRU.
  • PAYLOAD_TYPE: This 3-bit field indicates the format of the payload data carried in the frame-group. PAYLOAD_TYPE is signaled as shown in table 9.
  • TABLE 9
    value Payload type
    1XX TS stream is transmitted
    X1X IP stream is transmitted
    XX1 GS stream is transmitted
  • NUM_FSS: This 2-bit field indicates the number of FSS symbols in the current frame.
  • SYSTEM_VERSION: This 8-bit field indicates the version of the transmitted signal format. The SYSTEM_VERSION is divided into two 4-bit fields, which are a major version and a minor version.
  • Major version: The MSB four bits of SYSTEM_VERSION field indicate major version information. A change in the major version field indicates a non-backward-compatible change. The default value is ‘0000’. For the version described in this standard, the value is set to ‘0000’.
  • Minor version: The LSB four bits of SYSTEM_VERSION field indicate minor version information. A change in the minor version field is backward-compatible.
  • CELL_ID: This is a 16-bit field which uniquely identifies a geographic cell in an ATSC network. An ATSC cell coverage area may consist of one or more frequencies, depending on the number of frequencies used per Futurecast UTB system. If the value of the CELL_ID is not known or unspecified, this field is set to ‘0’.
  • NETWORK_ID: This is a 16-bit field which uniquely identifies the current ATSC network.
  • SYSTEM_ID: This 16-bit field uniquely identifies the Futurecast UTB system within the ATSC network. The Futurecast UTB system is the terrestrial broadcast system whose input is one or more input streams (TS, IP, GS) and whose output is an RF signal. The Futurecast UTB system carries one or more PHY profiles and FEF, if any. The same Futurecast UTB system may carry different input streams and use different RF frequencies in different geographical areas, allowing local service insertion. The frame structure and scheduling is controlled in one place and is identical for all transmissions within a Futurecast UTB system. One or more Futurecast UTB systems may have the same SYSTEM_ID meaning that they all have the same physical layer structure and configuration.
  • The following loop consists of FRU_PHY_PROFILE, FRU_FRAME_LENGTH, FRU_GI_FRACTION, and RESERVED, which are used to indicate the FRU configuration and the length of each frame type. The loop size is fixed so that four PHY profiles (including a FEF) are signaled within the FRU. If NUM_FRAME_FRU is less than 4, the unused fields are filled with zeros.
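  • The fixed-size loop can be sketched as follows (field encodings are left symbolic; the zero-filling of unused entries is per the text above).

```python
def pack_fru_loop(frames):
    """Always emit four loop entries (FRU_PHY_PROFILE, FRU_FRAME_LENGTH,
    FRU_GI_FRACTION, RESERVED); entries beyond NUM_FRAME_FRU are
    zero-filled. Each frame is a dict with already-encoded field values."""
    entries = [(f["phy_profile"], f["frame_length"], f["gi_fraction"], 0)
               for f in frames[:4]]
    entries += [(0, 0, 0, 0)] * (4 - len(entries))
    return entries
```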
  • FRU_PHY_PROFILE: This 3-bit field indicates the PHY profile type of the (i+1)th (i is the loop index) frame of the associated FRU. This field uses the same signaling format as shown in the table 8.
  • FRU_FRAME_LENGTH: This 2-bit field indicates the length of the (i+1)th frame of the associated FRU. Using FRU_FRAME_LENGTH together with FRU_GI_FRACTION, the exact value of the frame duration can be obtained.
  • FRU_GI_FRACTION: This 3-bit field indicates the guard interval fraction value of the (i+1)th frame of the associated FRU. FRU_GI_FRACTION is signaled according to the table 7.
  • RESERVED: This 4-bit field is reserved for future use.
  • The following fields provide parameters for decoding the PLS2 data.
  • PLS2_FEC_TYPE: This 2-bit field indicates the FEC type used by the PLS2 protection. The FEC type is signaled according to table 10. The details of the LDPC codes will be described later.
  • TABLE 10
    Content PLS2 FEC type
    00 4K-1/4 and 7K-3/10 LDPC codes
    01~11 Reserved
  • PLS2_MOD: This 3-bit field indicates the modulation type used by the PLS2. The modulation type is signaled according to table 11.
  • TABLE 11
    Value PLS2_MOD
    000 BPSK
    001 QPSK
    010 QAM-16
    011 NUQ-64
    100~111 Reserved
  • PLS2_SIZE_CELL: This 15-bit field indicates Ctotal_full_block, the size (specified as the number of QAM cells) of the collection of full coded blocks for PLS2 that is carried in the current frame-group. This value is constant during the entire duration of the current frame-group.
  • PLS2_STAT_SIZE_BIT: This 14-bit field indicates the size, in bits, of the PLS2-STAT for the current frame-group. This value is constant during the entire duration of the current frame-group.
  • PLS2_DYN_SIZE_BIT: This 14-bit field indicates the size, in bits, of the PLS2-DYN for the current frame-group. This value is constant during the entire duration of the current frame-group.
  • PLS2_REP_FLAG: This 1-bit flag indicates whether the PLS2 repetition mode is used in the current frame-group. When this field is set to value ‘1’, the PLS2 repetition mode is activated. When this field is set to value ‘0’, the PLS2 repetition mode is deactivated.
  • PLS2_REP_SIZE_CELL: This 15-bit field indicates Ctotal_partial_block, the size (specified as the number of QAM cells) of the collection of partial coded blocks for PLS2 carried in every frame of the current frame-group, when PLS2 repetition is used. If repetition is not used, the value of this field is equal to 0. This value is constant during the entire duration of the current frame-group.
  • PLS2_NEXT_FEC_TYPE: This 2-bit field indicates the FEC type used for PLS2 that is carried in every frame of the next frame-group. The FEC type is signaled according to the table 10.
  • PLS2_NEXT_MOD: This 3-bit field indicates the modulation type used for PLS2 that is carried in every frame of the next frame-group. The modulation type is signaled according to the table 11.
  • PLS2_NEXT_REP_FLAG: This 1-bit flag indicates whether the PLS2 repetition mode is used in the next frame-group. When this field is set to value ‘1’, the PLS2 repetition mode is activated. When this field is set to value ‘0’, the PLS2 repetition mode is deactivated.
  • PLS2_NEXT_REP_SIZE_CELL: This 15-bit field indicates Ctotal_full_block, the size (specified as the number of QAM cells) of the collection of full coded blocks for PLS2 that is carried in every frame of the next frame-group, when PLS2 repetition is used. If repetition is not used in the next frame-group, the value of this field is equal to 0. This value is constant during the entire duration of the current frame-group.
  • PLS2_NEXT_REP_STAT_SIZE_BIT: This 14-bit field indicates the size, in bits, of the PLS2-STAT for the next frame-group. This value is constant in the current frame-group.
  • PLS2_NEXT_REP_DYN_SIZE_BIT: This 14-bit field indicates the size, in bits, of the PLS2-DYN for the next frame-group. This value is constant in the current frame-group.
  • PLS2_AP_MODE: This 2-bit field indicates whether additional parity is provided for PLS2 in the current frame-group. This value is constant during the entire duration of the current frame-group. The below table 12 gives the values of this field. When this field is set to ‘00’, additional parity is not used for the PLS2 in the current frame-group.
  • TABLE 12
    Value PLS2-AP mode
    00 AP is not provided
    01 AP1 mode
    10~11 Reserved
  • PLS2_AP_SIZE_CELL: This 15-bit field indicates the size (specified as the number of QAM cells) of the additional parity bits of the PLS2. This value is constant during the entire duration of the current frame-group.
  • PLS2_NEXT_AP_MODE: This 2-bit field indicates whether additional parity is provided for PLS2 signaling in every frame of the next frame-group. This value is constant during the entire duration of the current frame-group. The table 12 defines the values of this field.
  • PLS2_NEXT_AP_SIZE_CELL: This 15-bit field indicates the size (specified as the number of QAM cells) of the additional parity bits of the PLS2 in every frame of the next frame-group. This value is constant during the entire duration of the current frame-group.
  • RESERVED: This 32-bit field is reserved for future use.
  • CRC_32: A 32-bit error detection code, which is applied to the entire PLS1 signaling.
  • FIG. 13 illustrates PLS2 data according to an embodiment of the present invention.
  • FIG. 13 illustrates PLS2-STAT data of the PLS2 data. The PLS2-STAT data are the same within a frame-group, while the PLS2-DYN data provide information that is specific for the current frame.
  • The details of fields of the PLS2-STAT data are as follows:
  • FIC_FLAG: This 1-bit field indicates whether the FIC is used in the current frame-group. If this field is set to ‘1’, the FIC is provided in the current frame. If this field is set to ‘0’, the FIC is not carried in the current frame. This value is constant during the entire duration of the current frame-group.
  • AUX_FLAG: This 1-bit field indicates whether the auxiliary stream(s) is used in the current frame-group. If this field is set to ‘1’, the auxiliary stream is provided in the current frame. If this field is set to ‘0’, the auxiliary stream is not carried in the current frame. This value is constant during the entire duration of the current frame-group.
  • NUM_DP: This 6-bit field indicates the number of DPs carried within the current frame. The number of DPs ranges from 1 to 64 and equals NUM_DP+1, so the value of this field ranges from 0 to 63.
  • DP_ID: This 6-bit field identifies uniquely a DP within a PHY profile.
  • DP_TYPE: This 3-bit field indicates the type of the DP. This is signaled according to the below table 13.
  • TABLE 13
    Value DP Type
    000 DP Type 1
    001 DP Type 2
    010~111 reserved
  • DP_GROUP_ID: This 8-bit field identifies the DP group with which the current DP is associated. This can be used by a receiver to access the DPs of the service components associated with a particular service, which will have the same DP_GROUP_ID.
  • BASE_DP_ID: This 6-bit field indicates the DP carrying service signaling data (such as PSI/SI) used in the Management layer. The DP indicated by BASE_DP_ID may be either a normal DP carrying the service signaling data along with the service data or a dedicated DP carrying only the service signaling data
  • DP_FEC_TYPE: This 2-bit field indicates the FEC type used by the associated DP. The FEC type is signaled according to the below table 14.
  • TABLE 14
    Value FEC_TYPE
    00 16K LDPC
    01 64K LDPC
    10~11 Reserved
  • DP_COD: This 4-bit field indicates the code rate used by the associated DP. The code rate is signaled according to the below table 15.
  • TABLE 15
    Value Code rate
    0000 5/15
    0001 6/15
    0010 7/15
    0011 8/15
    0100 9/15
    0101 10/15 
    0110 11/15 
    0111 12/15 
    1000 13/15 
    1001~1111 Reserved
  • DP_MOD: This 4-bit field indicates the modulation used by the associated DP. The modulation is signaled according to the below table 16.
  • TABLE 16
    Value Modulation
    0000 QPSK
    0001 QAM-16
    0010 NUQ-64
    0011 NUQ-256
    0100 NUQ-1024
    0101 NUC-16
    0110 NUC-64
    0111 NUC-256
    1000 NUC-1024
    1001~1111 reserved
  • DP_SSD_FLAG: This 1-bit field indicates whether the SSD mode is used in the associated DP. If this field is set to value ‘1’, SSD is used. If this field is set to value ‘0’, SSD is not used.
  • The following field appears only if PHY_PROFILE is equal to ‘010’, which indicates the advanced profile:
  • DP_MIMO: This 3-bit field indicates which type of MIMO encoding process is applied to the associated DP. The type of MIMO encoding process is signaled according to the table 17.
  • TABLE 17
    Value MIMO encoding
    000 FR-SM
    001 FRFD-SM
    010~111 reserved
  • DP_TI_TYPE: This 1-bit field indicates the type of time-interleaving. A value of ‘0’ indicates that one TI group corresponds to one frame and contains one or more TI-blocks. A value of ‘1’ indicates that one TI group is carried in more than one frame and contains only one TI-block.
  • DP_TI_LENGTH: The use of this 2-bit field (the allowed values are only 1, 2, 4, 8) is determined by the values set within the DP_TI_TYPE field as follows:
  • If the DP_TI_TYPE is set to the value ‘1’, this field indicates PI, the number of the frames to which each TI group is mapped, and there is one TI-block per TI group (NTI=1). The allowed PI values for this 2-bit field are defined in the below table 18.
  • If the DP_TI_TYPE is set to the value ‘0’, this field indicates the number of TI-blocks NTI per TI group, and there is one TI group per frame (PI=1). The allowed NTI values for this 2-bit field are defined in the below table 18 (a small decoding sketch follows the table).
  • TABLE 18
    2-bit field PI NTI
    00 1 1
    01 2 2
    10 4 3
    11 8 4
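  • As an illustration, the mapping of table 18 can be expressed as a small decoder. The following Python sketch is hypothetical (the function and map names are not part of the signaling syntax) and simply restates the table:
    # Table 18: the 2-bit DP_TI_LENGTH field maps to PI or NTI
    # depending on DP_TI_TYPE.
    PI_MAP = {0b00: 1, 0b01: 2, 0b10: 4, 0b11: 8}   # frames per TI group
    NTI_MAP = {0b00: 1, 0b01: 2, 0b10: 3, 0b11: 4}  # TI blocks per TI group

    def decode_dp_ti_length(dp_ti_type, dp_ti_length_bits):
        """Return (PI, NTI) for the given DP_TI_TYPE and 2-bit DP_TI_LENGTH."""
        if dp_ti_type == 1:
            # One TI block per TI group, spread over PI frames.
            return PI_MAP[dp_ti_length_bits], 1
        # One TI group per frame, containing NTI TI blocks.
        return 1, NTI_MAP[dp_ti_length_bits]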
  • DP_FRAME_INTERVAL: This 2-bit field indicates the frame interval (IJUMP) within the frame-group for the associated DP and the allowed values are 1, 2, 4, 8 (the corresponding 2-bit field is ‘00’, ‘01’, ‘10’, or ‘11’, respectively). For DPs that do not appear in every frame of the frame-group, the value of this field is equal to the interval between successive frames. For example, if a DP appears on the frames 1, 5, 9, 13, etc., this field is set to ‘4’. For DPs that appear in every frame, this field is set to ‘1’.
  • DP_TI_BYPASS: This 1-bit field determines the availability of the time interleaver. If time interleaving is not used for a DP, it is set to ‘1’. If time interleaving is used, it is set to ‘0’.
  • DP_FIRST_FRAME_IDX: This 5-bit field indicates the index of the first frame of the super-frame in which the current DP occurs. The value of DP_FIRST_FRAME_IDX ranges from 0 to 31.
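  • Taken together, DP_FIRST_FRAME_IDX and DP_FRAME_INTERVAL determine the frames in which a DP occurs. A minimal illustrative sketch (hypothetical helper name) of that rule:
    def dp_occurs_in_frame(frame_idx, first_frame_idx, frame_interval):
        # A DP occurs on first_frame_idx and then every frame_interval frames,
        # e.g. first_frame_idx = 1, frame_interval = 4 -> frames 1, 5, 9, 13, ...
        if frame_idx < first_frame_idx:
            return False
        return (frame_idx - first_frame_idx) % frame_interval == 0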
  • DP_NUM_BLOCK_MAX: This 10-bit field indicates the maximum value of DP_NUM_BLOCKS for this DP. The value of this field has the same range as DP_NUM_BLOCKS.
  • DP_PAYLOAD_TYPE: This 2-bit field indicates the type of the payload data carried by the given DP. DP_PAYLOAD_TYPE is signaled according to the below table 19.
  • TABLE 19
    Value Payload Type
    00 TS
    01 IP
    10 GS
    11 reserved
  • DP_INBAND_MODE: This 2-bit field indicates whether the current DP carries in-band signaling information. The in-band signaling type is signaled according to the below table 20.
  • TABLE 20
    Value In-band mode
    00 In-band signaling is not carried.
    01 INBAND-PLS is carried only
    10 INBAND-ISSY is carried only
    11 INBAND-PLS and INBAND-ISSY are carried
  • DP_PROTOCOL_TYPE: This 2-bit field indicates the protocol type of the payload carried by the given DP. It is signaled according to the below table 21 when input payload types are selected.
  • TABLE 21
    Value   If DP_PAYLOAD_TYPE is TS   If DP_PAYLOAD_TYPE is IP   If DP_PAYLOAD_TYPE is GS
    00      MPEG2-TS                   IPv4                       (Note)
    01      Reserved                   IPv6                       Reserved
    10      Reserved                   Reserved                   Reserved
    11      Reserved                   Reserved                   Reserved
  • DP_CRC_MODE: This 2-bit field indicates whether CRC encoding is used in the Input Formatting block. The CRC mode is signaled according to the below table 22.
  • TABLE 22
    Value CRC mode
    00 Not used
    01 CRC-8
    10 CRC-16
    11 CRC-32
  • DNP_MODE: This 2-bit field indicates the null-packet deletion mode used by the associated DP when DP_PAYLOAD_TYPE is set to TS (‘00’). DNP_MODE is signaled according to the below table 23. If DP_PAYLOAD_TYPE is not TS (‘00’), DNP_MODE is set to the value ‘00’.
  • TABLE 23
    Value Null-packet deletion mode
    00 Not used
    01 DNP-NORMAL
    10 DNP-OFFSET
    11 reserved
  • ISSY_MODE: This 2-bit field indicates the ISSY mode used by the associated DP when DP_PAYLOAD_TYPE is set to TS (‘00’). The ISSY_MODE is signaled according to the below table 24. If DP_PAYLOAD_TYPE is not TS (‘00’), ISSY_MODE is set to the value ‘00’.
  • TABLE 24
    Value ISSY mode
    00 Not used
    01 ISSY-UP
    10 ISSY-BBF
    11 reserved
  • HC_MODE_TS: This 2-bit field indicates the TS header compression mode used by the associated DP when DP_PAYLOAD_TYPE is set to TS (‘00’). The HC_MODE_TS is signaled according to the below table 25.
  • TABLE 25
    Value Header compression mode
    00 HC_MODE_TS 1
    01 HC_MODE_TS 2
    10 HC_MODE_TS 3
    11 HC_MODE_TS 4
  • HC_MODE_IP: This 2-bit field indicates the IP header compression mode when DP_PAYLOAD_TYPE is set to IP (‘01’). The HC_MODE_IP is signaled according to the below table 26.
  • TABLE 26
    Value Header compression mode
    00 No compression
    01 HC_MODE_IP 1
    10~11 reserved
  • PID: This 13-bit field indicates the PID number for TS header compression when DP_PAYLOAD_TYPE is set to TS (‘00’) and HC_MODE_TS is set to ‘01’ or ‘10’.
  • RESERVED: This 8-bit field is reserved for future use.
  • The following field appears only if FIC_FLAG is equal to ‘1’:
  • FIC_VERSION: This 8-bit field indicates the version number of the FIC.
  • FIC_LENGTH_BYTE: This 13-bit field indicates the length, in bytes, of the FIC.
  • RESERVED: This 8-bit field is reserved for future use.
  • The following field appears only if AUX_FLAG is equal to ‘1’:
  • NUM_AUX: This 4-bit field indicates the number of auxiliary streams. Zero means no auxiliary streams are used.
  • AUX_CONFIG_RFU: This 8-bit field is reserved for future use.
  • AUX_STREAM_TYPE: This 4-bit field is reserved for future use for indicating the type of the current auxiliary stream.
  • AUX_PRIVATE_CONFIG: This 28-bit field is reserved for future use for signaling auxiliary streams.
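  • Because every PLS2-STAT field above has a fixed width, the DP loop can be consumed with a simple bit reader. The following Python sketch is illustrative only (the BitReader class and the subset of fields shown follow the descriptions above, but the names and structure are hypothetical, not a normative parser):
    class BitReader:
        """Minimal MSB-first bit reader over a bytes object."""
        def __init__(self, data):
            self.data, self.pos = data, 0

        def read(self, nbits):
            value = 0
            for _ in range(nbits):
                byte = self.data[self.pos // 8]
                value = (value << 1) | ((byte >> (7 - self.pos % 8)) & 1)
                self.pos += 1
            return value

    def parse_dp_common_fields(r):
        # Field widths follow the PLS2-STAT descriptions above.
        return {
            "DP_ID": r.read(6),
            "DP_TYPE": r.read(3),
            "DP_GROUP_ID": r.read(8),
            "BASE_DP_ID": r.read(6),
            "DP_FEC_TYPE": r.read(2),
            "DP_COD": r.read(4),
            "DP_MOD": r.read(4),
            "DP_SSD_FLAG": r.read(1),
        }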
  • FIG. 14 illustrates PLS2 data according to another embodiment of the present invention.
  • FIG. 14 illustrates PLS2-DYN data of the PLS2 data. The values of the PLS2-DYN data may change during the duration of one frame-group, while the size of fields remains constant.
  • The details of fields of the PLS2-DYN data are as follows:
  • FRAME_INDEX: This 5-bit field indicates the frame index of the current frame within the super-frame. The index of the first frame of the super-frame is set to ‘0’.
  • PLS_CHANGE_COUNTER: This 4-bit field indicates the number of super-frames ahead where the configuration will change. The next super-frame with changes in the configuration is indicated by the value signaled within this field. If this field is set to the value ‘0000’, it means that no scheduled change is foreseen: e.g., the value ‘0001’ indicates that there is a change in the next super-frame.
  • FIC_CHANGE_COUNTER: This 4-bit field indicates the number of super-frames ahead where the configuration (i.e., the contents of the FIC) will change. The next super-frame with changes in the configuration is indicated by the value signaled within this field. If this field is set to the value ‘0000’, it means that no scheduled change is foreseen: e.g. value ‘0001’ indicates that there is a change in the next super-frame.
  • RESERVED: This 16-bit field is reserved for future use.
  • The following fields appear in the loop over NUM_DP, which describe the parameters associated with the DP carried in the current frame.
  • DP_ID: This 6-bit field uniquely identifies the DP within a PHY profile.
  • DP_START: This 15-bit (or 13-bit) field indicates the start position of the first of the DPs using the DPU addressing scheme. The DP_START field has differing length according to the PHY profile and FFT size as shown in the below table 27.
  • TABLE 27
    DP_START field size
    PHY profile   64K      16K
    Base          13 bit   15 bit
    Handheld      —        13 bit
    Advanced      13 bit   15 bit
  • DP_NUM_BLOCK: This 10-bit field indicates the number of FEC blocks in the current TI group for the current DP. The value of DP_NUM_BLOCK ranges from 0 to 1023.
  • RESERVED: This 8-bit field is reserved for future use.
  • The following fields indicate the parameters associated with the EAC.
  • EAC_FLAG: This 1-bit field indicates the existence of the EAC in the current frame. This bit is the same value as the EAC_FLAG in the preamble.
  • EAS_WAKE_UP_VERSION_NUM: This 8-bit field indicates the version number of a wake-up indication.
  • If the EAC_FLAG field is equal to ‘1’, the following 12 bits are allocated for EAC_LENGTH_BYTE field. If the EAC_FLAG field is equal to ‘0’, the following 12 bits are allocated for EAC_COUNTER.
  • EAC_LENGTH_BYTE: This 12-bit field indicates the length, in bytes, of the EAC.
  • EAC_COUNTER: This 12-bit field indicates the number of the frames before the frame where the EAC arrives.
  • The following field appears only if the AUX_FLAG field is equal to ‘1’:
  • AUX_PRIVATE_DYN: This 48-bit field is reserved for future use for signaling auxiliary streams. The meaning of this field depends on the value of AUX_STREAM_TYPE in the configurable PLS2-STAT.
  • CRC_32: A 32-bit error detection code, which is applied to the entire PLS2.
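  • The EAC-related fields above illustrate a conditional layout: the same 12 bits carry either EAC_LENGTH_BYTE or EAC_COUNTER depending on EAC_FLAG. Reusing the illustrative BitReader sketch from the PLS2-STAT discussion (again hypothetical, not a normative parser):
    def parse_eac_fields(r):
        # EAC_FLAG selects which 12-bit field follows it.
        fields = {"EAC_FLAG": r.read(1),
                  "EAS_WAKE_UP_VERSION_NUM": r.read(8)}
        if fields["EAC_FLAG"] == 1:
            fields["EAC_LENGTH_BYTE"] = r.read(12)  # length of the EAC in bytes
        else:
            fields["EAC_COUNTER"] = r.read(12)      # frames before the EAC arrives
        return fields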
  • FIG. 15 illustrates a logical structure of a frame according to an embodiment of the present invention.
  • As mentioned above, the PLS, EAC, FIC, DPs, auxiliary streams and dummy cells are mapped into the active carriers of the OFDM symbols in the frame. The PLS1 and PLS2 are first mapped into one or more FSS(s). After that, EAC cells, if any, are mapped immediately following the PLS field, followed next by FIC cells, if any. The DPs are mapped next after the PLS, or after the EAC or FIC, if any. Type 1 DPs follow first, and Type 2 DPs next. The details of the DP types will be described later. In some cases, DPs may carry some special data for EAS or service signaling data. The auxiliary stream or streams, if any, follow the DPs, which in turn are followed by dummy cells. Mapped all together in the above-mentioned order, i.e., PLS, EAC, FIC, DPs, auxiliary streams and dummy data cells, they exactly fill the cell capacity in the frame.
  • FIG. 16 illustrates PLS mapping according to an embodiment of the present invention.
  • PLS cells are mapped to the active carriers of FSS(s). Depending on the number of cells occupied by the PLS, one or more symbols are designated as FSS(s), and the number of FSS(s), NFSS, is signaled by NUM_FSS in PLS1. The FSS is a special symbol for carrying PLS cells. Since robustness and latency are critical issues in the PLS, the FSS(s) have a higher density of pilots, allowing fast synchronization and frequency-only interpolation within the FSS.
  • PLS cells are mapped to active carriers of the NFSS FSS(s) in a top-down manner as shown in an example in FIG. 17. The PLS1 cells are mapped first from the first cell of the first FSS in an increasing order of the cell index. The PLS2 cells follow immediately after the last cell of the PLS1 and mapping continues downward until the last cell index of the first FSS. If the total number of required PLS cells exceeds the number of active carriers of one FSS, mapping proceeds to the next FSS and continues in exactly the same manner as the first FSS.
  • After PLS mapping is completed, DPs are carried next. If EAC, FIC or both are present in the current frame, they are placed between PLS and “normal” DPs.
  • FIG. 17 illustrates EAC mapping according to an embodiment of the present invention.
  • EAC is a dedicated channel for carrying EAS messages and links to the DPs for EAS. EAS support is provided but EAC itself may or may not be present in every frame. EAC, if any, is mapped immediately after the PLS2 cells. EAC is not preceded by any of the FIC, DPs, auxiliary streams or dummy cells other than the PLS cells. The procedure of mapping the EAC cells is exactly the same as that of the PLS.
  • The EAC cells are mapped from the next cell of the PLS2 in increasing order of the cell index as shown in the example in FIG. 17. Depending on the EAS message size, EAC cells may occupy a few symbols, as shown in FIG. 17.
  • EAC cells follow immediately after the last cell of the PLS2, and mapping continues downward until the last cell index of the last FSS. If the total number of required EAC cells exceeds the number of remaining active carriers of the last FSS, mapping proceeds to the next symbol and continues in exactly the same manner as for the FSS(s). The next symbol for mapping in this case is a normal data symbol, which has more active carriers than an FSS.
  • After EAC mapping is completed, the FIC is carried next, if any exists. If FIC is not transmitted (as signaled in the PLS2 field), DPs follow immediately after the last cell of the EAC.
  • FIG. 18 illustrates FIC mapping according to an embodiment of the present invention.
  • (a) shows an example mapping of FIC cells without an EAC and (b) shows an example mapping of FIC cells with an EAC.
  • FIC is a dedicated channel for carrying cross-layer information to enable fast service acquisition and channel scanning. This information primarily includes channel binding information between DPs and the services of each broadcaster. For fast scan, a receiver can decode FIC and obtain information such as broadcaster ID, number of services, and BASE_DP_ID. For fast service acquisition, in addition to FIC, base DP can be decoded using BASE_DP_ID. Other than the content it carries, a base DP is encoded and mapped to a frame in exactly the same way as a normal DP. Therefore, no additional description is required for a base DP. The FIC data is generated and consumed in the Management Layer. The content of FIC data is as described in the Management Layer specification.
  • The FIC data is optional and the use of FIC is signaled by the FIC_FLAG parameter in the static part of the PLS2. If FIC is used, FIC_FLAG is set to ‘1’ and the signaling field for FIC is defined in the static part of PLS2. Signaled in this field are FIC_VERSION, and FIC_LENGTH_BYTE. FIC uses the same modulation, coding and time interleaving parameters as PLS2. FIC shares the same signaling parameters such as PLS2_MOD and PLS2_FEC. FIC data, if any, is mapped immediately after PLS2 or EAC if any. FIC is not preceded by any normal DPs, auxiliary streams or dummy cells. The method of mapping FIC cells is exactly the same as that of EAC which is again the same as PLS.
  • If there is no EAC after the PLS, FIC cells are mapped from the next cell of the PLS2 in an increasing order of the cell index as shown in the example in (a). Depending on the FIC data size, FIC cells may be mapped over a few symbols, as shown in (b).
  • FIC cells follow immediately after the last cell of the PLS2, and mapping continues downward until the last cell index of the last FSS. If the total number of required FIC cells exceeds the number of remaining active carriers of the last FSS, mapping proceeds to the next symbol and continues in exactly the same manner as FSS(s). The next symbol for mapping in this case is the normal data symbol which has more active carriers than a FSS.
  • If EAS messages are transmitted in the current frame, EAC precedes FIC, and FIC cells are mapped from the next cell of the EAC in an increasing order of the cell index as shown in (b).
  • After FIC mapping is completed, one or more DPs are mapped, followed by auxiliary streams, if any, and dummy cells.
  • FIG. 19 illustrates an FEC structure according to an embodiment of the present invention.
  • FIG. 19 illustrates an FEC structure according to an embodiment of the present invention before bit interleaving. As mentioned above, the Data FEC encoder may perform FEC encoding on the input BBF to generate the FECBLOCK using outer coding (BCH) and inner coding (LDPC). The illustrated FEC structure corresponds to the FECBLOCK. Also, the FECBLOCK and the FEC structure have the same size, corresponding to the length of the LDPC codeword.
  • The BCH encoding is applied to each BBF (Kbch bits), and then LDPC encoding is applied to BCH-encoded BBF (Kldpc bits=Nbch bits) as illustrated in FIG. 19.
  • The value of Nldpc is either 64800 bits (long FECBLOCK) or 16200 bits (short FECBLOCK).
  • The below table 28 and table 29 show FEC encoding parameters for a long FECBLOCK and a short FECBLOCK, respectively.
  • TABLE 28
    LDPC Rate   Nldpc   Kldpc   Kbch    BCH error correction capability   Nbch − Kbch
    5/15        64800   21600   21408   12                                192
    6/15        64800   25920   25728   12                                192
    7/15        64800   30240   30048   12                                192
    8/15        64800   34560   34368   12                                192
    9/15        64800   38880   38688   12                                192
    10/15       64800   43200   43008   12                                192
    11/15       64800   47520   47328   12                                192
    12/15       64800   51840   51648   12                                192
    13/15       64800   56160   55968   12                                192
  • TABLE 29
    LDPC Rate   Nldpc   Kldpc   Kbch    BCH error correction capability   Nbch − Kbch
    5/15        16200   5400    5232    12                                168
    6/15        16200   6480    6312    12                                168
    7/15        16200   7560    7392    12                                168
    8/15        16200   8640    8472    12                                168
    9/15        16200   9720    9552    12                                168
    10/15       16200   10800   10632   12                                168
    11/15       16200   11880   11712   12                                168
    12/15       16200   12960   12792   12                                168
    13/15       16200   14040   13872   12                                168
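  • Tables 28 and 29 can be transcribed directly into lookup tables. The sketch below (illustrative only) recovers the BCH and LDPC parity sizes for a given rate; note that Nbch = Kldpc, so the BCH parity size Nbch − Kbch is 192 bits for the long FECBLOCK and 168 bits for the short FECBLOCK:
    # (Kldpc, Kbch) per code rate, keyed by Nldpc; values transcribed
    # from tables 28 (long, Nldpc = 64800) and 29 (short, Nldpc = 16200).
    FEC_PARAMS = {
        64800: {"5/15": (21600, 21408), "6/15": (25920, 25728),
                "7/15": (30240, 30048), "8/15": (34560, 34368),
                "9/15": (38880, 38688), "10/15": (43200, 43008),
                "11/15": (47520, 47328), "12/15": (51840, 51648),
                "13/15": (56160, 55968)},
        16200: {"5/15": (5400, 5232), "6/15": (6480, 6312),
                "7/15": (7560, 7392), "8/15": (8640, 8472),
                "9/15": (9720, 9552), "10/15": (10800, 10632),
                "11/15": (11880, 11712), "12/15": (12960, 12792),
                "13/15": (14040, 13872)},
    }

    def fec_parity_sizes(nldpc, rate):
        """Return (BCH parity size, LDPC parity size) for a rate/length pair."""
        kldpc, kbch = FEC_PARAMS[nldpc][rate]
        # Nbch = Kldpc, so BCH parity = Kldpc - Kbch; LDPC parity = Nldpc - Kldpc.
        return kldpc - kbch, nldpc - kldpc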
  • The details of operations of the BCH encoding and LDPC encoding are as follows:
  • A 12-error correcting BCH code is used for the outer encoding of the BBF. The BCH generator polynomials for the short FECBLOCK and the long FECBLOCK are obtained by multiplying together all of their constituent polynomials.
  • The LDPC code is used to encode the output of the outer BCH encoding. To generate a completed Bldpc (FECBLOCK), Pldpc (parity bits) is encoded systematically from each Ildpc (BCH-encoded BBF) and appended to Ildpc. The completed Bldpc (FECBLOCK) is expressed by the following Math Figure.

  • $B_{ldpc} = [I_{ldpc}\ P_{ldpc}] = [i_0, i_1, \ldots, i_{K_{ldpc}-1}, p_0, p_1, \ldots, p_{N_{ldpc}-K_{ldpc}-1}]$  [Math Figure 2]
  • The parameters for long FECBLOCK and short FECBLOCK are given in the above table 28 and 29, respectively.
  • The detailed procedure to calculate the Nldpc−Kldpc parity bits for the long FECBLOCK is as follows:
  • 1) Initialize the parity bits,

  • $p_0 = p_1 = p_2 = \cdots = p_{N_{ldpc}-K_{ldpc}-1} = 0$  [Math Figure 3]
  • 2) Accumulate the first information bit $i_0$ at the parity bit addresses specified in the first row of the addresses of the parity check matrix. The details of the addresses of the parity check matrix will be described later. For example, for rate 13/15:

  • $p_{983} = p_{983} \oplus i_0 \qquad p_{2815} = p_{2815} \oplus i_0$
  • $p_{4837} = p_{4837} \oplus i_0 \qquad p_{4989} = p_{4989} \oplus i_0$
  • $p_{6138} = p_{6138} \oplus i_0 \qquad p_{6458} = p_{6458} \oplus i_0$
  • $p_{6921} = p_{6921} \oplus i_0 \qquad p_{6974} = p_{6974} \oplus i_0$
  • $p_{7572} = p_{7572} \oplus i_0 \qquad p_{8260} = p_{8260} \oplus i_0$
  • $p_{8496} = p_{8496} \oplus i_0$  [Math Figure 4]
  • 3) For the next 359 information bits $i_s$, $s = 1, 2, \ldots, 359$, accumulate $i_s$ at the parity bit addresses given by the following Math Figure.

  • $\{x + (s \bmod 360) \times Q_{ldpc}\} \bmod (N_{ldpc} - K_{ldpc})$  [Math Figure 5]
  • where x denotes the address of the parity bit accumulator corresponding to the first bit i0, and Qldpc is a code rate dependent constant specified in the addresses of parity check matrix. Continuing with the example, Qldpc=24 for rate 13/15, so for information bit i1, the following operations are performed:

  • $p_{1007} = p_{1007} \oplus i_1 \qquad p_{2839} = p_{2839} \oplus i_1$
  • $p_{4861} = p_{4861} \oplus i_1 \qquad p_{5013} = p_{5013} \oplus i_1$
  • $p_{6162} = p_{6162} \oplus i_1 \qquad p_{6482} = p_{6482} \oplus i_1$
  • $p_{6945} = p_{6945} \oplus i_1 \qquad p_{6998} = p_{6998} \oplus i_1$
  • $p_{7596} = p_{7596} \oplus i_1 \qquad p_{8284} = p_{8284} \oplus i_1$
  • $p_{8520} = p_{8520} \oplus i_1$  [Math Figure 6]
  • 4) For the 361st information bit $i_{360}$, the addresses of the parity bit accumulators are given in the second row of the addresses of the parity check matrix. In a similar manner, the addresses of the parity bit accumulators for the following 359 information bits $i_s$, $s = 361, 362, \ldots, 719$, are obtained using Math Figure 5, where x denotes the address of the parity bit accumulator corresponding to the information bit $i_{360}$, i.e., the entries in the second row of the addresses of the parity check matrix.
  • 5) In a similar manner, for every group of 360 new information bits, a new row from the addresses of the parity check matrix is used to find the addresses of the parity bit accumulators.
  • After all of the information bits are exhausted, the final parity bits are obtained as follows:
  • 6) Sequentially perform the following operations starting with i=1

  • $p_i = p_i \oplus p_{i-1}, \quad i = 1, 2, \ldots, N_{ldpc} - K_{ldpc} - 1$  [Math Figure 7]
  • where the final content of $p_i$, $i = 0, 1, \ldots, N_{ldpc}-K_{ldpc}-1$, is equal to the parity bit $p_i$. A small implementation sketch of this procedure follows table 31 below.
  • TABLE 30
    Code Rate   Qldpc
    5/15        120
    6/15        108
    7/15        96
    8/15        84
    9/15        72
    10/15       60
    11/15       48
    12/15       36
    13/15       24
  • This LDPC encoding procedure for a short FECBLOCK is in accordance with the LDPC encoding procedure for the long FECBLOCK, except that table 30 is replaced with table 31, and the addresses of the parity check matrix for the long FECBLOCK are replaced with the addresses of the parity check matrix for the short FECBLOCK.
  • TABLE 31
    Code Rate   Qldpc
    5/15        30
    6/15        27
    7/15        24
    8/15        21
    9/15        18
    10/15       15
    11/15       12
    12/15       9
    13/15       6
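  • The accumulation procedure of steps 1) to 6) lends itself to a direct implementation. The following Python sketch is a minimal illustration (the address table addr_rows, one row per group of 360 information bits, stands for the code-rate-dependent addresses of the parity check matrix described above, and Qldpc comes from table 30 or 31):
    def ldpc_encode(info_bits, addr_rows, q_ldpc, n_ldpc, k_ldpc):
        """Systematic LDPC encoding by parity accumulation (steps 1 to 6 above)."""
        n_parity = n_ldpc - k_ldpc
        p = [0] * n_parity                      # step 1: initialize parity bits
        for s, bit in enumerate(info_bits):     # steps 2 to 5: accumulate
            row = addr_rows[s // 360]           # one address row per 360 info bits
            for x in row:
                p[(x + (s % 360) * q_ldpc) % n_parity] ^= bit
        for i in range(1, n_parity):            # step 6: running XOR over parity
            p[i] ^= p[i - 1]
        return list(info_bits) + p              # Bldpc = [Ildpc Pldpc]
  • For rate 13/15 (long FECBLOCK), the first address row would begin 983, 2815, 4837, . . . , 8496, and Qldpc = 24, reproducing the example operations shown above for $i_0$ and $i_1$.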
  • FIG. 20 illustrates time interleaving according to an embodiment of the present invention.
  • (a) to (c) show examples of TI mode.
  • The time interleaver operates at the DP level. The parameters of time interleaving (TI) may be set differently for each DP.
  • The following parameters, which appear in part of the PLS2-STAT data, configure the TI:
  • DP_TI_TYPE (allowed values: 0 or 1): Represents the TI mode; ‘0’ indicates the mode with multiple TI blocks (more than one TI block) per TI group. In this case, one TI group is directly mapped to one frame (no inter-frame interleaving). ‘1’ indicates the mode with only one TI block per TI group. In this case, the TI block may be spread over more than one frame (inter-frame interleaving).
  • DP_TI_LENGTH: If DP_TI_TYPE=‘0’, this parameter is the number of TI blocks NTI per TI group. For DP_TI_TYPE=‘1’, this parameter is the number of frames PI spread from one TI group.
  • DP_NUM_BLOCK_MAX (allowed values: 0 to 1023): Represents the maximum number of XFECBLOCKs per TI group.
  • DP_FRAME_INTERVAL (allowed values: 1, 2, 4, 8): Represents the number of frames IJUMP between two successive frames carrying the same DP of a given PHY profile.
  • DP_TI_BYPASS (allowed values: 0 or 1): If time interleaving is not used for a DP, this parameter is set to ‘1’. It is set to ‘0’ if time interleaving is used.
  • Additionally, the parameter DP_NUM_BLOCK from the PLS2-DYN data is used to represent the number of XFECBLOCKs carried by one TI group of the DP.
  • When time interleaving is not used for a DP, the following TI group, time interleaving operation, and TI mode are not considered. However, the Delay Compensation block for the dynamic configuration information from the scheduler will still be required. In each DP, the XFECBLOCKs received from the SSD/MIMO encoding are grouped into TI groups. That is, each TI group is a set of an integer number of XFECBLOCKs and will contain a dynamically variable number of XFECBLOCKs. The number of XFECBLOCKs in the TI group of index n is denoted by NxBLOCK_Group(n) and is signaled as DP_NUM_BLOCK in the PLS2-DYN data. Note that NxBLOCK_Group(n) may vary from the minimum value of 0 to the maximum value NxBLOCK_Group_MAX (corresponding to DP_NUM_BLOCK_MAX), of which the largest value is 1023.
  • Each TI group is either mapped directly onto one frame or spread over PI frames. Each TI group may also be divided into more than one TI block (NTI), where each TI block corresponds to one usage of time interleaver memory. The TI blocks within a TI group may contain slightly different numbers of XFECBLOCKs. If a TI group is divided into multiple TI blocks, it is directly mapped to only one frame. There are three options for time interleaving (in addition to the extra option of skipping time interleaving), as shown in the below table 32.
  • TABLE 32
    Modes      Descriptions
    Option-1   Each TI group contains one TI block and is mapped directly to one frame as shown in (a). This option is signaled in the PLS2-STAT by DP_TI_TYPE = ‘0’ and DP_TI_LENGTH = ‘1’ (NTI = 1).
    Option-2   Each TI group contains one TI block and is mapped to more than one frame. (b) shows an example, where one TI group is mapped to two frames, i.e., DP_TI_LENGTH = ‘2’ (PI = 2) and DP_FRAME_INTERVAL = ‘2’ (IJUMP = 2). This provides greater time diversity for low data-rate services. This option is signaled in the PLS2-STAT by DP_TI_TYPE = ‘1’.
    Option-3   Each TI group is divided into multiple TI blocks and is mapped directly to one frame as shown in (c). Each TI block may use the full TI memory, so as to provide the maximum bit-rate for a DP. This option is signaled in the PLS2-STAT by DP_TI_TYPE = ‘0’ and DP_TI_LENGTH = NTI, while PI = 1.
  • Typically, the time interleaver will also act as a buffer for DP data prior to the process of frame building. This is achieved by means of two memory banks for each DP. The first TI-block is written to the first bank. The second TI-block is written to the second bank while the first bank is being read from and so on.
  • The TI is a twisted row-column block interleaver. For the sth TI block of the nth TI group, the number of rows Nr of a TI memory is equal to the number of cells Ncells, i.e., Nr = Ncells, while the number of columns Nc is equal to the number NxBLOCK_TI(n,s).
  • FIG. 21 illustrates the basic operation of a twisted row-column block interleaver according to an embodiment of the present invention.
  • FIG. 21 (a) shows a writing operation in the time interleaver and FIG. 21 (b) shows a reading operation in the time interleaver. The first XFECBLOCK is written column-wise into the first column of the TI memory, the second XFECBLOCK is written into the next column, and so on as shown in (a). Then, in the interleaving array, cells are read out diagonal-wise. During diagonal-wise reading from the first row (rightwards along the row beginning with the left-most column) to the last row, Nr cells are read out as shown in (b). In detail, denoting by $Z_{n,s,i}$ ($i = 0, \ldots, N_rN_c-1$) the TI memory cell positions to be read sequentially, the reading process in such an interleaving array is performed by calculating the row index $R_{n,s,i}$, the column index $C_{n,s,i}$, and the associated twisting parameter $T_{n,s,i}$ by the following expression.
  • $\mathrm{GENERATE}(R_{n,s,i}, C_{n,s,i}) = \left\{\; R_{n,s,i} = i \bmod N_r, \quad T_{n,s,i} = (S_{shift} \times R_{n,s,i}) \bmod N_c, \quad C_{n,s,i} = \left(T_{n,s,i} + \lfloor i / N_r \rfloor\right) \bmod N_c \;\right\}$  [Math Figure 8]
  • where $S_{shift}$ is a common shift value for the diagonal-wise reading process regardless of $N_{xBLOCK\_TI}(n,s)$, and it is determined by $N_{xBLOCK\_TI\_MAX}$ given in the PLS2-STAT by the following expression.
  • $N'_{xBLOCK\_TI\_MAX} = \begin{cases} N_{xBLOCK\_TI\_MAX} + 1, & \text{if } N_{xBLOCK\_TI\_MAX} \bmod 2 = 0 \\ N_{xBLOCK\_TI\_MAX}, & \text{if } N_{xBLOCK\_TI\_MAX} \bmod 2 = 1 \end{cases}, \qquad S_{shift} = \dfrac{N'_{xBLOCK\_TI\_MAX} - 1}{2}$  [Math Figure 9]
  • As a result, the cell positions to be read are calculated by a coordinate as Zn,s,i=NrCn,s,i+Rn,s,i.
  • FIG. 22 illustrates an operation of a twisted row-column block interleaver according to another embodiment of the present invention.
  • More specifically, FIG. 22 illustrates the interleaving array in the TI memory for each TI group, including virtual XFECBLOCKs, when NxBLOCK_TI(0,0)=3, NxBLOCK_TI(1,0)=6, and NxBLOCK_TI(2,0)=5.
  • The variable number NxBLOCK_TI(n,s) = Nc will be less than or equal to N′xBLOCK_TI_MAX. Thus, in order to achieve single-memory deinterleaving at the receiver side, regardless of NxBLOCK_TI(n,s), the interleaving array for use in a twisted row-column block interleaver is set to the size of Nr×Nc = Ncells×N′xBLOCK_TI_MAX by inserting the virtual XFECBLOCKs into the TI memory, and the reading process is accomplished by the following expression.
  • [Math FIG. 10]
    p = 0;
    for i = 0; i < Ncells × N′xBLOCK_TI_MAX; i = i + 1
    {
        GENERATE(Rn,s,i, Cn,s,i);
        Vi = Nr × Cn,s,i + Rn,s,i;
        if Vi < Ncells × NxBLOCK_TI(n,s)
        {
            Zn,s,p = Vi; p = p + 1;
        }
    }
  • The number of TI groups is set to 3. The option of the time interleaver is signaled in the PLS2-STAT data by DP_TI_TYPE=‘0’, DP_FRAME_INTERVAL=‘1’, and DP_TI_LENGTH=‘1’, i.e., NTI=1, IJUMP=1, and PI=1. The number of XFECBLOCKs, each of which has Ncells=30 cells, per TI group is signaled in the PLS2-DYN data by NxBLOCK_TI(0,0)=3, NxBLOCK_TI(1,0)=6, and NxBLOCK_TI(2,0)=5, respectively. The maximum number of XFECBLOCKs is signaled in the PLS2-STAT data by NxBLOCK_Group_MAX, which leads to
  • $N_{xBLOCK\_Group\_MAX} / N_{TI} = N_{xBLOCK\_TI\_MAX} = 6$.
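  • Math Figures 8 to 10 can be transcribed directly. The following Python sketch is illustrative only (the variable names mirror the notation above but are otherwise hypothetical):
    def ti_read_addresses(n_cells, n_xblock_ti, n_xblock_ti_max):
        """Reading order Zn,s,p for one TI block per Math Figures 8 to 10."""
        # Math Figure 9: force N'xBLOCK_TI_MAX to be odd, then derive Sshift.
        n_max = n_xblock_ti_max + (1 if n_xblock_ti_max % 2 == 0 else 0)
        s_shift = (n_max - 1) // 2
        n_r, n_c = n_cells, n_max          # rows = cells; columns include virtual XFECBLOCKs
        z = []
        for i in range(n_r * n_c):         # Math Figure 10: scan all candidate cells
            r = i % n_r                    # Math Figure 8: row index
            t = (s_shift * r) % n_c        # twisting parameter
            c = (t + i // n_r) % n_c       # column index
            v = n_r * c + r
            if v < n_cells * n_xblock_ti:  # skip addresses in virtual XFECBLOCKs
                z.append(v)
        return z
  • For the example above (Ncells=30, NxBLOCK_TI(0,0)=3, NxBLOCK_TI_MAX=6), the call ti_read_addresses(30, 3, 6) uses N′xBLOCK_TI_MAX=7 and Sshift=3 and returns the 90 valid cell addresses in diagonal-wise reading order.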
  • FIG. 23 illustrates a diagonal-wise reading pattern of a twisted row-column block interleaver according to an embodiment of the present invention.
  • More specifically, FIG. 23 shows a diagonal-wise reading pattern from each interleaving array with parameters of N′xBLOCK_TI_MAX=7 and Sshift=(7−1)/2=3. Note that in the reading process shown as pseudocode above, if Vi≥Ncells×NxBLOCK_TI(n,s), the value of Vi is skipped and the next calculated value of Vi is used.
  • FIG. 24 illustrates interleaved XFECBLOCKs from each interleaving array according to an embodiment of the present invention.
  • FIG. 24 illustrates the interleaved XFECBLOCKs from each interleaving array with parameters of N′xBLOCK_TI_MAX=7 and Sshift=3.
  • FIG. 25 illustrates a protocol stack for providing broadcast services according to an embodiment of the present invention.
  • Broadcast services according to an embodiment of the present invention may provide additional services such as HTML5 applications, interactive service, ACR service, second screen service and personalization service as well as audio/video (A/V) data. In addition, the broadcast services according to an embodiment of the present invention may provide not only real-time (RT) services but also non-real-time (NRT) services. In the case of RT services, content for services is transmitted in real time. In the case of NRT services, content for services is transmitted in non-real time. Specifically, content for RT services may be transmitted at a time when the content for RT services is used. Content for NRT services may be transmitted prior to a time when the content for NRT services is used. In a specific embodiment, a broadcast receiving apparatus may previously receive and store content for NRT services and then use the stored content while providing the NRT services. For example, the broadcast receiving apparatus may previously receive and store the content for the NRT services and provide the NRT services using the stored content when user input for the NRT services is received. NRT services and RT services have different transport characteristics and thus may be transmitted through different transport protocols. Content for NRT services may be referred to as NRT data.
  • Such broadcast services may be transmitted through a broadcast network using ground waves, cables, satellites or the like. Here, the broadcast network using ground waves, cables, satellites or the like may be referred to as a physical layer. When NRT services are transmitted through a broadcast network in a specific embodiment, content for the NRT services may be transmitted through a data carousel. Specifically, a broadcast transmitting apparatus may periodically transmit NRT content at predetermined intervals, and a broadcast receiving apparatus may receive the data after waiting at most one rotation period. Accordingly, even when the broadcast receiving apparatus starts receiving a broadcast service in the middle of a content transmission, the broadcast receiving apparatus may receive the content transmitted before reception of the broadcast service in the next period. Therefore, the broadcast receiving apparatus may receive NRT services even through a broadcast network corresponding to unidirectional communication. Furthermore, the broadcast services according to an embodiment of the present invention may be transmitted through the Internet (broadband).
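  • As a simple illustration of the carousel timing (a hypothetical model, not part of any specification), the next time a given item will be received can be computed from the rotation period:
    def next_carousel_arrival(now, item_offset, period):
        """Next transmission time of an item sent at item_offset within each
        rotation period; the worst-case wait is one full period."""
        elapsed = (now - item_offset) % period
        return now + (period - elapsed) % period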
  • The broadcast transmitting apparatus may encapsulate a broadcast service according to IP (Internet protocol) and transmit the encapsulated broadcast service through a broadcast network. Accordingly, when the broadcast service is transmitted through a broadcast network using ground waves, cables, satellites or the like, the broadcast receiving apparatus may demodulate a broadcast signal to extract IP packets. The broadcast receiving apparatus may extract user datagram protocol (UDP) packets from the IP packets. The broadcast receiving apparatus may extract asynchronous layered coding/layered coding transport (ALC/LCT) packets based on the Real-time Object delivery over Unidirectional Transport (ROUTE) protocol from the UDP packets. The ROUTE protocol is an application layer protocol for transmitting RT data using ALC/LCT packets. The broadcast receiving apparatus may extract at least one of broadcast service signaling information, NRT data and media content from the ALC/LCT packets. Here, the media content may have the MPEG-DASH (Dynamic Adaptive Streaming over HTTP) format. Specifically, the media content may be encapsulated into the ISO base media file format (ISO BMFF) and transmitted through the MPEG-DASH protocol. The broadcast receiving apparatus may extract MPEG-DASH segments from the ROUTE packets. In addition, the broadcast receiving apparatus may extract ISO BMFF files from the MPEG-DASH segments.
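  • Schematically, the broadcast-side extraction chain above unwraps one encapsulation per layer. The following Python sketch is purely structural (fixed-size header stripping stands in for real IPv4/UDP parsing, and ALC/LCT object reassembly is reduced to a single step):
    def strip_ipv4_header(packet):
        # Schematic: assumes a 20-byte IPv4 header without options.
        return packet[20:]

    def strip_udp_header(payload):
        # The UDP header is a fixed 8 bytes.
        return payload[8:]

    def extract_dash_segments(ip_packets):
        """Unwrap IP -> UDP -> ALC/LCT (ROUTE) to recover delivery objects."""
        segments = []
        for pkt in ip_packets:
            alc_lct = strip_udp_header(strip_ipv4_header(pkt))
            # A real receiver would parse the ALC/LCT headers and reassemble
            # multi-packet delivery objects here; this sketch treats each
            # payload as one complete MPEG-DASH segment.
            segments.append(alc_lct)
        return segments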
  • For legacy broadcast receiving apparatuses, the broadcast transmitting apparatus may transmit broadcast services encapsulated through MPEG-2 TS over the broadcast network along with the broadcast services encapsulated through IP.
  • When a broadcast service is transmitted through a broadband, the broadcast transmitting apparatus may multicast or unicast the broadcast service. Here, the broadcast receiving apparatus may receive IP packets from the broadband. When the broadcast transmitting apparatus unicasts the broadcast service, the broadcast receiving apparatus may extract TCP packets from the IP packets. The broadcast receiving apparatus may extract HTTP packets from the TCP packets. The broadcast receiving apparatus may extract at least one of broadcast service signaling information, NRT data and media content from the HTTP packets. Here, the media content may be in the MPEG-DASH (Dynamic Adaptive Streaming over HTTP) format. Specifically, the media content may be encapsulated into the ISO base media file format (ISO BMFF) and transmitted through the MPEG-DASH protocol. The broadcast receiving apparatus may extract MPEG-DASH segments from the HTTP packets. In addition, the broadcast receiving apparatus may extract ISO BMFF files from the MPEG-DASH segments. A broadcast receiving apparatus that receives broadcast services transmitted according to the protocol stack for providing broadcast services will be described with reference to FIG. 26.
  • FIG. 26 is a block diagram of a broadcast transmitting apparatus for transmitting broadcast services, a content server for transmitting content related to broadcast services, a broadcast receiving apparatus for receiving broadcast services and a companion apparatus interoperating with the broadcast receiving apparatus according to an embodiment of the present invention.
  • The broadcast transmitting apparatus 10 transmits broadcast services. Specifically, the broadcast transmitting apparatus 10 transmits broadcast services including media content through a broadcast network using at least one of a satellite, ground waves and cable. More specifically, the broadcast transmitting apparatus 10 may include a controller (not shown) and a transmitting unit (not shown). The controller controls operations of the broadcast transmitting apparatus 10. The transmitting unit transmits broadcast signals.
  • The broadcast receiving apparatus 100 receives broadcast services. The broadcast receiving apparatus 100 may include a broadcast receiving unit 110, an IP transceiver 130, a controller 150, a display unit 180 and a power supply 190.
  • The broadcast receiving unit 110 receives broadcast signals through a broadcast network. Specifically, the broadcast receiving unit 110 may receive broadcast signals through a broadcast network using at least one of a satellite, ground waves and cable. More specifically, the broadcast receiving unit 110 may include a tuner for receiving broadcast signals. In addition, the broadcast receiving unit 110 may include a demodulator for demodulating broadcast signals to extract link layer data. The broadcast receiving unit 110 may include one or more processors for respectively executing a plurality of functions of the broadcast receiving unit 110, one or more circuits, and one or more hardware modules. Specifically, the broadcast receiving unit 110 may be a system-on-chip (SOC) into which various semiconductor components are integrated. Here, the SOC may be a semiconductor chip into which various components for multimedia such as graphics, audio, video and modem, processors, DRAMs and so on are integrated.
  • The IP transceiver 130 may transmit and receive IP data. Specifically, the IP transceiver 130 may transmit a request to a content server 50 that provides content related to broadcast services. In addition, the IP transceiver 130 may receive a response to the request from the content server 50. The IP transceiver 130 may transmit data to the companion apparatus 300 and receive data from the companion apparatus 300. Specifically, the IP transceiver 130 may transmit a request to the companion apparatus 300 and receive a response to the request from the companion apparatus 300. In addition, the IP transceiver 130 may transmit content related to broadcast services to the companion apparatus 300. The IP transceiver 130 may include one or more processors for respectively executing a plurality of functions of the IP transceiver 130, one or more circuits, and one or more hardware modules. Specifically, the IP transceiver 130 may be an SOC into which various semiconductor components are integrated.
  • The controller 150 may include a main processor 151, a user input receiver 153, a memory unit 155, a storage 157 and a multimedia module 159. The main processor 151 controls the overall operation of the controller. The user input receiver 153 receives user input. The memory unit 155 temporarily stores data for the operation of the controller 150. Specifically, the memory unit 155 may be a volatile memory. The storage 157 stores data necessary for the operation of the controller 150. Specifically, the storage 157 may be a nonvolatile memory. The multimedia module 159 processes media content. The controller 150 may include one or more processors for respectively executing a plurality of functions of the controller 150, one or more circuits, and one or more hardware modules. Specifically, the controller may be an SOC into which various semiconductor components are integrated.
  • The display unit 180 displays images.
  • The power supply 190 supplies power necessary for operations of the broadcast receiving apparatus 100.
  • The companion apparatus 300 interoperates with the broadcast receiving apparatus 100. Specifically, the companion apparatus 300 provides information about broadcast services received by the broadcast receiving apparatus 100 by interoperating with the broadcast receiving apparatus 100. The companion apparatus 300 may include an IP transceiver 330, a controller 350, a display unit 380 and a power supply 390.
  • The IP transceiver 330 may transmit and receive IP data. Specifically, the IP transceiver 330 may transmit a request to the content server 50 that provides content related to broadcast services. In addition, the IP transceiver 330 may receive a response to the request from the content server 50. The IP transceiver 330 may transmit data to the broadcast receiving apparatus 100 and receive data from the broadcast receiving apparatus 100. Specifically, the IP transceiver 330 may transmit a request to the broadcast receiving apparatus 100 and receive a response to the request from the broadcast receiving apparatus 100. In addition, the IP transceiver 330 may receive content related to broadcast services from the broadcast receiving apparatus 100. The IP transceiver 330 may include one or more processors for respectively executing a plurality of functions of the IP transceiver 330, one or more circuits, and one or more hardware modules. Specifically, the IP transceiver 330 may be an SOC into which various semiconductor components are integrated.
  • The controller 350 may include a main processor 351, a user input receiver 353, a memory unit 355, a storage 357 and a multimedia module 359. The main processor 351 controls the overall operation of the controller. The user input receiver 353 receives user input. The memory unit 355 temporarily stores data for the operation of the controller 350. Specifically, the memory unit 355 may be a volatile memory. The storage 357 stores data necessary for the operation of the controller 350. Specifically, the storage 357 may be a nonvolatile memory. The multimedia module 359 processes media content. The controller 350 may include one or more processors for respectively executing a plurality of functions of the controller 350, one or more circuits, and one or more hardware modules. Specifically, the controller may be an SOC into which various semiconductor components are integrated.
  • The display unit 380 displays images.
  • The power supply 390 supplies power necessary for operations of the companion apparatus 300.
  • As described above, ESG data may include the start time, end time, titles and summary of content, parental rating, genres and information on appearances of programs. In addition, the ESG data may include at least one of provision information indicating provision for viewing content, information for interactivity service, purchase information related to content and information for accessing content.
  • To efficiently transmit such information, the broadcast transmitting apparatus 10 may structure ESG data in units of information and transmit the structured information. The broadcast transmitting apparatus 10 may classify the ESG data by the types of information included in the ESG data and transmit the ESG data. Specifically, the broadcast transmitting apparatus 10 may classify the ESG data into ESG data including information indicating broadcast services, ESG data including information indicating one or more content items included in a broadcast service, and ESG data including information indicating the schedule of at least one of a service and content, and transmit the classified ESG data. Here, the unit of information may be referred to as a service guide fragment or a fragment. In addition, the ESG data including information indicating broadcast services may be referred to as a service fragment. The ESG data including information indicating one or more content items included in a broadcast service may be referred to as a content fragment. The ESG data including information indicating the schedule of at least one of a service and content may be referred to as a schedule fragment. The broadcast receiving apparatus 100 may process ESG data based on the unit of structured information. Specifically, the broadcast receiving apparatus 100 may process ESG data based on at least one of the service fragment, content fragment and schedule fragment. When the broadcast receiving apparatus 100 processes data based on a fragment in this manner, the broadcast receiving apparatus 100 may selectively process necessary data. Accordingly, the broadcast receiving apparatus 100 may selectively request necessary data through a communication network available for interactive communication. In addition, the broadcast receiving apparatus 100 may efficiently transmit data necessary for the companion apparatus 300 to the companion apparatus 300. The service fragment will be described first with reference to FIGS. 27 to 31.
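  • As a structural illustration of this fragment model (the class names below are hypothetical, chosen only to mirror the description above), fragment-level granularity is what lets the receiver request or forward only the data it needs:
    from dataclasses import dataclass

    @dataclass
    class ServiceFragment:       # information indicating a broadcast service
        fragment_id: str
        service_name: str

    @dataclass
    class ContentFragment:       # information indicating a content item of a service
        fragment_id: str
        service_id: str          # reference back to the service fragment
        title: str

    @dataclass
    class ScheduleFragment:      # schedule of a service and/or content
        fragment_id: str
        service_id: str
        start_time: str
        end_time: str

    def select_fragments(fragments, wanted_ids):
        # The receiver can process, request, or forward (e.g. to the
        # companion apparatus 300) only the fragments actually needed.
        return [f for f in fragments if f.fragment_id in wanted_ids]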
  • The service in the service fragment refers to a broadcast service. The service may be a set of content items constituting a broadcast service. For example, the service may be a set of broadcast programs transmitted through a logical transmission channel by one broadcaster.
  • The service fragment forms a center point referred to by other fragments included in an ESG. Specifically, the broadcast receiving apparatus 100 may display a broadcast service represented by the service fragment while displaying the ESG. More specifically, the broadcast receiving apparatus 100 may chronologically arrange and display content included in broadcast services for the broadcast services. Here, the broadcast receiving apparatus 100 may display the contents of corresponding content, how the corresponding content may be viewed and when the corresponding content may be viewed.
  • Services may have various service types. Specifically, a service may be an interactive service. A service may be a unidirectional service through a broadcast network. A service may include an interactive service and a unidirectional service. Further, a service may include one or more components depending on service type. Here, the service may include a component for service functionality, which is not directly related with content included in the service. For example, a service may include purchase information for purchasing content included in the service or the service. In addition, a service may include subscription information for subscribing to content included in the service or the service. Services of various types that may be represented by services of service fragments will be described with reference to FIG. 27.
  • FIG. 27 shows values of a serviceType element included in a service fragment and types of services indicated by the values according to an embodiment of the present invention. And FIG. 28 shows an XML format of the serviceType element included in the service fragment.
  • The service fragment may include an element representing service types of the service fragment. The broadcast receiving apparatus 100 may display broadcast services to a user based on the element representing service types of the service fragment. Specifically, the broadcast receiving apparatus 100 may display an ESG based on the element representing service types of the service fragment. For example, the broadcast receiving apparatus 100 may display service types as characters depending on the element representing service types of the service fragment. In addition, the broadcast receiving apparatus 100 may display service types as icons depending on the element representing service types of the service fragment. Furthermore, the broadcast receiving apparatus 100 may display service types as graphics depending on the element representing service types of the service fragment.
  • The element representing service types may indicate at least one of a basic TV service, a basic radio service, a rights issuer service, cachecast, a file download service, software management services, a notification service, a service guide service, terminal provisioning services, an auxiliary data service, a streaming on demand service, a file download on demand service, smartcard provisioning services, a linear service, an App-based service and a companion screen service. The basic TV service represents a service including video. The basic radio service represents a service including audio without video. The rights issuer service indicates a service that issues the right of digital rights management (DRM) managing the right to present content. The cachecast represents an NRT service that downloads a broadcast service in advance of its reproduction and then presents the broadcast service. The cachecast may be called an NRT service. The file download services refer to services that require file download for service reproduction. The software management services are services for managing software. Specifically, the software management services may represent services for updating software of the broadcast receiving apparatus 100. The notification service indicates a service for signaling notifications to the broadcast receiving apparatus 100. Specifically, a broadcast service provider or a content provider may deliver a message to the broadcast receiving apparatus 100 through the notification service. The service guide service provides a service guide. Specifically, the service guide service may represent a service for receiving, by the broadcast receiving apparatus 100, ESG data providing broadcast services. The terminal provisioning services represent services for provisioning for subscribing to a service or content. Specifically, the broadcast receiving apparatus 100 may update provisioning through the terminal provisioning services to present broadcast services. The auxiliary data service provides auxiliary data related to a broadcast service. The streaming on demand service provides a streaming service at the request of a user. The file download on demand service provides file download at the request of a user. The smartcard provisioning services may be services for updating smartcard provisioning. Specifically, the broadcast receiving apparatus 100 may use a smartcard that descrambles scrambled content in order to present scrambled content. Here, the broadcast receiving apparatus 100 may update provisioning of the smartcard through the smartcard provisioning services. The linear service may be a service including continuous components of which primary content is consumed according to the schedule and time base set by a broadcaster. Here, the continuous components may be content components presented in continuous streams. The App-based service is a non-linear service that provides user interfaces and functions based on applications. The companion screen service is a service by which the broadcast receiving apparatus 100 receiving broadcast services and the companion apparatus 300 interoperating with the broadcast receiving apparatus 100 provide broadcast services together.
  • In hybrid broadcast, broadcast services may be combinations of a plurality of services. For example, a broadcast service may be both a linear service and an App-based service providing additional services through applications. In this case, a service fragment may include a plurality of elements respectively representing different service types.
  • In a specific embodiment, the element representing service types may be referred to as ServiceType, and its value may indicate the service type as follows: a value of 1 indicates the basic TV service; 2, the basic radio service; 3, the rights issuer service; 4, the cachecast service; 5, the file download services; 6, the software management services; 7, the notification service; 8, the service guide service; 9, the terminal provisioning services; 10, the auxiliary data service; 11, the streaming on demand service; 12, the file download on demand service; 13, the smartcard provisioning services; 14, the linear service; 15, the App-based service; and 16, the companion screen service.
  • In addition, the service fragment may include an element representing the range of service types of the corresponding service. The element representing the range of service types of the corresponding service may include a maximum value and a minimum value of the element representing service types. The broadcast transmitting apparatus 10 may signal the range of broadcast services to the broadcast receiving apparatus 100 through the element representing the range of service types of the corresponding service. The broadcast receiving apparatus 100 may determine types available for the corresponding service based on the element representing the range of service types of the corresponding service. The element representing the range of service types of the corresponding service may be referred to as ServiceTypeRangeType. Here, the minimum value of the element representing service types may be referred to as a minInclusive value. The maximum value of the element representing service types may be referred to as a maxInclusive value. In the embodiment of FIG. 28, ServiceTypeRangeType has a minInclusive value of 0 and a maxInclusive value of 13. Accordingly, the broadcast receiving apparatus 100 may be aware that services represented by the corresponding service fragment do not correspond to the linear service, App-based service and companion screen service.
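  • The value-to-type mapping and the ServiceTypeRangeType bounds above can be sketched as follows (illustrative only; the dictionary is transcribed from the embodiment above):
    SERVICE_TYPES = {
        1: "basic TV service", 2: "basic radio service",
        3: "rights issuer service", 4: "cachecast service",
        5: "file download services", 6: "software management services",
        7: "notification service", 8: "service guide service",
        9: "terminal provisioning services", 10: "auxiliary data service",
        11: "streaming on demand service", 12: "file download on demand service",
        13: "smartcard provisioning services", 14: "linear service",
        15: "App-based service", 16: "companion screen service",
    }

    def displayable_service_types(values, min_inclusive=0, max_inclusive=16):
        # ServiceTypeRangeType bounds the values a service fragment may carry;
        # one fragment may carry several values (e.g. linear + App-based).
        return [SERVICE_TYPES[v] for v in values
                if min_inclusive <= v <= max_inclusive and v in SERVICE_TYPES]
  • With minInclusive=0 and maxInclusive=13, as in the embodiment of FIG. 28, the values 14 to 16 fall outside the range, which is why the linear, App-based and companion screen services cannot be represented by that fragment.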
  • As described above, the broadcast receiving apparatus 100 may display services based on service types represented by the service fragment. Specifically, the broadcast receiving apparatus 100 may display services based on the element representing service types included in the service fragment. In a specific embodiment, the broadcast receiving apparatus 100 may display service types based on service types represented by the service fragment through a menu that indicates a broadcast service guide. Here, the menu indicating the broadcast service guide may be a menu indicating a plurality of broadcast services and content respectively included in the plurality of broadcast services. In another specific embodiment, the broadcast receiving apparatus 100 may display service types based on service types represented by the service fragment through a menu that indicates information of corresponding broadcast services on a screen displaying broadcast services. For example, the broadcast receiving apparatus 100 may display service names, service provision schedules and service types represented by the service fragment in the form of a bar positioned at the lower or upper part of the screen while reproducing a broadcast service. In another specific embodiment, the broadcast receiving apparatus 100 may display service types based on service types represented by the service fragment through a service list that indicates broadcast services and information representing the broadcast services. For example, the broadcast receiving apparatus 100 may display service types represented by the service fragment along with the names of corresponding services and virtual channel numbers indicating the corresponding services while displaying the service list. Embodiments will be described with reference to the following figures.
  • FIG. 29 illustrates XML data indicating the serviceType element and a user interface when a service represented by the service fragment is the linear service according to an embodiment of the present invention.
  • In the embodiment of FIG. 29, the service fragment indicates that the corresponding service is the linear service. Here, the broadcast receiving apparatus 100 may display a start time and an end time of the linear service while reproducing the service because main content included in the linear service has the start time and the end time designated by a broadcaster. Accordingly, the broadcast receiving apparatus 100 may notify a user of the start time and end time of the corresponding service based on the service fragment of ESG data without additional information.
  • FIG. 30 illustrates XML data indicating the serviceType element and a user interface when a service represented by the service fragment is the linear service and the App-based service according to an embodiment of the present invention.
  • In the embodiment of FIG. 30, the service fragment indicates that the corresponding service is the linear service and the App-based service. Here, the broadcast receiving apparatus 100 may display that there is an application included in the corresponding service while reproducing the service. When the broadcast receiving apparatus 100 receives user input for application execution from a user, the broadcast receiving apparatus 100 may execute the corresponding application. Accordingly, the broadcast receiving apparatus 100 may display that there is the application related to the service based on the service fragment of ESG data without additional information.
  • FIG. 31 illustrates XML data indicating the serviceType element and a user interface when a service represented by the service fragment is the linear service and the companion screen service according to an embodiment of the present invention.
  • In the embodiment of FIG. 31, the service fragment indicates that the corresponding service is the linear service and the companion screen service. Here, the broadcast receiving apparatus 100 may display that the corresponding service can be provided through the companion apparatus 300 while reproducing the service. Accordingly, the broadcast receiving apparatus 100 may display that the corresponding service can be provided through the companion apparatus 300 based on the service fragment of ESG data without additional information.
  • A broadcast service may include one or more components constituting the broadcast service. A component may constitute media content included in the broadcast service. In addition, the component may provide a specific function or information related to the broadcast service. For example, an application that provides information related to the broadcast service may be a component of the broadcast service.
  • As described above, hybrid broadcast may include various components. Furthermore, the components included in broadcast services may vary according to the content included in the broadcast services. The broadcast receiving apparatus 100 may selectively present various components. Accordingly, it is necessary to indicate components through ESG data such that a user may select a broadcast service and content included in the broadcast service by viewing component information. However, when ESG data does not indicate the components of broadcast services, the broadcast receiving apparatus 100 needs to acquire information about the components through broadcast service signaling information that signals a broadcast service that is currently broadcast. Accordingly, it is necessary for the broadcast receiving apparatus 100 to efficiently acquire information about the components included in broadcast services, covering not only currently broadcast components but also components to be broadcast. In addition, the broadcast receiving apparatus 100 needs to acquire the correct time at which a component is broadcast. To this end, ESG data includes component fragments indicating components included in broadcast services, which will be described with reference to the attached drawings.
  • FIG. 32 illustrates an XML format of the component fragment according to an embodiment of the present invention.
  • ESG data according to an embodiment of the present invention may include the component fragment. The component fragment includes information about components corresponding to part of a broadcast service or content. Specifically, the component fragment may include an identifier element for identifying the component fragment. In addition, the component fragment may include a version element indicating whether a component is changed. Furthermore, the component fragment may include an element indicating a valid period of the component fragment. The element indicating the valid period of the component fragment may include a start time and an end time of the valid period. The component fragment may include a component type element indicating component type. Since one component may include a plurality of properties, one component fragment may include a plurality of component type elements. The component fragment may include a component data element indicating data types included in a component. In addition, the component fragment may include an extension element reserved for future extension and specific applications.
  • In a specific embodiment as shown in FIG. 32, the identifier element may be “id” and the version element may be “version.” The element indicating the start time of the valid period of the component fragment may be “validFrom” and the end time of the valid period may be “validTo.” In addition, the component type element may be “ComponentType,” and the component data element may be “ComponentData.” The extension element may be “PrivateExt”.
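  • For illustration only, the following is a minimal sketch of how such a component fragment might be organized in XML, using the element names above. The identifier, version and validity times are hypothetical values, and whether id, version, validFrom and validTo appear as attributes or as child elements is an assumption here:
      <Component id="bcast://lge.com/Component/1" version="1"
                 validFrom="2015-04-01T09:00:00Z" validTo="2015-04-01T10:00:00Z">
        <ComponentType>...</ComponentType>  <!-- component type; values are described below -->
        <ComponentData>...</ComponentData>  <!-- content type; described with FIG. 35 -->
        <PrivateExt/>                       <!-- reserved for future extension -->
      </Component>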
  • Component types that can be represented by the component type element will be described with reference to FIG. 33.
  • FIG. 33 shows component types that can be represented by the component fragment according to an embodiment of the present invention.
  • The component fragment may indicate various types of components. Specifically, the component fragment may indicate component types through the component type element included therein. Accordingly, the broadcast receiving apparatus 100 may recognize the type of a corresponding component.
  • Specifically, the component fragment may indicate a continuous component. The continuous component is a component presented in a continuous stream. For example, the continuous component may be one of audio, video and closed captioning. In a specific embodiment, the component type element may have a value of 1 when the component fragment indicates a continuous component.
  • The component fragment may indicate an elementary component. The elementary component is a continuous component that is a single encoding. The elementary component may be an audio component. Specifically, the elementary component may be a single encoding of a sound sequence. In addition, the elementary component may be a video component. Specifically, the elementary component may be a single encoding of a picture sequence. The elementary component may be a closed caption track. When the component fragment indicates the elementary component in a specific embodiment, the component type element may have a value of 2.
  • The component fragment may indicate a composite component. The composite component is a collection of a plurality of continuous components necessary to present one scene. Specifically, the composite component is a collection of continuous components which have the same content type and represent the same scene, and which are to be combined to produce a presentation. Accordingly, the composite component is a collection of a plurality of media components combined to represent one scene. For example, the composite component may be the music, dialog and special effects necessary for complete audio. In addition, the composite component may be the right and left 3D views necessary to present 3D pictures. When the component fragment indicates the composite component in a specific embodiment, the component type element may have a value of 3.
  • The component fragment may indicate a PickOne component. The PickOne component is a collection of a plurality of alternative continuous components which represent one scene. Here, “PickOne” represents that one of a plurality of alternative continuous components may be selected and presented. Specifically, the PickOne component is a collection of a plurality of continuous components which have the same media type and represent the same scene, and one of which is selected to produce a presentation. For example, the PickOne component may be a collection of a plurality of media components encoded from the same content with different qualities. For example, the PickOne component may be a set of audio components encoded from the same sound sequence with different bitrates. Alternatively, the PickOne component may be a set of video components encoded from the same picture sequence with different bitrates. Further, the PickOne component may be a regular closed caption track and an “easy reader” closed caption track for the same dialog. When the component fragment indicates the PickOne component in a specific embodiment, the component type element may have a value of 4.
  • The component fragment may indicate a complex component. The complex component indicates either a composite component or a PickOne component. When the component fragment indicates the complex component in a specific embodiment, the component type element may have a value of 5.
  • The component fragment may indicate a presentable component. The presentable component refers to a continuous component which may be actually presented by the broadcast receiving apparatus 100. The presentable component may be an elementary component. The presentable component may be a complex component. When the component fragment indicates the presentable component in a specific embodiment, the component type element may have a value of 6.
  • The component fragment may indicate that a component is a non-real-time (NRT) file. An NRT file is a file delivered in non-real-time. Specifically, the NRT file may refer to a file which is previously downloaded by the broadcast receiving apparatus 100 before being executed and which is executed at an execution time. When the component fragment indicates that a component is an NRT file in a specific embodiment, the component type element may have a value of 7.
  • The component fragment may indicate that a component is an NRT content item. An NRT content item is a collection of NRT files which are intended to be consumed as an integrated whole by a service provider. When the component fragment indicates that a component is an NRT content item in a specific embodiment, the component type element may have a value of 8.
  • The component fragment may indicate that a component is an application. An application may be a collection of documents constituting an enhanced or interactive service. Specifically, the documents may include at least one of HTML, JavaScript, CSS, XML and multimedia files. An application may be regarded as an NRT content item. When the component fragment indicates that a component is an application in a specific embodiment, the component type element may have a value of 9.
  • The component fragment may indicate that a component is an ATSC 3.0 application. An ATSC 3.0 application represents an application executed in environments conforming to the ATSC 3.0 specification. When the component fragment indicates that a component is an ATSC 3.0 application in a specific embodiment, the component type element may have a value of 10.
  • The component fragment may indicate that a component is an on demand component. An on demand component represents a component that is delivered on demand. When the component fragment indicates that a component is an on demand component in a specific embodiment, the component type element may have a value of 11.
  • The component fragment may indicate that a component is a notification stream. A notification stream delivers notifications to synchronize actions of applications with an underlying linear time base. When the component fragment indicates that a component is a notification stream in a specific embodiment, the component type element may have a value of 12.
  • The component fragment may indicate that a component is an App-based enhancement. An App-based enhancement may include one or more notification streams. An App-based enhancement may include one or more applications. An App-based enhancement may include an NRT content item. Here, the NRT content item may be executed by an application. An App-based enhancement may include an on demand component. Here, the on demand component may be managed by an application. When the component fragment indicates that a component is an App-based enhancement in a specific embodiment, the component type element may have a value of 13.
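  • Since one component may have several of the properties above, a single component fragment may carry several component type values, as noted with FIG. 32. For illustration, the following is a hedged sketch of a fragment describing a presentable elementary video encoding, assuming the ComponentType values listed above; the identifier is hypothetical:
      <Component id="bcast://lge.com/Component/2" version="1">
        <!-- a single encoding of a picture sequence that can be presented directly -->
        <ComponentType>1</ComponentType>  <!-- continuous component -->
        <ComponentType>2</ComponentType>  <!-- elementary component -->
        <ComponentType>6</ComponentType>  <!-- presentable component -->
      </Component>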
  • FIG. 34 shows an XML format of a ComponentRangeType element included in the component fragment according to an embodiment of the present invention.
  • The component fragment may include an element indicating the range of component types that a corresponding component may have. The element indicating the range of component types that a corresponding component may have may include a minimum value and a maximum value of the values that the element indicating component types may have. The broadcast transmitting apparatus 10 may notify the broadcast receiving apparatus 100 of the range of component types through the element indicating the range of component types that a corresponding component may have. The broadcast receiving apparatus 100 may determine the component types of the corresponding component based on the element indicating the range of component types that the corresponding component may have. The element indicating the range of component types that the corresponding component may have may be referred to as ComponentRangeType. Here, the minimum value of the element indicating component types may be a minInclusive value and the maximum value thereof may be a maxInclusive value. In the embodiment shown in FIG. 34, ComponentRangeType has a minInclusive value of 0 and a maxInclusive value of 13. Accordingly, the broadcast receiving apparatus 100 may determine that the component indicated by the corresponding component fragment corresponds to one of the aforementioned component types. However, the component type element indicating component types may not represent the content types included in a component. Furthermore, to represent the content types included in a component through the component type element, too many values would need to be defined as values that the component type element may have. Accordingly, it is necessary to define a component data element indicating the content types included in a component. This will be described with reference to FIG. 35.
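  • For illustration, ComponentRangeType might be declared in an XML schema as a simple restriction, consistent with the minInclusive and maxInclusive values of FIG. 34; the xs:unsignedByte base type is an assumption:
      <xs:simpleType name="ComponentRangeType">
        <xs:restriction base="xs:unsignedByte">
          <xs:minInclusive value="0"/>
          <xs:maxInclusive value="13"/>
        </xs:restriction>
      </xs:simpleType>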
  • FIG. 35 shows an XML format of a ComponentData element included in the component fragment according to an embodiment of the present invention.
  • As described above, the component fragment may include the component data element indicating the content types included in a component. Here, the component data element may represent that a component is a video component that includes video. In addition, the component data element may represent that a component is an audio component that includes audio. The component data element may represent that a component is a closed captioning component that includes closed captioning. Further, the component data element may include a lower element depending on the content type included in a component. In a specific embodiment, an element indicating that a component is a video component may be VideoComponent. In another specific embodiment, an element indicating that a component is an audio component may be AudioComponent. In another specific embodiment, an element indicating that a component is a closed captioning component may be CCComponent. The lower elements included in the component data element will be described in detail with reference to the attached drawings.
  • FIG. 36 shows an XML format of a VideoDataType element included in the component fragment according to an embodiment of the present invention.
  • When a component is a video component, the component data element may include a video role element specifying the role of the corresponding video. The value of the video role element may be an integer. The video role element may represent a default video when the component is a presentable component. Here, the video role element may have a value of 1. In addition, the video role element may represent an alternative camera view when the component is a presentable component. Here, the video role element may have a value of 2. Furthermore, the video role element may represent an alternative video component when the component is a presentable component. Here, the video role element may have a value of 3. The video role element may represent a sign language inset when the component is a presentable component. Here, the video role element may have a value of 4. The video role element may represent a follow subject video when the component is a presentable component. Here, the video role element may have a value of 5. When the component is a composite component, the video role element may represent a base layer for scalable video encoding. Here, the video role element may have a value of 6. The video role element may represent an enhancement layer for scalable video encoding when the component is a composite component. Here, the video role element may have a value of 7. The video role element may represent a 3D video left view when the component is a composite component. Here, the video role element may have a value of 8. The video role element may represent a 3D video right view when the component is a composite component. Here, the video role element may have a value of 9. The video role element may represent 3D video depth information when the component is a composite component. Here, the video role element may have a value of 10. The video role element may represent that a media component is a video at a specific position of a picture divided into a plurality of regions when the component is a composite component. Here, the video role element may have a value of 11. The video role element may represent follow-subject metadata when the component is a composite component. Here, the video role element may have a value of 12. The follow-subject metadata may include at least one of the name, position and size of the corresponding subject. When the follow-subject function is supported by metadata in units of frames of a stream, the follow-subject metadata may represent a main video component region on which the subject is focused. In a specific embodiment, the video role element specifying the role of a video may be referred to as VideoRole.
  • When the component is a video component, the component data element may include an element specifying the range of values that the video role element may have. The role of a video component may depend on component type. Accordingly, the element specifying the range of values that the video role element may have may include an element specifying the range of values that the video role element may have when the video component is a presentable video. The element specifying the range of values that the video role element may have may include an element specifying the range of values that the video role element may have when the video component is a composite video. The element specifying the range of values that the video role element may have may include an element specifying the range of values that the video role element may have when the video component is a composite video or a presentable video. In a specific embodiment, the element specifying the range of values that the video role element may have may be VideoRoleRangeType.
  • In addition, the component data element may include a target user element representing a target user of the corresponding component. In a specific embodiment, the target user element may be TargetUserProfile. Furthermore, the component data element may include a target device element indicating a target device of the corresponding component. The target device element may represent at least one of a primary apparatus that receives broadcast service, a companion apparatus and inclusion of both the primary apparatus and the companion apparatus in the companion screen services. In a specific embodiment, the target device element may be TargetDevice.
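  • For illustration, the following is a hedged sketch of a component data element for a video component, combining the elements described above. The role value of 6 denotes a base layer for scalable video encoding per the list above; the TargetDevice value is a hypothetical code for the primary apparatus:
      <ComponentData>
        <VideoComponent>
          <VideoRole>6</VideoRole>                    <!-- base layer for scalable video encoding -->
          <TargetUserProfile>...</TargetUserProfile>  <!-- intended audience of the component -->
          <TargetDevice>0</TargetDevice>              <!-- hypothetical code: primary apparatus -->
        </VideoComponent>
      </ComponentData>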
  • FIG. 37 shows an XML format of an AudioDataType element included in the component fragment according to an embodiment of the present invention.
  • When a component is an audio component, the component data element may include an audio role element specifying the role of the audio. The value of the audio role element may be an integer. The audio role element may indicate that an audio component is a complete main audio. Here, the audio role element may have a value of 1. The audio role element may indicate that an audio component is music. Here, the audio role element may have a value of 2. The audio role element may indicate that an audio component is a dialog. Here, the audio role element may have a value of 3. The audio role element may indicate that an audio component is effects. Here, the audio role element may have a value of 4. The audio role element may indicate that an audio component is for the visually impaired. Here, the audio role element may have a value of 5. The audio role element may indicate that an audio component is for the hearing impaired. Here, the audio role element may have a value of 6. The audio role element may indicate that an audio component is a commentary. Here, the audio role element may have a value of 7.
  • Additionally, when a component is an audio component, the component data element may include an element specifying the range of values that the audio role element may have. In a specific embodiment, the element specifying the range of values that the audio role element may have may be AudioRoleRangeType.
  • Furthermore, the component data element may include a target user element indicating a target user of the corresponding component. In a specific embodiment, the target user element may be TargetUserProfile. In addition, the component data element may include a target device element indicating a target device of the corresponding component. The target device element may represent at least one of a primary apparatus that receives broadcast service, a companion apparatus and inclusion of both the primary apparatus and the companion apparatus in the companion screen services. In a specific embodiment, the target device element may be TargetDevice.
  • In addition, the component data element may include an element indicating that a component is associated with a presentable video component. Specifically, a component associated to a video component may refer to a component presented along with the video component. In addition, a component associated to a video component may refer to a component synchronized with the video component and presented along with the video component. This element may include an identifier for identifying a component fragment indicating an associated presentable video component. In a specific embodiment, this element may be associatedTo.
  • The component data element may include an element indicating the number of channels included in an audio component. In a specific embodiment, this element may be NumberOfAudioChannels.
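  • For illustration, the following is a hedged sketch of a component data element for an audio component. The element name AudioRole follows the naming of VideoRole and AudioRoleRangeType and is an assumption; identifiers and values are hypothetical:
      <ComponentData>
        <AudioComponent>
          <AudioRole>1</AudioRole>                                  <!-- complete main audio -->
          <associatedTo>bcast://lge.com/Component/1</associatedTo>  <!-- related presentable video -->
          <NumberOfAudioChannels>6</NumberOfAudioChannels>          <!-- e.g., 5.1-channel audio -->
        </AudioComponent>
      </ComponentData>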
  • FIG. 38 shows an XML format of a CCDataType element included in the component fragment according to an embodiment of the present invention.
  • When a component is a closed captioning component, the component data element may include a closed captioning role element indicating the role of a closed captioning. The value of the closed captioning role element may be an integer. The closed captioning role element may represent that a closed captioning component is a normal closed captioning. Here, the closed captioning role element may have a value of 1. The closed captioning role element may represent that a closed captioning component is an easy-reader closed captioning for kindergarteners and elementary school students. Here, the closed captioning role element may have a value of 2.
  • When a component is a closed captioning component, the component data element may include an element specifying the range of values that the closed captioning role element may have. In a specific embodiment, the element specifying the range of values that the closed captioning role element may have may be CCRoleRangeType.
  • In addition, the component data element may include a target user element indicating a target user of the corresponding component. In a specific embodiment, the target user element may be TargetUserProfile. Furthermore, the component data element may include a target device element indicating a target device of the corresponding component. The target device element may represent at least one of a primary apparatus that receives broadcast service, a companion apparatus and inclusion of both the primary apparatus and the companion apparatus in the companion screen services. In a specific embodiment, the target device element may be TargetDevice.
  • In addition, the component data element may include an element indicating that a component is associated with a presentable video component. Specifically, whether a component is associated to a video component may represent whether the component is presented along with the video component. In addition, a component associated to a video component may refer to a component synchronized with the video component and presented along with the video component. This element may include an identifier for identifying a component fragment indicating an associated presentable video component. In a specific embodiment, this element may be associatedTo.
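  • For illustration, the following is a hedged sketch of a component data element for a closed captioning component. The element name CCRole follows the naming of CCRoleRangeType and is an assumption; the identifier is hypothetical:
      <ComponentData>
        <CCComponent>
          <CCRole>2</CCRole>                                        <!-- easy-reader closed captioning -->
          <associatedTo>bcast://lge.com/Component/1</associatedTo>  <!-- related presentable video -->
        </CCComponent>
      </ComponentData>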
  • The broadcast receiving apparatus 100 may determine the role of a component based on the aforementioned component fragment. In addition, the broadcast receiving apparatus 100 may display the role of a component based on the component data element. In a specific embodiment, the broadcast receiving apparatus 100 may display the roles of the components included in content in a service guide menu. The broadcast receiving apparatus 100 may display the roles of components based on the component fragments in a bar-shaped menu positioned at the lower or upper part of the screen. In another specific embodiment, the broadcast receiving apparatus 100 may display the roles of components through a service list that indicates information about broadcast services based on the service fragments. For example, the broadcast receiving apparatus 100 may display the role of a currently broadcast component of the corresponding service along with the name of the corresponding service and a virtual channel number indicating the corresponding service while displaying the service list.
  • In addition, the broadcast receiving apparatus 100 may determine at least one of a target device and a target user profile of each component based on the component data element. The broadcast receiving apparatus 100 may display the roles of components based on the component data element. In a specific embodiment, the broadcast receiving apparatus 100 may display information about a component in the service guide menu based on at least one of a target device and a target user profile of the component. Specifically, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may display the corresponding component differently from other components. For example, when the broadcast receiving apparatus 100 does not support 3D pictures, the broadcast receiving apparatus 100 may display a video component for the 3D pictures in gray in an ESG. In another specific embodiment, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may not display the corresponding component. In a specific embodiment, the broadcast receiving apparatus 100 may display at least one of an icon, text and graphic form indicating a target user profile of the corresponding component in the service guide menu. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may display the corresponding component differently from other components. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may not display the corresponding component in the service guide menu.
  • In a specific embodiment, the broadcast receiving apparatus 100 may display information about a component in a bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented based on at least one of a target device and a target user profile of the component. Specifically, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented, differently from other components. For example, when the broadcast receiving apparatus 100 does not support 3D pictures, the broadcast receiving apparatus 100 may display a video component for the 3D pictures in gray in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented. In another specific embodiment, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may not display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented. In a specific embodiment, the broadcast receiving apparatus 100 may display at least one of an icon, text and graphic form indicating a target user profile of the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented, differently from other components. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may not display the corresponding component in the bar-shaped menu positioned at the lower or upper part of the screen on which a broadcast service is being presented.
  • In a specific embodiment, the broadcast receiving apparatus 100 may display information about a component in a broadcast service list based on at least one of a target device and a target user profile of the component. Specifically, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may display the corresponding component in the broadcast service list, differently from other components. For example, when the broadcast receiving apparatus 100 does not support 3D pictures, the broadcast receiving apparatus 100 may display a video component for the 3D pictures in gray in the broadcast service list. In another specific embodiment, when the broadcast receiving apparatus 100 does not correspond to a target device, the broadcast receiving apparatus 100 may not display the corresponding component in the broadcast service list. In a specific embodiment, the broadcast receiving apparatus 100 may display at least one of an icon, text and graphic form indicating a target user profile of the corresponding component in the broadcast service list. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may display the corresponding component in the broadcast service list, differently from other components. In another specific embodiment, when the user of the broadcast receiving apparatus 100 does not correspond to the target user profile of the corresponding component, the broadcast receiving apparatus 100 may not display the corresponding component in the broadcast service list.
  • FIG. 39 shows an embodiment in which the component fragment according to an embodiment of the present invention indicates a composite video component.
  • The component fragment in the embodiment shown in FIG. 39 represents that a composite component having an identifier of bcast://lge.com/Component/1 includes an elementary component having an identifier of bcast://lge.com/Component/2 and an elementary component having an identifier of bcast://lge.com/Component/3. The component data element of the component fragment represents that the elementary component having the identifier of bcast://lge.com/Component/2 is a base layer for scalable video encoding and the elementary component having the identifier of bcast://lge.com/Component/3 is an enhancement layer for scalable video encoding.
  • FIG. 40 shows another embodiment in which the component fragment indicates a composite video component according to an embodiment of the present invention.
  • In the embodiment shown in FIG. 40, the component fragment represents that a composite component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a PickOne component having an identifier of bcast://lge.com/Component/3. The component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is a 3D video left view and the PickOne component having the identifier of bcast://lge.com/Component/3 is a 3D video right view.
  • FIG. 41 shows another embodiment in which the component fragment according to an embodiment of the present invention indicates a PickOne audio component.
  • In the embodiment shown in FIG. 41, the component fragment represents that a PickOne component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a composite component having an identifier of bcast://lge.com/Component/3, which are alternatives. The component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is a complete main audio. In addition, the component data element of the component fragment represents that the composite component having the identifier of bcast://lge.com/Component/3 includes a PickOne component having an identifier of bcast://lge.com/Component/4 and a PickOne component having an identifier of bcast://lge.com/Component/5. Here, the component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/4 is music. In addition, the component data element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/5 is a dialog.
  • Embodiments in which ESG data includes component fragments, the broadcast transmitting apparatus 10 transmits component information and the broadcast receiving apparatus 100 receives the component information have been described. However, components may be organically correlated, and the broadcast receiving apparatus 100 needs to present components in consideration of the correlation among them. Accordingly, there is a need for a method of representing the relationships among components. This will be described below with reference to the attached drawings.
  • FIG. 42 shows an XML format of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • A content fragment includes an extension element for extensibility. The broadcast transmitting apparatus 10 may insert a component element indicating component information into the extension element. The content fragment may include the component element in the form of a sequence. Specifically, the information included in the component element may be the same as the information included in the aforementioned component fragment. Specifically, when a component is a video component, the component element may include at least one of the element representing the role of a video, the target user profile element indicating the profile of a target user of the component and the target device element indicating a target device of the component. When a component is an audio component, the component element may include at least one of the element representing the role of an audio, the target user profile element indicating the profile of a target user of the component and the target device element indicating a target device of the component. When a component is a closed captioning component, the component element may include at least one of the element representing the role of a closed captioning, the target user profile element indicating the profile of a target user of the component and the target device element indicating a target device of the component.
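  • For illustration, the following is a hedged sketch of a content fragment carrying component information in its extension element, as described above. The element names inside PrivateExt mirror the component fragment elements described earlier, and the identifier and role values are hypothetical:
      <Content id="bcast://lge.com/Content/1">
        <PrivateExt>
          <Component>
            <VideoComponent><VideoRole>1</VideoRole></VideoComponent>  <!-- default video -->
            <AudioComponent><AudioRole>1</AudioRole></AudioComponent>  <!-- complete main audio -->
            <CCComponent><CCRole>1</CCRole></CCComponent>              <!-- normal closed captioning -->
          </Component>
        </PrivateExt>
      </Content>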
  • FIGS. 43 and 44 show XML formats of embodiments of a content fragment when information about a component is included as an element of the content fragment according to another embodiment of the present invention.
  • In the embodiment shown in FIG. 43, the broadcast receiving apparatus 100 may acquire information about a video component, an audio component and a closed captioning component based on a content fragment. That is, the broadcast receiving apparatus 100 may obtain the information about the video component, audio component and closed captioning component required to present the content indicated by the content fragment. Accordingly, the broadcast receiving apparatus 100 may recognize that the video component, audio component and closed captioning component need to be presented together without additional information.
  • In the embodiment shown in FIG. 44, the broadcast receiving apparatus 100 may acquire information about a video component, an audio component and a closed captioning component based on a content fragment. Particularly, the broadcast receiving apparatus 100 may recognize that corresponding content includes a component containing a 3D video left view and a component containing a 3D video right view based on the content fragment.
  • When a content fragment includes information about all components included in content as elements, the broadcast receiving apparatus 100 may be aware of all components included in the content without an element indicating correlation among the components.
  • Previous broadcast receiving apparatuses may malfunction when a new fragment such as a component fragment is added because they do not support the component fragment. This problem may be solved when a content fragment includes component information as shown in FIGS. 42 to 44.
  • To represent a correlation between components, a component fragment indicating an audio component and a component fragment indicating a closed captioning component may refer to a fragment indicating a related video component. This will be described with reference to FIGS. 45 and 46.
  • FIG. 45 shows an XML format of component fragments when a component fragment indicating an audio component and a component fragment indicating a closed captioning component refer to a fragment indicating a related video component according to an embodiment of the present invention.
  • The component fragment indicating the audio component and the component fragment indicating the closed captioning component may include an association element representing the fragment indicating the related video component. Specifically, the related video component may indicate a video component presented along with the audio component or the closed captioning component. In addition, the related video component may indicate a video component presented along with the audio component or the closed captioning component in synchronization therewith. The association element may have an identifier for identifying the component fragment indicating the related video component. In a specific embodiment, the association element may be associatedTo.
  • FIG. 46 shows a relationship among component fragments when a component fragment indicating an audio component and a component fragment indicating a closed captioning component refer to a fragment indicating a related video component according to an embodiment of the present invention.
  • In the embodiment of FIG. 46, a component fragment of an audio component having an identifier of bcast://lge.com/Component/2 refers to a component fragment of a video component having an identifier of bcast://lge.com/Component/1. In addition, a component fragment of a closed captioning component having an identifier of bcast://lge.com/Component/3 refers to the component fragment of the video component having the identifier of bcast://lge.com/Component/1. Accordingly, the broadcast receiving apparatus 100 may acquire information about the video component associated to the audio component based on the component fragment of the audio component. In addition, the broadcast receiving apparatus 100 may acquire information about the video component associated to the closed captioning component based on the component fragment of the closed captioning component.
  • FIG. 47 shows an XML format of component fragments when a component fragment indicating a video component refers to a component fragment indicating a related audio component and a component fragment indicating a related closed captioning component according to an embodiment of the present invention.
  • The component fragment indicating the video component may include association elements that represent the component fragment indicating the related audio component and the component fragment indicating the related closed captioning component. The related audio component and the related closed captioning component may respectively represent an audio component and a closed captioning component which are presented along with the video component. Specifically, the related audio component and the related closed captioning component may respectively represent an audio component and a closed captioning component which are presented along with the video component in synchronization therewith. The association elements may have identifiers for identifying the component fragments indicating the audio component and the closed captioning component associated to the video component. In a specific embodiment, the association elements may be associatedTo.
  • FIG. 48 shows a relationship among component fragments when a component fragment indicating a video component refers to a component fragment indicating a related audio component and a component fragment indicating a related closed captioning component according to an embodiment of the present invention.
  • In the embodiment of FIG. 48, a component fragment of a video component having an identifier of bcast://lge.com/Component/1 refers to a component fragment of an audio component having an identifier of bcast://lge.com/Component/2 and a component fragment of a closed captioning component having an identifier of bcast://lge.com/Component/3. Accordingly, the broadcast receiving apparatus 100 may acquire information about the audio component and closed captioning component associated to the video component based on the component fragment of the video component.
  • A reference relationship among fragments is necessary for the broadcast receiving apparatus 100 to generate a service guide based on fragments included in ESG data. This will be described with reference to FIGS. 49 to 63.
  • FIG. 49 shows a reference relationship among fragments according to an embodiment of the present invention.
  • A content fragment representing content may refer to a service fragment indicating a service including the content. In addition, a component fragment indicating a component may refer to a content fragment representing content including the component. Furthermore, a component fragment indicating a component may refer to a service fragment representing a service including the component. A schedule fragment may refer to a service fragment representing a service corresponding to the corresponding schedule, a content fragment representing content corresponding to the schedule and a component fragment representing a component corresponding to the schedule.
  • To this end, a component fragment needs to include additional elements in addition to the aforementioned elements. This will be described with reference to FIG. 50.
  • FIG. 50 shows an XML format of a component fragment according to an embodiment of the present invention when the component fragment refers to a higher component fragment, a content fragment and a service fragment.
  • A component fragment indicating a component may include a reference service element that indicates a service fragment of a service including the component. The reference service element may have an identifier for identifying a service fragment indicating a service referred to. In a specific embodiment, the reference service element may be ServiceReference.
  • A component fragment indicating a component may include a reference content element that indicates a content fragment of content including the component. The reference content element may have an identifier for identifying a content fragment indicating content referred to. In a specific embodiment, the reference content element may be ContentReference.
  • A component fragment indicating a component may include a reference component element that indicates a component fragment of a higher component including the component. The reference component element may have an identifier for identifying the component fragment indicating the component referred to. In a specific embodiment, the reference component element may be ComponentReference.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about a higher component, content and a service including a component indicated by a component fragment based on the component fragment.
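  • For illustration, the following is a hedged sketch of a component fragment carrying the three reference elements described above. Whether the identifier is carried as element content or as an attribute is an assumption here, and the identifiers are hypothetical:
      <Component id="bcast://lge.com/Component/3">
        <ServiceReference>bcast://lge.com/Service/1</ServiceReference>        <!-- service including the component -->
        <ContentReference>bcast://lge.com/Content/1</ContentReference>        <!-- content including the component -->
        <ComponentReference>bcast://lge.com/Component/1</ComponentReference>  <!-- higher component -->
      </Component>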
  • As a component fragment is added, a schedule fragment may include an element for referring to the component fragment. This will be described with reference to FIG. 51.
  • FIG. 51 shows an XML format of a schedule fragment according to an embodiment of the present invention when the schedule fragment refers to a component fragment, a content fragment and a service fragment.
  • A schedule fragment indicating a schedule may include a reference component element that indicates a component fragment of a component corresponding to the schedule. The reference component element may have an identifier for identifying the component fragment referred to by the schedule fragment. In a specific embodiment, the reference component element may be ComponentReference.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about a component, content and a service corresponding to the schedule indicated by the schedule fragment based on the schedule fragment.
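  • For illustration, the following is a hedged sketch of a schedule fragment referring to a service fragment, a content fragment and a component fragment, per the elements described above; the identifiers are hypothetical:
      <Schedule id="bcast://lge.com/Schedule/1">
        <ServiceReference>bcast://lge.com/Service/1</ServiceReference>
        <ContentReference>bcast://lge.com/Content/1</ContentReference>
        <ComponentReference>bcast://lge.com/Component/1</ComponentReference>
      </Schedule>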
  • FIG. 52 shows a reference relationship among a service fragment, a content fragment and component fragments representing a presentable video component, a presentable audio component and a presentable closed captioning component according to an embodiment of the present invention.
  • A service may include content and the content may include a presentable video component, a presentable audio component and a presentable closed captioning component. FIG. 52 shows such a relationship among a service fragment, a content fragment and a plurality of component fragments. The component fragments respectively representing the presentable audio component, presentable closed captioning component and presentable video component refer to the content fragment representing the content including these components. In addition, the component fragment representing the presentable closed captioning component and the component fragment representing the presentable audio component are associated to the component fragment representing the presentable video component. Furthermore, the content fragment representing the content refers to the service fragment indicating a service including the content.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about correlation among the components and content including the components based on the component fragments. In addition, the broadcast receiving apparatus 100 may acquire information about a service including the content based on the content fragment representing the content.
  • FIG. 53 shows a reference relationship among a component fragment representing a composite component and component fragments representing lower components according to an embodiment of the present invention.
  • A composite component may include a plurality of components constituting one scene. The component fragments in the embodiment of FIG. 53 show such a relationship. Specifically, in the embodiment shown in FIG. 53, a component fragment representing a video component of a second enhancement layer for scalable video encoding refers to a component fragment representing a video component of a first enhancement layer. In addition, the component fragment representing the video component of the first enhancement layer refers to a component fragment representing a video component of a base layer. Furthermore, the component fragment representing the video component of the base layer refers to a component fragment representing a composite video component.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about a relationship among components constituting the composite component based on the plurality of component fragments representing the plurality of components for scalable video coding. Specifically, the broadcast receiving apparatus 100 may recognize that the video component of the first enhancement layer is necessary to present the video component of the second enhancement layer based on the component fragment representing the video component of the second enhancement layer. In addition, the broadcast receiving apparatus 100 may recognize that the video component of the base layer is necessary to present the video component of the first enhancement layer based on the component fragment representing the video component of the first enhancement layer.
  • FIG. 54 shows a reference relationship among a component fragment representing an App-based enhancement component and component fragments representing lower components according to an embodiment of the present invention.
  • As described above, the App-based enhancement component may include an NRT content item component. The NRT content item component may include an NRT file component. In addition, the App-based enhancement component may include an application component. Furthermore, the App-based enhancement component may include an on demand component. The component fragments in the embodiment of FIG. 54 show such a relationship. Specifically, in the embodiment shown in FIG. 54, a component fragment representing an NRT file component refers to a component fragment representing an NRT content item component. In addition, the component fragment representing the NRT content item component refers to a component fragment representing the App-based enhancement component. Furthermore, a component fragment representing an application component refers to the component fragment representing the App-based enhancement component. A component fragment representing an on demand component refers to the component fragment representing the App-based enhancement component.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about a relationship between the App-based enhancement component and components included in the App-based enhancement component. Specifically, the broadcast receiving apparatus 100 may recognize that the App-based enhancement component includes the NRT content item. In addition, the broadcast receiving apparatus 100 may recognize the NRT file component necessary to execute the NRT content item.
  • Distinguished from previous broadcast systems in which a service simply includes a plurality of programs, a service may include various types of content in hybrid broadcast. For example, a service may include at least one of programs, on demand content and NRT content in hybrid broadcast. Accordingly, content fragments need to represent such relationships while referring to service fragments. This will be described with reference to the attached drawings.
  • FIG. 55 shows an XML format of a content fragment according to another embodiment of the present invention when the content fragment refers to a service.
  • The content fragment may include a relationship element indicating its relationship with the service fragment referred to. The relationship element may indicate that the corresponding content is a program of the service represented by the service fragment referred to. Here, the relationship element may have a value of 1. In addition, the relationship element may indicate that the corresponding content is a content item of the service represented by the service fragment referred to. Here, the relationship element may have a value of 2. Further, the relationship element may indicate that the corresponding content is on demand content of the service represented by the service fragment referred to. On demand content refers to content executed at the request of a user. Here, the relationship element may have a value of 3. In a specific embodiment, the relationship element may be referred to as “relationship.” The broadcast receiving apparatus 100 may recognize the relationship between content and the service referred to by the content based on the relationship element.
  • In addition, the content fragment may include an element indicating the range of values that the relationship element may have. In a specific embodiment, the element indicating the range of values that the relationship element may have may be referred to as RelationshipRangeType. The broadcast receiving apparatus 100 may determine the range of values that the relationship element may have based on this element.
  • Furthermore, the content fragment may include a weight element indicating the level of importance of a service fragment referred to. In a specific embodiment, the weight element may be referred to as “weight”. The broadcast receiving apparatus 100 may determine the level of importance of content included in a service based on the weight element. In addition, the broadcast receiving apparatus 100 may display content included in a service based on the weight element. Specifically, the broadcast receiving apparatus 100 may preferentially display content having a higher level of importance included in a service based on the weight element. For example, the broadcast receiving apparatus 100 may align and display content included in a service based on weight element values corresponding thereto.
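  • For illustration, the following is a hedged sketch of a content fragment that refers to a service with the relationship and weight elements described above. Carrying relationship and weight as attributes of the ServiceReference element is an assumption here, and the identifiers and values are hypothetical:
      <Content id="bcast://lge.com/Content/1">
        <ServiceReference relationship="1" weight="3">bcast://lge.com/Service/1</ServiceReference>
        <!-- relationship 1: the content is a program of the referenced service;
             weight: relative level of importance used when ordering content for display -->
      </Content>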
  • FIG. 56 shows a reference relationship among content fragments and a service fragment according to another embodiment of the present invention.
  • A content fragment having an identifier of bcast://lge.com/Content/1 refers to a service fragment having an identifier of bcast://lge.com/Service/1 and has a relationship element value of 1. Accordingly, the broadcast receiving apparatus 100 may recognize that content indicated by the content fragment having the identifier of bcast://lge.com/Content/1 is a program of a service indicated by the service fragment having the identifier of bcast://lge.com/Service/1 based on the content fragment.
  • A content fragment having an identifier of bcast://lge.com/Content/2 refers to the service fragment having the identifier of bcast://lge.com/Service/1 and has a relationship element value of 2. Accordingly, the broadcast receiving apparatus 100 may recognize that content indicated by the content fragment having the identifier of bcast://lge.com/Content/2 is a content item of the service indicated by the service fragment having the identifier of bcast://lge.com/Service/1 based on the content fragment.
  • A content fragment having an identifier of bcast://lge.com/Content/3 refers to the service fragment having the identifier of bcast://lge.com/Service/1 and has a relationship element value of 3. Accordingly, the broadcast receiving apparatus 100 may recognize that content indicated by the content fragment having the identifier of bcast://lge.com/Content/3 is on demand content of the service indicated by the service fragment having the identifier of bcast://lge.com/Service/1 based on the content fragment.
  • In this way, the broadcast receiving apparatus 100 may recognize a relationship between content indicated by a content fragment and a service including the content based on the content fragment. Accordingly, the broadcast receiving apparatus 100 may recognize a relationship between a service and content based on ESG data without additional service signaling information.
  • However, when a lower fragment refers to a higher fragment as described above, the broadcast receiving apparatus 100 may recognize a relationship between the higher fragment and the lower fragment only after processing the lower fragment. Accordingly, when the broadcast receiving apparatus 100 intends to display only some services, the broadcast receiving apparatus 100 needs to check not only fragments indicating content and components included in those services but also other fragments. This is inefficient considering that the broadcast receiving apparatus 100 generally displays content on a per-service basis. Therefore, embodiments in which a higher fragment refers to a lower fragment will be described with reference to the attached drawings.
  • FIG. 57 illustrates a reference relationship among fragments according to another embodiment of the present invention.
  • In another embodiment of the present invention, a fragment representing a service may refer to a content fragment indicating content. In addition, a content fragment representing content including a component may refer to a component fragment representing the component. Furthermore, a component fragment representing a component may refer to a service fragment representing a service including the component. A schedule fragment may refer to a service fragment representing a service corresponding to the schedule indicated thereby, a content fragment representing content corresponding to the schedule and a component fragment representing a component corresponding to the schedule.
  • To this end, the service fragment, content fragment and component fragment need to include additional elements in addition to the aforementioned ones. This will be described with reference to FIGS. 58 to 63.
  • FIG. 58 shows an XML format of a service fragment according to another embodiment of the present invention.
  • The service fragment may include a content reference element that represents a content fragment indicating content included in a service indicated by the service fragment. Here, the content reference element may have an identifier for identifying the content fragment representing the content included in the service. In a specific embodiment, the content reference element may be ContentReference.
  • The service fragment may include a component reference element that represents a component fragment indicating a component included in the service indicated by the service fragment. Here, the component reference element may have an identifier for identifying the component fragment representing the component included in the service. In a specific embodiment, the component reference element may be ComponentReference.
  • The broadcast receiving apparatus 100 may acquire information about content included in the service represented by the service fragment based on the service fragment. In addition, the broadcast receiving apparatus 100 may acquire information about a component included in the service indicated by the service fragment based on the service fragment.
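  • A minimal sketch of such a service fragment follows. ContentReference and ComponentReference are named as in the embodiments above; the identifiers are illustrative assumptions.
    <Service id="bcast://lge.com/Service/1">
      <!-- Content fragment of content included in this service -->
      <ContentReference idRef="bcast://lge.com/Content/1"/>
      <!-- Component fragment of a component included in this service -->
      <ComponentReference idRef="bcast://lge.com/Component/1"/>
    </Service>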
  • FIG. 59 shows an XML format of a content fragment according to another embodiment of the present invention.
  • The content fragment may include a component reference element that represents a component fragment indicating a component included in content indicated by the content fragment. Here, the component reference element may have an identifier for identifying the component fragment representing the component included in the content. In a specific embodiment, the component reference element may be ComponentReference.
  • The broadcast receiving apparatus 100 may acquire information about a component included in the content represented by the content fragment based on the content fragment.
  • FIG. 60 shows an XML format of a component fragment according to another embodiment of the present invention.
  • The component fragment may include a component reference element that represents a component fragment indicating a component included in a component indicated by the component fragment. Here, the component reference element may have an identifier for identifying the component fragment representing the included component. In a specific embodiment, the component reference element may be ComponentReference.
  • The broadcast receiving apparatus 100 may acquire information about a component included in the component represented by the component fragment based on the component fragment.
  • FIG. 61 shows a reference relationship among a service fragment, a content fragment and component fragments according to another embodiment of the present invention.
  • A service may include content and the content may include a presentable video component, a presentable audio component and a presentable closed captioning component. FIG. 61 shows such a relationship among a service fragment, a content fragment and a plurality of component fragments. The service fragment refers to the content fragment representing content included in a service indicated by the service fragment. The content fragment refers to a component fragment indicating a presentable video component included in the content represented by the content fragment. While the content fragment refers to only the component fragment indicating the presentable video component in FIG. 61, the content fragment may refer to a component fragment indicating a presentable audio component and a component fragment indicating a presentable closed captioning component. The figure shows that the component fragment indicating the presentable closed captioning component and the component fragment indicating the presentable audio component are associated with the component fragment indicating the presentable video component.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about the content included in the service represented by the service fragment based on the service fragment. In addition, the broadcast receiving apparatus 100 may acquire information about the component included in the content based on the component fragment.
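  • The reference chain of FIG. 61 may be sketched as follows; the fragment element names and identifiers are assumptions for illustration.
    <Service id="bcast://lge.com/Service/1">
      <ContentReference idRef="bcast://lge.com/Content/1"/>
    </Service>
    <Content id="bcast://lge.com/Content/1">
      <!-- Presentable video component of the content -->
      <ComponentReference idRef="bcast://lge.com/Component/1"/>
    </Content>
    <!-- The presentable audio and closed captioning component fragments are associated
         with the presentable video component fragment rather than referenced directly. -->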
  • FIG. 62 shows a reference relationship among a component fragment indicating a composite component and component fragments representing lower components.
  • A composite component may include a plurality of components constituting one scene. The component fragments in the embodiment of FIG. 62 show such a relationship. Specifically, in the embodiment shown in FIG. 62, a component fragment representing a composite component refers to a component fragment representing a video component of a base layer for scalable video encoding. In addition, the component fragment representing the video component of the base layer for scalable video encoding refers to a component fragment representing a video component of a first enhancement layer. Furthermore, the component fragment representing the video component of the first enhancement layer refers to a component fragment representing a video component of a second enhancement layer.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about a relationship among components constituting the composite component based on the plurality of component fragments representing the plurality of components for scalable video coding. Specifically, the broadcast receiving apparatus 100 may recognize that the video component of the base layer and the video component of the first enhancement layer may be presented together based on the component fragment indicating the video component of the base layer. In addition, the broadcast receiving apparatus 100 may recognize that the video component of the first enhancement layer and the video component of the second enhancement layer may be presented together based on the component fragment indicating the video component of the first enhancement layer.
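  • The chained references of FIG. 62 may be sketched as follows; the identifiers and the enclosing element names are illustrative assumptions.
    <!-- Composite component refers to the base layer component -->
    <Component id="bcast://lge.com/Component/1">
      <ComponentReference idRef="bcast://lge.com/Component/2"/> <!-- base layer -->
    </Component>
    <Component id="bcast://lge.com/Component/2">
      <ComponentReference idRef="bcast://lge.com/Component/3"/> <!-- first enhancement layer -->
    </Component>
    <Component id="bcast://lge.com/Component/3">
      <ComponentReference idRef="bcast://lge.com/Component/4"/> <!-- second enhancement layer -->
    </Component>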
  • FIG. 63 shows a reference relationship among a component fragment representing an App-based enhancement component and component fragments representing lower components according to another embodiment of the present invention.
  • As described above, the App-based enhancement component may include an NRT content item component. The NRT content item component may include an NRT file component. In addition, the App-based enhancement component may include an application component. Furthermore, the App-based enhancement component may include an on demand component. The component fragments in the embodiment of FIG. 63 show such a relationship. Specifically, in the embodiment shown in FIG. 63, a component fragment representing an App-based enhancement component refers to a component fragment representing an NRT content item component. In addition, the component fragment representing the NRT content item component refers to a component fragment representing an NRT file component. In addition, the component fragment representing the App-based enhancement component refers to a component fragment representing an application component. Furthermore, the component fragment representing the App-based enhancement component refers to a component fragment representing an on demand component.
  • Accordingly, the broadcast receiving apparatus 100 may acquire information about a relationship between the App-based enhancement component and components included in the App-based enhancement component based on the component fragment representing the App-based enhancement component. Specifically, the broadcast receiving apparatus 100 may recognize that the App-based enhancement component includes the NRT content item. In addition, the broadcast receiving apparatus 100 may recognize the NRT file component necessary to execute the NRT content item.
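  • The hierarchy of FIG. 63 may be sketched as follows; the element names and identifiers are illustrative assumptions.
    <!-- App-based enhancement component -->
    <Component id="bcast://lge.com/Component/1">
      <ComponentReference idRef="bcast://lge.com/Component/2"/> <!-- NRT content item component -->
      <ComponentReference idRef="bcast://lge.com/Component/4"/> <!-- application component -->
      <ComponentReference idRef="bcast://lge.com/Component/5"/> <!-- on demand component -->
    </Component>
    <!-- NRT content item component referring to its NRT file component -->
    <Component id="bcast://lge.com/Component/2">
      <ComponentReference idRef="bcast://lge.com/Component/3"/> <!-- NRT file component -->
    </Component>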
  • As described above, a service fragment may include the component reference element representing a component fragment corresponding to a component included in the service represented by the service fragment. A content fragment may include the component reference element representing a component fragment corresponding to a component included in the content represented by the content fragment. A component fragment may include the component reference element representing a component fragment corresponding to a component included in the component represented by the component fragment. Accordingly, the broadcast receiving apparatus 100 may extract a specific service fragment, together with the content fragment and the component fragment referred to by the specific service fragment, in order to display information about a specific service from among a plurality of services. This reduces the quantity of ESG data that needs to be processed by the broadcast receiving apparatus 100 to display the information about the specific service, thereby improving the ESG data processing efficiency of the broadcast receiving apparatus 100.
  • FIGS. 64 to 66 show a syntax of a component fragment according to another embodiment of the present invention.
  • The component fragment according to another embodiment of the present invention includes information about a component that is part of a broadcast service or content.
  • The component fragment may include an identifier attribute identifying the component fragment. In a specific embodiment, the identifier attribute may be “id.”
  • In addition, the component fragment may include a version attribute indicating whether the component has changed. In a specific embodiment, the version attribute may be “version.”
  • The component fragment may include a valid period attribute indicating a valid period of the component fragment. The attribute indicating the valid period of the component fragment may include a start time and an end time of the valid period. In a specific embodiment, the attribute indicating the start time of the valid period may be “validFrom” and the attribute indicating the end time of the valid period may be “validTo.”
  • The component fragment may include a service reference element indicating a service fragment of a service including the component. The service reference element may have, as an identifier attribute, an identifier for identifying the service fragment representing the referenced service. In a specific embodiment, the service reference element may be ReferenceService. In a specific embodiment, the identifier attribute may be idRef.
  • The component fragment may include a content reference element indicating a content fragment of content including the component. The content reference element may have, as an identifier attribute, an identifier for identifying the content fragment representing the referenced content. In a specific embodiment, the content reference element may be ReferenceContent. In a specific embodiment, the identifier attribute may be idRef.
  • In addition, the component fragment may include a component reference element indicating a component fragment of a higher component including the component. The component reference element may have, as an identifier attribute, an identifier for identifying the component fragment representing the referenced component. In a specific embodiment, the component reference element may be ReferenceComponent. In a specific embodiment, the identifier attribute may be idRef.
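  • A minimal sketch of the component fragment described so far follows. The attribute and element names match the specific embodiments above, while the identifier values and the date-time format are assumptions.
    <Component id="bcast://lge.com/Component/1" version="1"
               validFrom="2015-01-01T00:00:00Z" validTo="2015-12-31T23:59:59Z">
      <!-- Service, content and higher component that include this component -->
      <ReferenceService idRef="bcast://lge.com/Service/1"/>
      <ReferenceContent idRef="bcast://lge.com/Content/1"/>
      <ReferenceComponent idRef="bcast://lge.com/Component/0"/>
    </Component>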
  • Furthermore, the component fragment may include a component type element indicating a component type. Since one component may simultaneously include multiple properties, one component fragment may include a plurality of component type elements.
  • As described above, the component type element may represent various types of components.
  • Specifically, the component type element may represent a continuous component. The continuous component is a component presented in a continuous stream. For example, the continuous component may be one of audio, video and closed captioning. In a specific embodiment, the component type element may have a value of 1 when the component type element indicates a continuous component. In addition, the component type element may represent an elementary component. In a specific embodiment, the component type element may have a value of 2 when the component type element indicates an elementary component. The component type element may represent a composite component. In a specific embodiment, the component type element may have a value of 3 when the component type element indicates a composite component. Furthermore, the component type element may represent a PickOne component. In a specific embodiment, the component type element may have a value of 4 when the component type element indicates a PickOne component. The component type element may represent a complex component. In a specific embodiment, the component type element may have a value of 5 when the component type element indicates a complex component. The component type element may represent a presentable video component. In a specific embodiment, the component type element may have a value of 6 when the component type element indicates a presentable video component. The component type element may represent a presentable audio component. In a specific embodiment, the component type element may have a value of 7 when the component type element indicates a presentable audio component. The component type element may represent a presentable closed captioning component. In a specific embodiment, the component type element may have a value of 8 when the component type element indicates a presentable closed captioning component. The component type element may represent that a component is an NRT file. In a specific embodiment, the component type element may have a value of 9 when the component type element indicates that a component is an NRT file. The component type element may represent that a component is NRT content. In a specific embodiment, the component type element may have a value of 10 when the component type element indicates that a component is NRT content. The component type element may represent that a component is an application. An application may be a set of documents constituting an additional service or an interactive service. In a specific embodiment, a document may include at least one of HTML, JavaScript, CSS, XML and multimedia files. An application may be regarded as an NRT content item. In a specific embodiment, the component type element may have a value of 11 when the component type element indicates that a component is an application. The component type element may represent that a component is an ATSC 3.0 application. An ATSC 3.0 application may refer to an application executed in environments conforming to the ATSC 3.0 specification. In a specific embodiment, the component type element may have a value of 12 when the component type element indicates that a component is an ATSC 3.0 application. The component type element may indicate that a component is an on demand component. An on demand component represents a component that is delivered on demand. When the component type element indicates that a component is an on demand component in a specific embodiment, the component type element may have a value of 13.
The component type element may indicate that a component is a notification stream. A notification stream delivers notifications to synchronize actions of applications with an underlying linear time base. When the component type element indicates that a component is a notification stream in a specific embodiment, the component type element may have a value of 14. The component type element may indicate that a component is an App-based enhancement. When the component type element indicates that a component is an App-based enhancement in a specific embodiment, the component type element may have a value of 15.
  • In a specific embodiment, the component type element may be referred to as ComponentType.
  • In addition, the component fragment may include a component role element specifying the role of the component. The value of the component role element may be an integer. Since one component may simultaneously execute various functions, one component fragment may include a plurality of component role elements.
  • The component role element may represent a default video when the component is a presentable video component. Here, the component role element may have a value of 1. In addition, the component role element may represent an alternative camera view when the component is a presentable video component. Here, the component role element may have a value of 2. Furthermore, the component role element may represent an alternative video component when the component is a presentable video component. Here, the component role element may have a value of 3. The component role element may represent a sign language inset when the component is a presentable video component. Here, the component role element may have a value of 4. The component role element may represent a follow subject video when the component is a presentable video component. Here, the component role element may have a value of 5. When the component is a composite video component, the component role element may represent a base layer for scalable video encoding. Here, the component role element may have a value of 6. The component role element may represent an enhancement layer for scalable video encoding when the component is a composite video component. Here, the component role element may have a value of 7. The component role element may represent a 3D video left view when the component is a composite video component. Here, the component role element may have a value of 8. The component role element may represent a 3D video right view when the component is a composite video component. Here, the component role element may have a value of 9. The component role element may represent 3D video depth information when the component is a composite video component. Here, the component role element may have a value of 10. The component role element may represent that a media component is a video at a specific position of a picture divided into a plurality of regions when the component is a composite video component. Here, the component role element may have a value of 11. The component role element may represent follow-subject metadata when the component is a composite video component. Here, the component role element may have a value of 12. The follow-subject metadata may include at least one of the name, position and size of the corresponding subject. When the follow-subject function is supported by metadata in units of frames of a stream, the follow-subject metadata may represent a main video component region on which the subject is focused.
  • When the component is a presentable audio component, the component role element may represent that the component is complete main. Here, the component role element may have a value of 13. When the component is a presentable audio component, the component role element may represent that the component is music. Here, the component role element may have a value of 14. When the component is a presentable audio component, the component role element may represent that the component is a dialog. Here, the component role element may have a value of 15. When the component is a presentable audio component, the component role element may represent that the component is effects. Here, the component role element may have a value of 16. When the component is a presentable audio component, the component role element may represent that the component is for the visually impaired. Here, the component role element may have a value of 17. When the component is a presentable audio component, the component role element may represent that the component is for the hearing impaired. Here, the component role element may have a value of 18. When the component is a presentable audio component, the component role element may represent that the component is a commentary. Here, the component role element may have a value of 19.
  • When the component is a presentable closed captioning component, the component role element may represent that the component is a normal closed captioning. Here, the component role element may have a value of 20. When the component is a presentable closed captioning component, the component role element may represent that the component is an easy-reader closed captioning for kindergarteners and elementary school students. Here, the component role element may have a value of 21.
  • In addition, the component fragment may include an extension element that may be reserved for future extension. In a specific embodiment, the extension element may be referred to as PrivateExt.
  • Furthermore, the component fragment may include a proprietary element for use of a specific application. In a specific embodiment, the proprietary element may be referred to as ProprietaryElements.
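  • Continuing the sketch, the type and role of a presentable video component may be expressed as follows. The numeric values follow the enumerations above; the surrounding structure and identifier are assumptions.
    <Component id="bcast://lge.com/Component/2">
      <ComponentType>1</ComponentType> <!-- continuous component -->
      <ComponentType>6</ComponentType> <!-- presentable video component (one fragment may carry several types) -->
      <ComponentRole>1</ComponentRole> <!-- default video -->
      <PrivateExt>
        <ProprietaryElements/>
      </PrivateExt>
    </Component>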
  • FIG. 67 shows an XML format of the component fragment according to another embodiment of the present invention.
  • The component fragment in the embodiment shown in FIG. 67 includes the elements and attributes described with reference to FIGS. 64 to 66.
  • FIG. 68 shows an XML format of ComponentTypeRangeType included in the component fragment according to another embodiment of the present invention.
  • The component type element included in the component fragment may include an attribute specifying the range of values that the component type element may have. In a specific embodiment, the attribute specifying the range of values that the component type element may have may be referred to as ComponentTypeRangeType.
  • FIG. 69 shows an XML format of ComponentRoleRangeType included in the component fragment according to another embodiment of the present invention.
  • The component role element included in the component fragment may include an attribute specifying the range of values that the component role element may have. In a specific embodiment, the attribute specifying the range of values that the component role element may have may be referred to as ComponentRoleRangeType.
  • FIG. 70 shows a relationship among the component fragment according to another embodiment and a composite video component using scalable video encoding and components included in the composite video component.
  • The component fragment in the embodiment shown in FIG. 70 represents that a composite component having an identifier of bcast://lge.com/Component/1 includes an elementary component having an identifier of bcast://lge.com/Component/2 and an elementary component having an identifier of bcast://lge.com/Component/3. In addition, the component role element of the component fragment represents that the elementary component having the identifier of bcast://lge.com/Component/2 is a base layer for scalable video encoding and the elementary component having the identifier of bcast://lge.com/Component/3 is an enhancement layer for scalable video encoding.
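  • The relationship of FIG. 70 may be sketched as follows. The numeric type and role values follow the enumerations above; the enclosing structure is an assumption.
    <!-- Composite component for scalable video encoding -->
    <Component id="bcast://lge.com/Component/1">
      <ComponentType>3</ComponentType> <!-- composite component -->
      <ComponentReference idRef="bcast://lge.com/Component/2"/>
      <ComponentReference idRef="bcast://lge.com/Component/3"/>
    </Component>
    <Component id="bcast://lge.com/Component/2">
      <ComponentType>2</ComponentType> <!-- elementary component -->
      <ComponentRole>6</ComponentRole> <!-- base layer for scalable video encoding -->
    </Component>
    <Component id="bcast://lge.com/Component/3">
      <ComponentType>2</ComponentType>
      <ComponentRole>7</ComponentRole> <!-- enhancement layer for scalable video encoding -->
    </Component>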
  • FIG. 71 shows a relationship among the component fragment according to another embodiment and a composite video component including a 3D video and components included in the composite video component.
  • The component fragment in the embodiment shown in FIG. 71 represents that a composite component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a PickOne component having an identifier of bcast://lge.com/Component/3. In addition, the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is a 3D video left view and the PickOne component having the identifier of bcast://lge.com/Component/3 is a 3D video right view.
  • FIG. 72 shows a relationship among the component fragment according to another embodiment and a PickOne audio component and components included in the PickOne audio component.
  • The component fragment in the embodiment shown in FIG. 72 represents that a PickOne component having an identifier of bcast://lge.com/Component/1 includes a PickOne component having an identifier of bcast://lge.com/Component/2 and a composite component having an identifier of bcast://lge.com/Component/3, which are alternatives to each other. In addition, the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/2 is complete main. Furthermore, the component fragment represents that the composite component having the identifier of bcast://lge.com/Component/3 includes a PickOne component having the identifier of bcast://lge.com/Component/4 and a PickOne component having the identifier of bcast://lge.com/Component/5. Here, the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/4 is music. In addition, the component role element of the component fragment represents that the PickOne component having the identifier of bcast://lge.com/Component/5 is a dialog.
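  • The relationship of FIG. 72 may be sketched as follows. The numeric values follow the enumerations above; the enclosing structure is an assumption.
    <!-- PickOne audio component with two alternatives -->
    <Component id="bcast://lge.com/Component/1">
      <ComponentType>4</ComponentType> <!-- PickOne component -->
      <ComponentReference idRef="bcast://lge.com/Component/2"/> <!-- complete main (role 13) -->
      <ComponentReference idRef="bcast://lge.com/Component/3"/> <!-- composite alternative -->
    </Component>
    <Component id="bcast://lge.com/Component/3">
      <ComponentType>3</ComponentType> <!-- composite component -->
      <ComponentReference idRef="bcast://lge.com/Component/4"/> <!-- music (role 14) -->
      <ComponentReference idRef="bcast://lge.com/Component/5"/> <!-- dialog (role 15) -->
    </Component>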
  • As described above, ESG data may include information about a component as lower elements of the content fragment. This will be described with reference to the attached drawings.
  • FIGS. 73 to 75 show a syntax of a component element according to another embodiment of the present invention.
  • A content fragment includes an extension element for extensibility. The content fragment may include a component element that represents information about a component included in content as a lower element of the extension element.
  • Specifically, the component element may include a component type element representing a component type. The component type element may represent a presentable video component. Here, the component type element may have a value of 1. The component type element may represent a presentable audio component. Here, the component type element may have a value of 2. The component type element may represent a presentable closed captioning component. Here, the component type element may have a value of 3. The component type element may represent an App-based enhancement component. Here, the component type element may have a value of 4. In a specific embodiment, the component type element may be referred to as ComponentType. In addition, the component type element may include an element specifying the range of values that the component type element may have. Here, the element specifying the range of values that the component type element may have may be referred to as ComponentTypeRangeType. The number of component types represented by the component type element may be less than the number of component types indicated by the aforementioned component fragment because properties of content including components are indicated by the content fragment and thus there is no need to describe the properties again in the component element.
  • The component element may include a component role element representing the role of a component.
  • The component role element may represent a default video when the component is a presentable video component. Here, the default video may refer to a primary video. In this case, the component role element may have a value of 1. The component role element may represent an alternative camera view when the component is a presentable video component. Here, the component role element may have a value of 2. The component role element may represent an alternative video component when the component is a presentable video component. Here, the component role element may have a value of 3. The component role element may represent a sign language inset when the component is a presentable video component. Here, the component role element may have a value of 4. The component role element may represent a follow subject video when the component is a presentable video component. Here, the component role element may have a value of 5.
  • When the component is a presentable audio component, the component role element may represent that the component is complete main. Here, the component role element may have a value of 6. When the component is a presentable audio component, the component role element may represent that the component is music. Here, the component role element may have a value of 7. When the component is a presentable audio component, the component role element may represent that the component is a dialog. Here, the component role element may have a value of 8. When the component is a presentable audio component, the component role element may represent that the component is effects. Here, the component role element may have a value of 9. When the component is a presentable audio component, the component role element may represent that the component is for the visually impaired. Here, the component role element may have a value of 10. When the component is a presentable audio component, the component role element may represent that the component is for the hearing impaired. Here, the component role element may have a value of 11. When the component is a presentable audio component, the component role element may represent that the component is a commentary. Here, the component role element may have a value of 12.
  • When the component is a presentable closed captioning component, the component role element may represent that the component is a normal closed captioning. Here, the component role element may have a value of 13. When the component is a presentable closed captioning component, the component role element may represent that the component is an easy-reader closed captioning for kindergarteners and elementary school students. Here, the component role element may have a value of 14.
  • When the component is an App-based enhancement component, the component role element may represent that the component is an application. Here, the component role element may have a value of 15. When the component is an App-based enhancement component, the component role element may represent that the component is an NRT content item. Here, the component role element may have a value of 16. When the component is an App-based enhancement component, the component role element may represent that the component is an on demand component. Here, the component role element may have a value of 17. When the component is an App-based enhancement component, the component role element may represent that the component is a notification stream component. Here, the component role element may have a value of 18. A notification stream delivers notifications to synchronize actions of applications with an underlying linear time base. When the component is an App-based enhancement component, the component role element may represent that the component is a start-over component. The start-over component refers to an application component that provides a function of viewing content from the beginning after the content starts to be broadcast at the request of a user. Here, the component role element may have a value of 19. When the component is an App-based enhancement component, the component role element may represent that the component is a companion screen component that may be consumed in the companion apparatus 300 interoperating with the broadcast receiving apparatus 100. Here, the component role element may have a value of 20. In a specific embodiment, the component role element may be referred to as ComponentRole. In addition, the component role element may include an element specifying the range of values that the component role element may have. Here, the element specifying the range of values that the component role element may have may be referred to as ComponentRoleRangeType.
  • The component element may include a start time element representing a display start time of a component. In a specific embodiment, the start time element may be referred to as StartTime.
  • The component element may include an end time element representing a display end time of a component. In a specific embodiment, the end time element may be referred to as EndTime.
  • The component element may include a language element indicating a component description language. In a specific embodiment, the language element may be referred to as Language.
  • The component element may include a session description language element indicating a session description language when a component is delivered according to a session based transport protocol. In a specific embodiment, the session description language element may be referred to as languageSDP.
  • The component element may include an element indicating a component display duration. In a specific embodiment, the duration element may be referred to as Length.
  • The component element may include an element indicating a parental rating of a component. In a specific embodiment, the parental rating element may be referred to as ParentalRating.
  • The component element may include a capability element indicating the capability of the broadcast receiving apparatus 100 necessary to present components. Specifically, the capability element may represent that a broadband connection is necessary for component presentation. Here, the capability element may have a value of 1.
  • In the case of a presentable video component, the capability element may represent that a standard definition (SD) video presentation capability is needed. Here, the capability element may be 2. In the case of a presentable video component, the capability element may represent that a high definition (HD) video presentation capability is needed. Here, the capability element may be 3. In the case of a presentable video component, the capability element may represent that a video presentation capability of ultra-high definition (UHD) of 4K or higher is needed. Here, the capability element may be 4. In the case of a presentable video component, the capability element may represent that a video presentation capability corresponding to definition of 8K or higher is needed. Here, the capability element may be 5. In the case of a presentable video component, the capability element may represent that a capability for presenting 3D video is needed. Here, the capability element may be 6. In the case of a presentable video component, the capability element may represent that a capability for presenting high dynamic range imaging video is needed. Here, the capability element may be 7. In the case of a presentable video component, the capability element may represent that a capability for presenting wide color gamut video is needed. Here, the capability element may be 8.
  • In the case of a presentable audio component, the capability element may represent that 2.0 channel audio needs to be presented. Here, the capability element may be 9. In the case of a presentable audio component, the capability element may represent that 2.1 channel audio needs to be presented. Here, the capability element may be 10. In the case of a presentable audio component, the capability element may represent that 5.1 channel audio needs to be presented. Here, the capability element may be 11. In the case of a presentable audio component, the capability element may represent that 6.1 channel audio needs to be presented. Here, the capability element may be 12. In the case of a presentable audio component, the capability element may represent that 7.1 channel audio needs to be presented. Here, the capability element may be 13. In the case of a presentable audio component, the capability element may represent that 21.1 channel audio needs to be presented. Here, the capability element may be 14. In the case of a presentable audio component, the capability element may represent that 3D audio needs to be presented. Here, the capability element may be 15. In the case of a presentable audio component, the capability element may represent that a dialog level adjustment is needed. Here, the capability element may be 16.
  • In addition, the capability element may represent that a magic remote control input needs to be received for component presentation. Here, the capability element may be 17. The capability element may represent that a touchscreen input needs to be received for component presentation. Here, the capability element may be 18. The capability element may represent that a mouse input needs to be received for component presentation. Here, the capability element may be 19. Furthermore, the capability element may represent that a keyboard input needs to be received for component presentation. Here, the capability element may be 20.
  • Furthermore, the capability element may represent that application rendering is needed. Here, the capability element may be 21.
  • In a specific embodiment, the capability element may be referred to as Device Capability. In addition, the capability element may include an element specifying the range of values that the capability element may have. Here, the element specifying the range of values that the capability element may have may be referred to as DeviceCapabilityRangeType.
  • The component element may include a target device element indicating a target device of a component. The target device element may represent a component provided through the screen of the broadcast receiving apparatus 100 that is a primary apparatus receiving broadcast services. Here, the target device element may be 1. The target device element may represent a component provided through the screen of the companion apparatus 300 interoperating with the broadcast receiving apparatus 100 that is a primary apparatus receiving broadcast services. Here, the target device element may be 2. The target device element may represent a component provided through part of the screen of broadcast receiving apparatus 100 that is a primary apparatus receiving broadcast services. Here, the target device element may be 3. In a specific embodiment, the target device element may be referred to as TargetDevice. In addition, the target device element may include an element specifying the range of values that the target device element may have. The element specifying the range of values that the target device element may have may be referred to as TargetDeviceRangeType.
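  • A minimal sketch of such a component element inside the extension element of a content fragment follows. The element names match the specific embodiments above, while the value formats (times, duration, rating) are assumptions.
    <Content id="bcast://lge.com/Content/1">
      <PrivateExt>
        <Component>
          <ComponentType>2</ComponentType>        <!-- presentable audio component -->
          <ComponentRole>8</ComponentRole>        <!-- dialog -->
          <StartTime>2015-01-01T20:00:00Z</StartTime>
          <EndTime>2015-01-01T21:00:00Z</EndTime>
          <Language>en</Language>
          <Length>PT1H</Length>
          <ParentalRating>TV-G</ParentalRating>
          <DeviceCapability>11</DeviceCapability> <!-- 5.1 channel audio presentation -->
          <TargetDevice>1</TargetDevice>          <!-- screen of the broadcast receiving apparatus -->
        </Component>
      </PrivateExt>
    </Content>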
  • FIGS. 76 to 80 show an XML format of the aforementioned component element.
  • The broadcast receiving apparatus 100 may display component information based on the component element. Specifically, the broadcast receiving apparatus 100 may display component information included in content based on the capability element included in the component element. This will be described in detail with reference to the attached drawings.
  • FIGS. 81 to 83 illustrate another embodiment of the present invention, in which the broadcast receiving apparatus displays components included in content based on the capability element included in the component element.
  • The broadcast receiving apparatus 100 may display component information included in content based on the capability element included in the component element. Specifically, the broadcast receiving apparatus 100 may display a component that may be presented by the broadcast receiving apparatus 100 and a component that may not be presented by the broadcast receiving apparatus 100 such that the two are discriminated from each other based on the capability element included in the component element. For example, the broadcast receiving apparatus 100 may display a component that may be presented and a component that may not be presented in different colors based on the capability element. For example, the broadcast receiving apparatus 100 may display a component that may be presented on a white background and a component that may not be presented on a gray background. In other specific embodiments, the broadcast receiving apparatus 100 may display the two kinds of components through different icons, as different texts, or using different graphical symbols based on the capability element. The broadcast receiving apparatus 100 may display component information in a service guide menu, as described above. In addition, the broadcast receiving apparatus 100 may display component information in a service list menu. Furthermore, the broadcast receiving apparatus 100 may display component information in a bar-shaped menu positioned at the lower or upper part of a screen on which content is presented.
  • In the embodiment shown in FIG. 81, content includes a component that requires HD video presentation and a component that requires 5.1 channel audio presentation. When the broadcast receiving apparatus 100 supports HD video presentation and 5.1 channel audio presentation, the broadcast receiving apparatus 100 may display the component information without graying it out, as shown in FIG. 81(b). When the broadcast receiving apparatus 100 does not support 5.1 channel audio presentation, the broadcast receiving apparatus 100 may gray out the audio component information that requires 5.1 channel presentation, as shown in FIG. 81(c).
  • In the embodiment shown in FIG. 82, content includes a component that requires UHD video presentation, an audio component including English and an audio component including Spanish that requires a broadband connection. When the broadcast receiving apparatus 100 does not provide broadband connectivity, the broadcast receiving apparatus 100 may gray out the audio component including Spanish, as shown in FIG. 82(b).
  • In the embodiment shown in FIG. 83, content includes a component that requires UHD video presentation, an audio component that requires 5.1 channel presentation and a video component that requires a wide color gamut. When the broadcast receiving apparatus 100 does not support the wide color gamut, the broadcast receiving apparatus 100 may gray out the video component that requires the wide color gamut, as shown in FIG. 83(c).
  • When the broadcast receiving apparatus 100 displays component information based on the capability element included in the component element, a user may select content depending on whether the broadcast receiving apparatus 100 may present the content. Specifically, the user may recognize in advance that the broadcast receiving apparatus 100 may not present the corresponding content. Accordingly, the user may be prevented from waiting for content that may not be presented by the broadcast receiving apparatus 100. This function may enhance user convenience considering that broadcast content types are diversified and high device capabilities are required in hybrid broadcast.
  • FIGS. 84 and 85 show a syntax of a component type according to another embodiment of the present invention.
  • As described above, the component element may include information indicating a component type. Particularly, the component element may include the information indicating a component type in the form of an element. However, when a component type may be represented as one value, the component element need not include the information representing the component type in the form of an element. Accordingly, the component element may include the information representing a component type as a component type attribute. Specifically, the component type attribute may represent a presentable video component. Here, the component type attribute may have a value of 1. The component type attribute may represent a presentable audio component. Here, the component type attribute may be 2. The component type attribute may represent a presentable closed captioning component. Here, the component type attribute may be 3. The component type attribute may represent an App-based enhancement component. Here, the component type attribute may be 4.
  • As described above, the component role element may have an integer value. However, the component role element may have a string value that indicates the role of a component for convenient future data addition. In this case, the broadcast receiving apparatus 100 may display the string value of the component role element. Specifically, when a component is a presentable video component, the component role element may represent that the component is a default video. The default video may refer to a primary video. In this case, the component role element may have a value of “primary video.” When a component is a presentable video component, the component role element may represent that the component is an alternative camera view. Here, the component role element may have a value of “Alternative camera view.” When a component is a presentable video component, the component role element may represent that the component is an alternative video component. Here, the component role element may have a value of “Other alternative video component.” When a component is a presentable video component, the component role element may represent that the component is a sign language inset. Here, the component role element may have a value of “Sign language inset.” When a component is a presentable video component, the component role element may represent that the component is a follow subject video. Here, the component role element may have a value of “Follow subject video.”
  • When a component is a presentable audio component, the component role element may represent that the component is complete main. Here, the component role element may have a value of “Complete main.” When a component is a presentable audio component, the component role element may represent that the component is music. Here, the component role element may have a value of “Music.” When a component is a presentable audio component, the component role element may represent that the component is a dialog. Here, the component role element may have a value of “Dialog.” When a component is a presentable audio component, the component role element may represent that the component is effects. Here, the component role element may have a value of “Effects.” When a component is a presentable audio component, the component role element may represent that the component is for the visually impaired. Here, the component role element may have a value of “Visually impaired.” When a component is a presentable audio component, the component role element may represent that the component is for the hearing impaired. Here, the component role element may have a value of “Hearing impaired.” When a component is a presentable audio component, the component role element may represent that the component is a commentary. Here, the component role element may have a value of “Commentary.”
  • When a component is a presentable closed captioning component, the component role element may represent that the component is a normal closed captioning. Here, the component role element may have a value of “Normal.” When a component is a presentable closed captioning component, the component role element may represent that the component is an easy-reader closed captioning for kindergarteners and elementary school students. Here, the component role element may have a value of “Easy reader.”
  • When a component is an App-based enhancement component, the component role element may represent that the component is an application. Here, the component role element may have a value of “Application.” When a component is an App-based enhancement component, the component role element may represent that the component is an NRT content item. Here, the component role element may have a value of “NRT content item.” When a component is an App-based enhancement component, the component role element may represent that the component is an on demand component. Here, the component role element may have a value of “On demand.” When a component is an App-based enhancement component, the component role element may represent that the component is a notification stream component. Here, the component role element may have a value of “Notification Stream.” A notification stream delivers notifications to synchronize actions of applications with an underlying linear time base. When a component is an App-based enhancement component, the component role element may represent that the component is a start-over component. The start-over component refers to an application component that provides a function of viewing content from the beginning after the content starts to be broadcast at the request of a user. Here, the component role element may have a value of “Start-over.” When a component is an App-based enhancement component, the component role element may represent that the component is a companion screen component that may be consumed in the companion apparatus 300 interoperating with the broadcast receiving apparatus 100. Here, the component role element may have a value of “Companion-Screen.”
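  • A minimal sketch combining the component type attribute with a string-valued component role follows; the attribute name componentType is an assumption made only for illustration.
    <Component componentType="2"> <!-- 2: presentable audio component -->
      <ComponentRole>Visually impaired</ComponentRole>
    </Component>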
  • FIG. 86 shows an XML format of the aforementioned component element.
  • The aforementioned component element includes information about components without discrimination of component types. When the component element includes component information classified depending on component types, redundant use of lower elements and attributes may be reduced. This will be described with reference to the attached drawings.
  • FIGS. 87 and 88 show a syntax of a component element according to another embodiment of the present invention.
  • The component element may include lower elements indicating component information, which are classified according to component types. The component element may include component role elements indicating component roles, which are classified depending on component types.
  • Specifically, the component element may include a presentable video component element that indicates the role of a presentable video component. Here, the presentable video component element may represent that the presentable video component is a default video component. The default video may refer to a primary video. In this case, the presentable video component element may have a value of “primary video.” The presentable video component element may represent that the presentable video component is an alternative camera view. Here, the presentable video component element may have a value of “Alternative camera view.” The presentable video component element may represent that the presentable video component is an alternative video component. Here, the presentable video component element may have a value of “Other alternative video component.” The presentable video component element may represent that the presentable video component is a sign language inset. Here, the presentable video component element may have a value of “Sign language inset.” The presentable video component element may represent that the presentable video component is a follow subject video. Here, the presentable video component element may have a value of “Follow subject video.”
  • Specifically, the component element may include a presentable audio component element that indicates the role of a presentable audio component. Here, the presentable audio component element may represent that the presentable audio component is complete main. Here, the presentable audio component element may have a value of “Complete main.” The presentable audio component element may represent that the presentable audio component is music. Here, the presentable audio component element may have a value of “Music.” The presentable audio component element may represent that the presentable audio component is a dialog. Here, the presentable audio component element may have a value of “Dialog.” The presentable audio component element may represent that the presentable audio component is effects. Here, the presentable audio component element may have a value of “Effects.” The presentable audio component element may represent that the presentable audio component is for the visually impaired. Here, the presentable audio component element may have a value of “Visually impaired.” The presentable audio component element may represent that the presentable audio component is for the hearing impaired. Here, the presentable audio component element may have a value of “Hearing impaired.” The presentable audio component element may represent that the presentable audio component is a commentary. Here, the presentable audio component element may have a value of “Commentary.”
  • Specifically, the component element may include a presentable closed captioning component element that indicates the role of a presentable closed captioning component. The presentable closed captioning component element may represent that the presentable closed captioning component is a normal closed captioning. Here, the presentable closed captioning component element may have a value of “Normal.” The presentable closed captioning component element may represent that the presentable closed captioning component is an easy-reader closed captioning for kindergarteners and elementary school students. Here, the presentable closed captioning component element may have a value of “Easy reader.”
  • Specifically, the component element may include a presentable App component element that indicates the role of a presentable App-based enhancement component. The presentable App component element may represent that the presentable App-based enhancement component is an application. Here, the presentable App component element may have a value of “Application.” The presentable App component element may represent that the presentable App-based enhancement component is an NRT content item. Here, the presentable App component element may have a value of “NRT content item.” The presentable App component element may represent that the presentable App-based enhancement component is an on demand component. Here, the presentable App component element may have a value of “On demand.” The presentable App component element may represent that the presentable App-based enhancement component is a notification stream component. Here, the presentable App component element may have a value of “Notification Stream.” The presentable App component element may represent that the presentable App-based enhancement component is a start-over component. The start-over component refers to an application component that provides a function of viewing content from the beginning after the content starts to be broadcast at the request of a user. Here, the presentable App component element may have a value of “Start-over.” The presentable App component element may represent that the presentable App-based enhancement component is a companion screen component that may be consumed in the companion apparatus 300 interoperating with the broadcast receiving apparatus 100. Here, the presentable App component element may have a value of “Companion-Screen.”
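  • A minimal sketch of the classified component elements follows. The element names are assumptions chosen only to mirror the description above; the string values follow the roles listed for each element.
    <Component>
      <PresentableVideoComponent>primary video</PresentableVideoComponent>
      <PresentableAudioComponent>Complete main</PresentableAudioComponent>
      <PresentableCCComponent>Normal</PresentableCCComponent>
      <PresentableAppComponent>On demand</PresentableAppComponent>
    </Component>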
  • FIG. 89 shows an XML format of the aforementioned component element and of the presentable video component element, presentable audio component element, presentable closed captioning component element and presentable App component element included in the component element.
  • As described above, the component element may include information indicating component types as attributes. In addition, the component element may include an element value indicating the role of a component as a string, thereby facilitating future addition of roles of components. Furthermore, the broadcast receiving apparatus 100 may display an element value indicating the role of a component to a user.
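  • As a non-normative illustration (the element and attribute names below are hypothetical placeholders, not the actual schema of FIG. 89), the following Python sketch shows how a receiver might read such a role string from a component element and display it as-is:

      import xml.etree.ElementTree as ET

      # Hypothetical component element carrying its role as a plain string.
      COMPONENT_XML = """
      <Component componentType="audio">
        <ComponentRole>Visually impaired</ComponentRole>
      </Component>
      """

      component = ET.fromstring(COMPONENT_XML)
      role = component.find("ComponentRole")
      if role is not None:
          # Because the role travels as a free-form string, new roles can be
          # added later without a schema change, and the string can be shown
          # to the user directly.
          print(component.get("componentType"), "component role:", role.text)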
  • Component information included in ESG data may contain information indicating a device capability necessary to present components. Specifically, the aforementioned component fragment may include information indicating a device capability necessary to present a component as an element. In addition, the aforementioned component element may include information indicating a device capability necessary to present a component as an element. This will be described with reference to the attached drawings.
  • FIG. 90 shows a syntax of a capability element according to another embodiment of the present invention.
  • The capability element may include a capability code that is a value indicating a capability. In a specific embodiment, the value indicating a capability may be referred to as CapabilityCodes. Meanings indicated by the capability code will be described with reference to FIG. 91.
  • FIG. 91 shows values of the capability code element included in the capability element according to another embodiment of the present invention.
  • The capability code may represent that a broadband connection is required for component presentation. Specifically, the capability code may represent that download through broadband is required for component presentation. In this case, the capability code may have a value of 0x02.
  • The capability code may represent a device capability necessary for video rendering. Specifically, the capability code may represent that SD video presentation is required for component presentation. In this case, the capability code may have a value of 0x80. Specifically, the capability code may represent that HD video presentation is required for component presentation. In this case, the capability code may have a value of 0x81. Specifically, the capability code may represent that UHD video presentation is required for component presentation. In this case, the capability code may have a value of 0x82. Specifically, the capability code may represent that presentation of E-UHD video of 8K or higher is required for component presentation. In this case, the capability code may have a value of 0x83. Specifically, the capability code may represent that presentation of 3D video is required for component presentation. In this case, the capability code may have a value of 0x90. Specifically, the capability code may represent that presentation of high dynamic range video is required for component presentation. In this case, the capability code may have a value of 0x91. Specifically, the capability code may represent that presentation of wide color gamut video is required for component presentation. In this case, the capability code may have a value of 0x92.
  • In addition, the capability code may represent a device capability required for audio rendering. Specifically, the capability code may represent that presentation of stereo (2-channel) audio is required for component presentation. In this case, the capability code may have a value of 0xA0. Specifically, the capability code may represent that presentation of 5.1 channel audio is required for component presentation. In this case, the capability code may have a value of 0xA1. Specifically, the capability code may represent that presentation of 3D audio is required for component presentation. In this case, the capability code may have a value of 0xA2. Specifically, the capability code may represent that dialog level adjustment is required for component presentation. In this case, the capability code may have a value of 0xB1.
  • Furthermore, the capability code may represent a device capability required for application rendering. Specifically, the capability code may represent that a personal video recorder (PVR) function is required for component presentation. Here, the capability code may have a value of 0xC0. Specifically, the capability code may represent that a download function is required for component presentation. Specifically, the download function may refer to downloading to a persistent storage. Here, the capability code may have a value of 0xC1. Specifically, the capability code may represent that a DRM processing function is required for component presentation. Here, the capability code may have a value of 0xC2. Specifically, the capability code may represent that a conditional access (CA) processing function is required for component presentation. Here, the capability code may have a value of 0xC3.
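  • The capability code values described above can be collected into a simple lookup table. The sketch below is a minimal illustration of such a table (not receiver source code), using only the code values and meanings of FIG. 91 as described above:

      # Capability codes of FIG. 91, as described above.
      CAPABILITY_CODES = {
          0x02: "Download through broadband",
          0x80: "SD video rendering",
          0x81: "HD video rendering",
          0x82: "UHD video rendering",
          0x83: "E-UHD (8K or higher) video rendering",
          0x90: "3D video rendering",
          0x91: "High dynamic range video rendering",
          0x92: "Wide color gamut video rendering",
          0xA0: "Stereo (2-channel) audio rendering",
          0xA1: "5.1-channel audio rendering",
          0xA2: "3D audio rendering",
          0xB1: "Dialog level adjustment",
          0xC0: "Personal video recorder (PVR) function",
          0xC1: "Download to persistent storage",
          0xC2: "DRM processing",
          0xC3: "Conditional access (CA) processing",
      }

      def describe_capability(code):
          return CAPABILITY_CODES.get(code, "Unknown capability 0x%02X" % code)

      print(describe_capability(0x81))   # "HD video rendering"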
  • The capability element will be described with reference to FIG. 90.
  • The capability element may include a capability string element indicating a string that represents a capability required for component presentation. The broadcast receiving apparatus 100 may display a capability required for component presentation based on the capability string element. Specifically, the broadcast receiving apparatus 100 may display a string indicated by the capability string element. In a specific embodiment, the capability string element may be referred to as CapabilityString.
  • The capability element may include a category element that indicates the category of the capability code. In a specific embodiment, the category element may be referred to as “category”. This will be described with reference to FIG. 92.
  • FIG. 92 shows values that may be represented by the category element of the capability element according to another embodiment of the present invention.
  • The category element may represent a download protocol required for component presentation. Here, the category element may have a value of 0x01. The category element may represent a forward error correction (FEC) algorithm required for component presentation. Here, the category element may have a value of 0x02. The category element may represent a wrapper/Archive format required for component presentation. Here, the category element may have a value of 0x03. The category element may represent a compression algorithm required for component presentation. Here, the category element may have a value of 0x04. The category element may represent a media type required for component presentation. Here, the category element may have a value of 0x05. The category element may represent a rendering capability required for component presentation. Here, the category element may have a value of 0x06.
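  • Summarized the same way, the category element values above give the following table (again a minimal, non-normative sketch built from the values of FIG. 92):

      # Category values of FIG. 92, as described above.
      CATEGORY_CODES = {
          0x01: "Download protocol",
          0x02: "Forward error correction (FEC) algorithm",
          0x03: "Wrapper/Archive format",
          0x04: "Compression algorithm",
          0x05: "Media type",
          0x06: "Rendering capability",
      }

      print(CATEGORY_CODES[0x06])   # "Rendering capability"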
  • In hybrid broadcast, various services including various types of components may be delivered. In addition, broadcast receiving apparatuses 100 have different presentation capabilities. Accordingly, the broadcast receiving apparatus 100 may display whether each component can be presented based on the capability element, as described above. This allows the user to select content based on the components included in services.
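  • A minimal sketch of such a presentability check follows, assuming the receiver keeps its own capabilities as a set of the codes above (the function name and data shapes are illustrative):

      def is_presentable(required_codes, device_codes):
          # A component can be presented only if every capability code in its
          # capability element is supported by the device.
          return set(required_codes) <= set(device_codes)

      device_codes = {0x80, 0x81, 0xA0}   # e.g., an HD receiver with stereo audio
      print(is_presentable({0x81, 0xA0}, device_codes))   # True  -> presentable
      print(is_presentable({0x82, 0xA1}, device_codes))   # False -> shown as unpresentable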
  • Broadcasters or content providers may sell content on a component-by-component basis. Specifically, broadcasters or content providers may separately sell some components included in a service. Specifically, broadcasters or content providers may sell components on a pay-per-view (PPV) basis. For example, broadcasters or content providers may provide base layer components of scalable video coding free of charge and charge for enhancement layer components. In addition, broadcasters or content providers may provide multi-view content while charging for the components of some views. Furthermore, broadcasters or content providers may charge for UHD components while providing HD components free of charge. Broadcasters or content providers may charge for stereo audio components. Broadcasters or content providers may provide audition-related content while charging for a vote application associated with that content. To this end, the broadcast transmitting apparatus 10 needs to transmit charging information per component. In addition, the broadcast receiving apparatus 100 needs to display the charging information per component and provide an interface through which each component may be purchased. This will be described with reference to FIGS. 93 and 94.
  • FIGS. 93 and 94 illustrate a user interface providing payment per component according to an embodiment of the present invention.
  • The broadcast transmitting apparatus 10 may include charging information on each component in the aforementioned component information and deliver the component information including the charging information. Specifically, the broadcast transmitting apparatus 10 may include charging information on each component in the aforementioned component element and deliver the component element. In addition, the broadcast transmitting apparatus 10 may include charging information on each component in the aforementioned component fragment and deliver the component fragment.
  • The broadcast receiving apparatus 100 may acquire the charging information on each component based on the component information. Specifically, the broadcast receiving apparatus 100 may acquire the charging information on each component based on the component fragment. In addition, the broadcast receiving apparatus 100 may acquire the charging information on each component based on the component element. The broadcast receiving apparatus 100 may display the charging information on each component in the service guide menu. Specifically, the broadcast receiving apparatus 100 may indicate, in the service guide menu, that a corresponding component needs to be purchased before it can be presented. Furthermore, the broadcast receiving apparatus 100 may display purchase conditions of the corresponding component in the service guide menu. The purchase conditions may include a price. The purchase conditions may include a presentable period. In the embodiment shown in FIG. 93, the broadcast receiving apparatus 100 displays that a component including an alternative view of broadcast content needs to be purchased before it can be presented.
  • In addition, the broadcast receiving apparatus 100 may display, while presenting content, that a component included in the content must be paid for before it can be presented. The broadcast receiving apparatus 100 may provide a user interface through which the component can be purchased. For example, the broadcast receiving apparatus 100 may display this information in the form of a message box while presenting the content. Furthermore, the broadcast receiving apparatus 100 may perform a procedure for purchasing the component based on user input. In the embodiment shown in FIG. 94, the broadcast receiving apparatus 100 displays that alternative views of a baseball game may be viewed upon payment. When a user input for the alternative views is received, the broadcast receiving apparatus 100 presents a component including the alternative views.
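  • As a hedged illustration (the charging element and its attributes below are hypothetical, since the exact syntax of the charging information is not fixed here), a component element might carry its purchase conditions like this, and the receiver might read them for display:

      import xml.etree.ElementTree as ET

      COMPONENT_XML = """
      <Component componentType="video">
        <ComponentRole>Alternative view</ComponentRole>
        <Charging price="1.99" currency="USD" presentablePeriod="PT24H"/>
      </Component>
      """

      component = ET.fromstring(COMPONENT_XML)
      charging = component.find("Charging")
      if charging is not None:
          # The receiver can show the price and the presentable period in the
          # service guide menu and offer a purchase procedure on user input.
          print("Purchase required: %s %s, viewable for %s" % (
              charging.get("price"), charging.get("currency"),
              charging.get("presentablePeriod")))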
  • FIG. 95 illustrates an operation of the broadcast transmitting apparatus 10 according to an embodiment of the present invention.
  • The broadcast transmitting apparatus 10 acquires information about components (S101). The broadcast transmitting apparatus 10 may acquire the information about the components through the controller. Specifically, the broadcast transmitting apparatus 10 may acquire at least one of component types, device capabilities required for presentation of the components, roles of components, relationships with other components, information about services including the components, information about content including the components, information on target devices of the components, information on target users of the components, information on valid periods of the components, display start time of the components, display end time of the components and parental ratings of the components.
  • The broadcast transmitting apparatus 10 generates ESG data based on the information about the components (S103). The broadcast transmitting apparatus 10 may generate the ESG data based on the information about the components through the controller. Specifically, the broadcast transmitting apparatus 10 may generate the aforementioned component fragment based on the components. In another specific embodiment, the broadcast transmitting apparatus 10 may generate the aforementioned component element based on the components.
  • The broadcast transmitting apparatus 10 transmits a broadcast signal based on the information about the components (S105). The broadcast transmitting apparatus 10 may transmit the broadcast signal based on the information about the components through the transmitting unit. Specifically, the broadcast transmitting apparatus 10 may transmit a broadcast signal including the ESG data. In another specific embodiment, the broadcast transmitting apparatus 10 may transmit a broadcast signal including ESG data signaling information for ESG data reception. Here, the content server 50 may separately transmit the ESG data through a broadband.
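  • A minimal end-to-end sketch of steps S101 to S105 follows; the function names and data shapes are placeholders for illustration, not the transmitter's actual interfaces:

      def describe_component(component):
          # S101: gather component type, role, required capabilities, etc.
          return {"type": component["type"], "role": component.get("role")}

      def build_esg(component_info):
          # S103: generate ESG data containing a component fragment (or a
          # component element within a content fragment) per component.
          return {"component_fragments": component_info}

      def transmit(components):
          esg = build_esg([describe_component(c) for c in components])
          # S105: the broadcast signal carries the ESG data itself, or only
          # ESG signaling information while the ESG data is delivered over
          # broadband by the content server.
          return {"esg": esg}

      print(transmit([{"type": "video", "role": "Alternative view"}]))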
  • FIG. 96 illustrates an operation of the broadcast receiving apparatus 100 according to an embodiment of the present invention.
  • The broadcast receiving apparatus 100 receives a broadcast signal (S201). The broadcast receiving apparatus 100 may receive the broadcast signal through the broadcast receiving unit 110.
  • The broadcast receiving apparatus 100 acquires ESG data based on the broadcast signal (S203). The broadcast receiving apparatus 100 may acquire the ESG data based on the broadcast signal through the controller 150. Specifically, the broadcast receiving apparatus 100 may acquire the ESG data from the broadcast signal. In another specific embodiment, the broadcast receiving apparatus 100 may extract the ESG data signaling information for ESG data reception from the broadcast signal. In addition, the broadcast receiving apparatus 100 may acquire the ESG data from the content server 50 based on the ESG data signaling information.
  • The broadcast receiving apparatus 100 acquires information about components based on the ESG data (S205). The broadcast receiving apparatus 100 may acquire the information about the components based on the ESG data through the controller 150. As described above, the information about the components may include at least one of component types, device capabilities required for presentation of the components, roles of components, relationships with other components, information about services including the components, information about content including the components, information on target devices of the components, information on target users of the components, information on valid periods of the components, display start time of the components, display end time of the components and parental ratings of the components.
  • The broadcast receiving apparatus 100 displays the information about the components (S207). The broadcast receiving apparatus 100 may display the information about the components through the controller 150. Specifically, the broadcast receiving apparatus 100 may display the information about the components in the service guide menu. For example, the broadcast receiving apparatus 100 may display the roles of components included in content in the service guide menu. In addition, the broadcast receiving apparatus 100 may display charging information of the components included in the content in the service guide menu. In another specific embodiment, the broadcast receiving apparatus 100 may display the information about the components on a content presentation screen. For example, the broadcast receiving apparatus 100 may display the roles of the components in a message box positioned at the lower or upper part of the content presentation screen. The broadcast receiving apparatus 100 may display the information about the components in a service list.
  • The broadcast receiving apparatus 100 receives a user input for a component (S209). The broadcast receiving apparatus 100 may receive a user input for a component through the controller 150. Specifically, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of a component. In another specific embodiment, the broadcast receiving apparatus 100 may receive a user input for reservation recording of a component. In another specific embodiment, the broadcast receiving apparatus 100 may receive a user input for viewing a component. In another specific embodiment, the broadcast receiving apparatus 100 may receive a user input for recording a component.
  • The broadcast receiving apparatus 100 presents a component based on the user input (S211). The broadcast receiving apparatus 100 may present a component based on the user input through the controller 150. Specifically, the broadcast receiving apparatus 100 may present a component corresponding to the user input. In a specific embodiment, the broadcast receiving apparatus 100 may present the component corresponding to the user input at a reservation viewing time. In another specific embodiment, the broadcast receiving apparatus 100 may immediately present the component corresponding to the user input. In another specific embodiment, the broadcast receiving apparatus 100 may immediately record the component corresponding to the user input. In another specific embodiment, the broadcast receiving apparatus 100 may record the component corresponding to the user input at a reservation recording time. Specific embodiments of the operation of the broadcast receiving apparatus 100 will be described with reference to the attached drawings.
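  • The receiving side mirrors this flow. The sketch below walks through steps S201 to S211 using the same placeholder data shapes as the transmitter sketch above:

      def receive(broadcast_signal, selection):
          esg = broadcast_signal["esg"]                 # S203: acquire ESG data
          components = esg["component_fragments"]       # S205: component information
          for i, c in enumerate(components):            # S207: display the information
              print(i, c["type"], "-", c["role"])
          chosen = components[selection]                # S209: user input for a component
          return "presenting " + chosen["role"]         # S211: present (or record) it

      signal = {"esg": {"component_fragments": [
          {"type": "video", "role": "Primary video"},
          {"type": "audio", "role": "Music"},
      ]}}
      print(receive(signal, 1))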
  • FIG. 97 illustrates a content presentation screen of the broadcast receiving apparatus according to an embodiment of the present invention.
  • As described above, a service or content of hybrid broadcast may include a plurality of components. The plurality of components may be presented simultaneously. FIG. 97 shows simultaneous presentation of a component including a movie, a component including a sign language inset displayed on part of the screen, and a component including a follow subject video.
  • FIG. 98 shows a service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • As described above, the broadcast receiving apparatus 100 may display information about components. Specifically, the broadcast receiving apparatus 100 may display information about components included in a service. Specifically, the broadcast receiving apparatus 100 may display information about components included in content. In a specific embodiment, the broadcast receiving apparatus 100 may display information about components in the service guide menu. For example, the broadcast receiving apparatus 100 may display the roles of components included in content, as shown in FIG. 98(b). The broadcast receiving apparatus 100 may display information about components in the service guide menu based on user input. For example, the broadcast receiving apparatus 100 may display the service guide menu without displaying component information, as shown in FIG. 98(a). Here, when a user input for component information display is received, the broadcast receiving apparatus 100 may display information about components in the service guide menu, as shown in FIG. 98(b). In a specific embodiment, the broadcast receiving apparatus 100 may display components based on the data type of the content. Specifically, the broadcast receiving apparatus 100 may display information about components including data of a type selected by a user. For example, when a user input for selecting a video component is received, the broadcast receiving apparatus 100 may display the roles of video components in the service guide menu. When a user input for selecting an audio component is received, the broadcast receiving apparatus 100 may display the roles of audio components in the service guide menu. When a user input for selecting a closed captioning component is received, the broadcast receiving apparatus 100 may display the roles of closed captioning components in the service guide menu. When a user input for selecting an App-based enhancement component is received, the broadcast receiving apparatus 100 may display the roles of App-based enhancement components in the service guide menu. Accordingly, the user may select content based on the components included in content or services.
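  • A minimal sketch of that type-based filtering (the component records below are illustrative):

      def roles_for(components, selected_type):
          # Show only the roles of components whose data type matches the
          # type the user selected in the service guide menu.
          return [c["role"] for c in components if c["type"] == selected_type]

      components = [
          {"type": "video", "role": "Alternative view"},
          {"type": "audio", "role": "Music"},
          {"type": "audio", "role": "Visually impaired"},
          {"type": "closed captioning", "role": "Easy reader"},
      ]
      print(roles_for(components, "audio"))   # ['Music', 'Visually impaired']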
  • FIGS. 99 to 105 illustrate reservation viewing through the service guide menu of the broadcast receiving apparatus according to an embodiment of the present invention.
  • As shown in FIGS. 99 to 101, the broadcast receiving apparatus 100 may display information about video components included in content through the service guide menu.
  • Specifically, as shown in FIG. 99, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a component including a sign language inset through the service guide menu. The broadcast receiving apparatus 100 may indicate in the service guide menu that reservation viewing has been set. The broadcast receiving apparatus 100 may present the corresponding content at the reservation viewing time.
  • The broadcast receiving apparatus 100 may cause the companion apparatus 300 to present the component for which the user sets reservation viewing. Specifically, the broadcast receiving apparatus 100 may deliver information about the component for which reservation viewing is set to the companion apparatus 300. Here, the companion apparatus 300 may present the component based on the information about the component for which reservation viewing is set.
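  • A minimal sketch of that hand-off follows; the reservation record and the companion queue are illustrative stand-ins for the actual protocol between the broadcast receiving apparatus 100 and the companion apparatus 300:

      def set_reservation(component, start_time, companion_queue=None):
          reservation = {"component": component, "time": start_time}
          if companion_queue is not None:
              # Deliver the information about the reserved component to the
              # companion apparatus 300, which presents it at the set time.
              companion_queue.append(reservation)
          return reservation

      companion_queue = []
      set_reservation({"type": "video", "role": "Sign language"}, "20:00",
                      companion_queue)
      print(companion_queue)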
  • As shown in FIG. 100, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including an alternative view component through the service guide menu. The broadcast receiving apparatus 100 may indicate in the service guide menu that reservation viewing has been set. In addition, the broadcast receiving apparatus 100 may deliver information about the component to the companion apparatus 300. The companion apparatus 300 may present the corresponding content at the reservation viewing time.
  • As shown in FIG. 101, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a follow subject video component through the service guide menu. As described above, the companion apparatus 300 may present the corresponding content at a reservation viewing time.
  • As shown in FIGS. 102 to 104, the broadcast receiving apparatus 100 may display information about audio components included in content in the service guide menu.
  • Specifically, as shown in FIG. 102, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a music component through the service guide menu. As shown in FIG. 103, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a component including a dialog level adjustment function through the service guide menu. As shown in FIG. 104, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a component for the visually impaired through the service guide menu. The broadcast receiving apparatus 100 may indicate in the service guide menu that reservation viewing has been set. In addition, the broadcast receiving apparatus 100 may present the corresponding content at the reservation viewing time.
  • As shown in FIG. 105, the broadcast receiving apparatus 100 may display information about closed captioning components included in content in the service guide menu. In addition, the broadcast receiving apparatus 100 may receive a user input for reservation viewing of content including a closed captioning component for kindergarteners and elementary school students through the service guide menu. The broadcast receiving apparatus 100 may indicate in the service guide menu that reservation viewing has been set. The broadcast receiving apparatus 100 may present the corresponding content at the reservation viewing time.
  • In addition, the broadcast receiving apparatus 100 may display information about App-based enhancement components included in content in the service guide menu.
  • As described above, the broadcast receiving apparatus 100 may provide information about various components provided by hybrid broadcast based on ESG data. Accordingly, the broadcast receiving apparatus 100 may enhance convenience of selection of services or content by a user. Particularly, the broadcast receiving apparatus 100 may provide information about content or services scheduled to be broadcast in the future as well as currently presented services or content such that the user may easily select content or services.
  • The features, structures and effects described in the above embodiments are included in at least one embodiment of the present invention and are not limited to only one embodiment. Furthermore, features, structures and effects exemplified in each embodiment can be combined or modified in other embodiments by those skilled in the art. Therefore, such combination and modification should be interpreted as being included in the scope of the present invention.
  • Those skilled in the art will appreciate that the present invention may be carried out in other specific ways than those set forth herein without departing from the spirit and essential characteristics of the present invention. Therefore, the scope of the invention should be determined by the appended claims and their legal equivalents, not by the above description, and all changes coming within the meaning and equivalency range of the appended claims are intended to be embraced therein.

Claims (20)

1. A broadcast receiving apparatus comprising:
a broadcast receiving unit configured to receive a broadcast signal; and
a controller configured to receive electronic service guide (ESG) data including information about a broadcast service guide based on the broadcast signal and to acquire information about a component included in at least one of a broadcast service and broadcast content based on the ESG data.
2. The broadcast receiving apparatus according to claim 1, wherein the information about the component includes device capability information indicating a device capability required to present the component.
3. The broadcast receiving apparatus according to claim 2, wherein the controller is configured to display the information about the component based on the device capability information.
4. The broadcast receiving apparatus according to claim 3, wherein the controller is configured to discriminately display information about a component presentable by the broadcast receiving apparatus and information about a component unpresentable by the broadcast receiving apparatus.
5. The broadcast receiving apparatus according to claim 1, wherein the information about the component includes reference information indicating inclusion relationships between the component and other components, between the component and the broadcast content and between the component and the broadcast service.
6. The broadcast receiving apparatus according to claim 1, wherein the information about the component includes association information indicating an associated component.
7. The broadcast receiving apparatus according to claim 6, wherein the associated component represents a component presented along with the component.
8. The broadcast receiving apparatus according to claim 1, wherein the ESG data is divided into fragments corresponding to information units and includes a service fragment including information about the broadcast service and a content fragment including information about content included in the broadcast service.
9. The broadcast receiving apparatus according to claim 8, wherein the information about the content is a content fragment included in the ESG data.
10. The broadcast receiving apparatus according to claim 8, wherein the information about the component is a component element included in the content fragment.
11. The broadcast receiving apparatus according to claim 1, wherein the information about the component includes charging information about the component.
12. The broadcast receiving apparatus according to claim 1, wherein the controller is configured to display the information about the component in a service guide menu.
13. The broadcast receiving apparatus according to claim 12, wherein the controller is configured to display the role of the component in the service guide menu.
14. The broadcast receiving apparatus according to claim 12, wherein the controller is configured to display the charging information about the component in the service guide menu.
15. The broadcast receiving apparatus according to claim 12, wherein the controller displays the information about the component based on types of data included in the component.
16. The broadcast receiving apparatus according to claim 15, wherein the controller is configured to display the information about the component including data of a type selected by a user.
17. A method of operating a broadcast receiving apparatus, comprising:
receiving a broadcast signal;
receiving ESG data including information about a broadcast service guide based on the broadcast signal; and
acquiring information about a component included in at least one of a broadcast service and broadcast content based on the ESG data.
18. The method of operating a broadcast receiving apparatus according to claim 17, wherein the information about the component includes device capability information indicating a device capability required to present the component.
19. The method of operating a broadcast receiving apparatus according to claim 18, further comprising displaying the information about the component based on the device capability information.
20. A broadcast transmitting apparatus comprising:
a controller configured to acquire information about a component included in at least one of a broadcast service and broadcast content and to generate ESG data including information about a broadcast service guide based on the information about the component; and
a transmitting unit configured to transmit a broadcast signal based on the ESG data.
US15/306,975 2014-04-27 2015-04-27 Broadcast transmitting apparatus, method of operating broadcast transmitting apparatus, broadcast receiving apparatus and method of operating broadcast receiving apparatus Abandoned US20170188099A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/306,975 US20170188099A1 (en) 2014-04-27 2015-04-27 Broadcast transmitting apparatus, method of operating broadcast transmitting apparatus, broadcast receiving apparatus and method of operating broadcast receiving apparatus

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201461984854P 2014-04-27 2014-04-27
US201461991624P 2014-05-12 2014-05-12
US201462000515P 2014-05-19 2014-05-19
US201462003039P 2014-05-27 2014-05-27
PCT/KR2015/004162 WO2015167184A1 (en) 2014-04-27 2015-04-27 Broadcast transmitting apparatus, method for operating broadcast transmitting apparatus, broadcast receiving apparatus, and method for operating broadcast receiving apparatus
US15/306,975 US20170188099A1 (en) 2014-04-27 2015-04-27 Broadcast transmitting apparatus, method of operating broadcast transmitting apparatus, broadcast receiving apparatus and method of operating broadcast receiving apparatus

Publications (1)

Publication Number Publication Date
US20170188099A1 true US20170188099A1 (en) 2017-06-29

Family ID=54358852

Family Applications (13)

Application Number Title Priority Date Filing Date
US15/115,875 Active US10306278B2 (en) 2014-04-27 2015-04-27 Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal
US15/306,975 Abandoned US20170188099A1 (en) 2014-04-27 2015-04-27 Broadcast transmitting apparatus, method of operating broadcast transmitting apparatus, broadcast receiving apparatus and method of operating broadcast receiving apparatus
US15/115,864 Active US10306277B2 (en) 2014-04-27 2015-04-27 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US15/034,718 Active US10666993B2 (en) 2014-04-27 2015-04-27 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US15/115,567 Active US9888271B2 (en) 2014-04-27 2015-04-27 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US15/872,578 Active US10284886B2 (en) 2014-04-27 2018-01-16 Broadcast signal transmitting apparatus, boradcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/206,494 Active US10848797B2 (en) 2014-04-27 2018-11-30 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/359,272 Expired - Fee Related US10567815B2 (en) 2014-04-27 2019-03-20 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/385,803 Active US10743044B2 (en) 2014-04-27 2019-04-16 Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/732,693 Active US10939147B2 (en) 2014-04-27 2020-01-02 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/847,779 Active US10887635B2 (en) 2014-04-27 2020-04-14 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/916,870 Active US11070859B2 (en) 2014-04-27 2020-06-30 Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal
US17/083,101 Active US11570494B2 (en) 2014-04-27 2020-10-28 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/115,875 Active US10306278B2 (en) 2014-04-27 2015-04-27 Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal

Family Applications After (11)

Application Number Title Priority Date Filing Date
US15/115,864 Active US10306277B2 (en) 2014-04-27 2015-04-27 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US15/034,718 Active US10666993B2 (en) 2014-04-27 2015-04-27 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US15/115,567 Active US9888271B2 (en) 2014-04-27 2015-04-27 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US15/872,578 Active US10284886B2 (en) 2014-04-27 2018-01-16 Broadcast signal transmitting apparatus, boradcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/206,494 Active US10848797B2 (en) 2014-04-27 2018-11-30 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/359,272 Expired - Fee Related US10567815B2 (en) 2014-04-27 2019-03-20 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/385,803 Active US10743044B2 (en) 2014-04-27 2019-04-16 Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/732,693 Active US10939147B2 (en) 2014-04-27 2020-01-02 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/847,779 Active US10887635B2 (en) 2014-04-27 2020-04-14 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US16/916,870 Active US11070859B2 (en) 2014-04-27 2020-06-30 Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal
US17/083,101 Active US11570494B2 (en) 2014-04-27 2020-10-28 Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal

Country Status (8)

Country Link
US (13) US10306278B2 (en)
EP (4) EP3139526A4 (en)
JP (2) JP6599864B2 (en)
KR (6) KR101801593B1 (en)
CN (7) CN110177290B (en)
CA (3) CA3077488C (en)
MX (1) MX357843B (en)
WO (5) WO2015167186A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6599864B2 (en) * 2014-04-27 2019-11-06 エルジー エレクトロニクス インコーポレイティド Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, broadcast signal transmitting method, and broadcast signal receiving method
WO2015177986A1 (en) * 2014-05-19 2015-11-26 Sharp Kabushiki Kaisha A method for decoding a service guide
CA2948131A1 (en) * 2014-05-22 2015-11-26 Sharp Kabushiki Kaisha Method for decoding
US20170118503A1 (en) * 2014-06-20 2017-04-27 Sharp Kabushiki Kaisha Methods for xml representation of device capabilities
CA2959353A1 (en) * 2014-09-05 2016-03-10 Sharp Kabushiki Kaisha Syntax and semantics for device capabilities
KR101812186B1 (en) 2014-09-25 2017-12-26 엘지전자 주식회사 Broadcasting signal transmitting device, broadcasting signal receiving device, broadcasting signal transmitting method, and broadcasting signal receiving method
CN108024120B (en) * 2016-11-04 2020-04-17 上海动听网络科技有限公司 Audio generation, playing and answering method and device and audio transmission system
KR101967299B1 (en) * 2017-12-19 2019-04-09 엘지전자 주식회사 Autonomous vehicle for receiving a broadcasting signal and method of Autonomous vehicle for receiving a broadcasting signal
US11360466B2 (en) * 2019-01-04 2022-06-14 Gracenote, Inc. Generation of media station previews using a secondary tuner
CN112990120B (en) * 2021-04-25 2022-09-16 昆明理工大学 Cross-domain pedestrian re-identification method using camera style separation domain information

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030005454A1 (en) * 2001-06-29 2003-01-02 Rodriguez Arturo A. System and method for archiving multiple downloaded recordable media content
US20040017831A1 (en) * 2002-04-05 2004-01-29 Jian Shen System and method for processing SI data from multiple input transport streams
US20050060758A1 (en) * 2003-09-17 2005-03-17 Lg Electronic Inc. Digital broadcast receiver and method for processing caption thereof
US20090034556A1 (en) * 2007-06-29 2009-02-05 Lg Electronics Inc. Digital broadcasting system and method of processing data
US20090249392A1 (en) * 2008-03-28 2009-10-01 Lg Electronics Inc. Digital broadcast receiver and method for processing caption thereof
US20100019563A1 (en) * 2007-01-23 2010-01-28 John Thomson Guide shoe for a roller-type loader and wear inserts for guide shoes
US20100021447A1 (en) * 2005-11-22 2010-01-28 Universität Leipzig Medicament for Treating Problems Relating to Fertility and Pregnancy, and Autoimmune Diseases, and for Inducing an Immunological Tolerance in Transplant Patients, and Method for Producing Said Medicament
US20100162339A1 (en) * 2008-12-09 2010-06-24 Lg Electronics Inc Method for processing targeting descriptor in non-real-time receiver
US20100214474A1 (en) * 2009-02-26 2010-08-26 Funai Electric Co., Ltd. Video device
US20100299702A1 (en) * 2009-05-19 2010-11-25 Qualcomm Incorporated Delivery of selective content to client applications by mobile broadcast device with content filtering capability
US20100299708A1 (en) * 2009-05-20 2010-11-25 Samsung Electronics Co., Ltd. Method and apparatus for transmitting and receiving multi-format digital broadcasts
US20110011970A1 (en) * 2008-03-25 2011-01-20 Erik Rydsmo Seat Belt Pretensioner
US20110119708A1 (en) * 2009-11-13 2011-05-19 Samsung Electronics Co., Ltd. Method and apparatus for generating multimedia stream for 3-dimensional reproduction of additional video reproduction information, and method and apparatus for receiving multimedia stream for 3-dimensional reproduction of additional video reproduction information
US20110149036A1 (en) * 2008-12-02 2011-06-23 Jong-Yeul Suh Method for displaying 3d caption and 3d display apparatus for implementing the same
US20120033150A1 (en) * 2010-08-09 2012-02-09 Samsung Mobile Display Co., Ltd. Liquid crystal display panel and method of fabricating the same
US20130006127A1 (en) * 2007-05-16 2013-01-03 Parlikar Tushar A Systems and methods for model-based estimation of cardiac output and total peripheral resistance
US20130061275A1 (en) * 2010-03-11 2013-03-07 Lg Electronics Inc. Non-real-time broadcast service processing system and processing method thereof
US20130219435A1 (en) * 2012-02-17 2013-08-22 Echostar Technologies L.L.C. Channel tuning redirect
US20150373669A1 (en) * 2007-08-24 2015-12-24 Lg Electronics Inc. Digital broadcasting system and method of processing data in digital broadcasting system
US20170013285A1 (en) * 2014-04-27 2017-01-12 Lg Electronics Inc. Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal

Family Cites Families (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900908A (en) * 1995-03-02 1999-05-04 National Captioning Insitute, Inc. System and method for providing described television services
JP4221624B2 (en) * 1998-02-12 2009-02-12 ソニー株式会社 EPG transmission apparatus and method, EPG reception apparatus and method, and recording medium
GB2338364B (en) 1998-06-12 2003-03-05 British Sky Broadcasting Ltd Improvements in receivers for television signals
US6732370B1 (en) * 1998-11-30 2004-05-04 Diva Systems Corporation Service provider side interactive program guide encoder
CA2366057C (en) 1999-03-05 2009-03-24 Canon Kabushiki Kaisha Database annotation and retrieval
US7934232B1 (en) * 2000-05-04 2011-04-26 Jerding Dean F Navigation paradigm for access to television services
KR100391179B1 (en) * 2000-08-02 2003-07-12 한국전력공사 Teleoperated mobile cleanup device for highly radioactive fine waste
US20020078459A1 (en) * 2000-08-30 2002-06-20 Mckay Brent Interactive electronic directory service, public information and general content delivery system and method
CN1483263A (en) 2000-10-26 2004-03-17 ���ĺ� Initial free charge preview of multimedia multicast content
US20020170068A1 (en) 2001-03-19 2002-11-14 Rafey Richter A. Virtual and condensed television programs
US7110664B2 (en) 2001-04-20 2006-09-19 Front Porch Digital, Inc. Methods and apparatus for indexing and archiving encoded audio-video data
US7035468B2 (en) 2001-04-20 2006-04-25 Front Porch Digital Inc. Methods and apparatus for archiving, indexing and accessing audio and video data
US7908628B2 (en) 2001-08-03 2011-03-15 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator content coding and formatting
US7092888B1 (en) 2001-10-26 2006-08-15 Verizon Corporate Services Group Inc. Unsupervised training in natural language call routing
US7774815B1 (en) 2002-09-30 2010-08-10 Arris Group, Inc. Context-sensitive interactive television ticker
US8712218B1 (en) 2002-12-17 2014-04-29 At&T Intellectual Property Ii, L.P. System and method for providing program recommendations through multimedia searching based on established viewer preferences
GB0307451D0 (en) * 2003-03-31 2003-05-07 Matsushita Electric Ind Co Ltd Digital receiver with aural interface
GB2405557A (en) 2003-08-27 2005-03-02 Nokia Corp Service identification data relating services at a given frequency to services and identifying their media format
US7418472B2 (en) 2003-09-30 2008-08-26 Microsoft Corporation Systems and methods for determining remote device media capabilities
US20050188411A1 (en) 2004-02-19 2005-08-25 Sony Corporation System and method for providing content list in response to selected closed caption word
US7469979B2 (en) 2004-03-19 2008-12-30 Steelcase Inc. Pedestal system
CN1674645A (en) * 2004-03-24 2005-09-28 乐金电子(沈阳)有限公司 Digital television and representing method for electronic card thereof
KR100617128B1 (en) * 2004-11-17 2006-08-31 엘지전자 주식회사 Method and Apparatus for digital broadcast
US20070124788A1 (en) 2004-11-25 2007-05-31 Erland Wittkoter Appliance and method for client-sided synchronization of audio/video content and external data
TW200704183A (en) * 2005-01-27 2007-01-16 Matrix Tv Dynamic mosaic extended electronic programming guide for television program selection and display
US7614068B2 (en) * 2005-03-18 2009-11-03 Nokia Corporation Prioritization of electronic service guide carousels
US8387089B1 (en) * 2005-05-06 2013-02-26 Rovi Guides, Inc. Systems and methods for providing a scan
EP1753166A3 (en) * 2005-08-11 2007-08-29 Samsung Electronics Co., Ltd. Method and system for transmitting and receiving access information for a broadcast service
US7738863B2 (en) * 2005-08-25 2010-06-15 Nokia Corporation IP datacasting middleware
US8607271B2 (en) 2005-08-26 2013-12-10 Nokia Corporation Method to deliver messaging templates in digital broadcast service guide
US8024768B2 (en) 2005-09-15 2011-09-20 Penthera Partners, Inc. Broadcasting video content to devices having different video presentation capabilities
CA2625225A1 (en) * 2005-10-14 2007-04-19 Nokia Corporation Declaring terminal provisioning with service guide
JP2009515386A (en) 2005-11-01 2009-04-09 ノキア コーポレイション Method for enabling identification of range ESG fragments and stratification within ranges
US8763036B2 (en) 2005-11-04 2014-06-24 Nokia Corporation Method for indicating service types in the service guide
KR100871243B1 (en) 2005-11-07 2008-11-28 삼성전자주식회사 Method and apparatus for transmitting a service guide source in a mobile broadcasting system
US20070110057A1 (en) * 2005-11-07 2007-05-17 Sung-Oh Hwang Method and apparatus for transmitting service guide source in a mobile broadcast system
US7801910B2 (en) 2005-11-09 2010-09-21 Ramp Holdings, Inc. Method and apparatus for timed tagging of media content
JP2009524273A (en) 2005-11-29 2009-06-25 グーグル・インコーポレーテッド Repetitive content detection in broadcast media
KR101179828B1 (en) 2005-12-16 2012-09-04 삼성전자주식회사 Method and apparatus for structure of Electornic Service Guide according to relationship between service data stream and ESG data model in Digital Video Broadcasting system
KR100890037B1 (en) 2006-02-03 2009-03-25 삼성전자주식회사 Method and system for sharing generated service guide and its fragments in mobile broadcast system
US8458753B2 (en) 2006-02-27 2013-06-04 Time Warner Cable Enterprises Llc Methods and apparatus for device capabilities discovery and utilization within a content-based network
US8115869B2 (en) 2007-02-28 2012-02-14 Samsung Electronics Co., Ltd. Method and system for extracting relevant information from content metadata
CN101094434B (en) * 2006-06-19 2011-02-02 华为技术有限公司 Method for subscribing service related notification in mobile broadcast system
KR100793801B1 (en) * 2006-07-06 2008-01-11 엘지전자 주식회사 Method and device for displaying electronic program guide of tv
CN101132292A (en) * 2006-08-22 2008-02-27 华为技术有限公司 Method and system for transmitting electric program guidebooks
US20080091713A1 (en) 2006-10-16 2008-04-17 Candelore Brant L Capture of television metadata via OCR
US8296808B2 (en) 2006-10-23 2012-10-23 Sony Corporation Metadata from image recognition
US7689613B2 (en) 2006-10-23 2010-03-30 Sony Corporation OCR input to search engine
US7814524B2 (en) 2007-02-14 2010-10-12 Sony Corporation Capture of configuration and service provider data via OCR
KR100856208B1 (en) 2006-12-15 2008-09-03 삼성전자주식회사 Method for providing the application information of bradcasting data service in dvb-h system and the system therefor
KR101377951B1 (en) 2007-05-18 2014-03-25 엘지전자 주식회사 method of transmitting and receiving service guide information and apparatus for transmitting and receiving service guide information
KR101356499B1 (en) 2007-05-18 2014-01-29 엘지전자 주식회사 method of transmitting and receiving service guide information and apparatus for transmitting and receiving service guide information
JP4345848B2 (en) 2007-05-25 2009-10-14 船井電機株式会社 Digital broadcast receiver
KR20080107137A (en) 2007-06-05 2008-12-10 엘지전자 주식회사 Method of transmitting and receiving service guide information and apparatus for transmitting and receiving service guide information
KR101486373B1 (en) * 2007-07-29 2015-01-26 엘지전자 주식회사 Digital broadcasting system and method of processing data in digital broadcasting system
KR101420871B1 (en) 2007-08-21 2014-07-17 삼성전자주식회사 Method and apparatus for providing multi contents in an open mobile alliance mobile broadcasting service and system thereof
US20090070659A1 (en) * 2007-09-11 2009-03-12 Legend Silicon Corp. Ldpc decoder with an improved llr update method using a set of relative values free from a shifting action
KR101418591B1 (en) 2007-10-05 2014-07-10 삼성전자주식회사 Apparatus and method for announcing service guides in mobile communication system
KR20090069689A (en) 2007-12-26 2009-07-01 엘지전자 주식회사 Method of receiving service guide information and apparatus for receiving service guide information
CN101500135A (en) 2008-02-03 2009-08-05 北京视博数字电视科技有限公司 Program ordering method, system for conditional receiving system and terminal thereof
US9503691B2 (en) * 2008-02-19 2016-11-22 Time Warner Cable Enterprises Llc Methods and apparatus for enhanced advertising and promotional delivery in a network
US20090253416A1 (en) 2008-04-04 2009-10-08 Samsung Electronics Co. Ltd. Method and system for providing user defined bundle in a mobile broadcast system
EP2109313B1 (en) 2008-04-09 2016-01-13 Sony Computer Entertainment Europe Limited Television receiver and method
CN101583092A (en) * 2008-05-14 2009-11-18 三星电子株式会社 Data reproduction device and data reproduction method
WO2010058962A2 (en) 2008-11-18 2010-05-27 Lg Electronics Inc. Method for receiving a broadcast signal and broadcast receiver
JP5225037B2 (en) * 2008-11-19 2013-07-03 株式会社東芝 Program information display apparatus and method
CA2690174C (en) 2009-01-13 2014-10-14 Crim (Centre De Recherche Informatique De Montreal) Identifying keyword occurrences in audio data
US20100180310A1 (en) * 2009-01-15 2010-07-15 Samsung Electronics Co., Ltd. Rich media-enabled service guide provision method and system for broadcast service
KR20100084104A (en) * 2009-01-15 2010-07-23 삼성전자주식회사 A method for offering service guide using rich media in a digital broadcast system and a system thereof
KR101076587B1 (en) 2009-03-03 2011-10-26 엘지전자 주식회사 Method for obtaining content selectively using service guide delivered through broadcast network, and device using the same
JP5440836B2 (en) * 2009-03-24 2014-03-12 ソニー株式会社 Receiving apparatus and method, program, and receiving system
US20100250764A1 (en) 2009-03-31 2010-09-30 Nokia Corporation Method and Apparatus for Signaling Layer Information of Scalable Media Data
KR20100127162A (en) 2009-05-25 2010-12-03 엘지전자 주식회사 Method and apparatus for searching and downloading related contents in broadcast service at terminal
US20100316131A1 (en) 2009-06-12 2010-12-16 Motorola, Inc. Macroblock level no-reference objective quality estimation of video
US8438600B2 (en) * 2009-08-20 2013-05-07 Lg Electronics Inc. Method of processing EPG metadata in network device and network device for controlling the same
US9014546B2 (en) 2009-09-23 2015-04-21 Rovi Guides, Inc. Systems and methods for automatically detecting users within detection regions of media devices
JP2011091619A (en) 2009-10-22 2011-05-06 Sony Corp Transmitting apparatus, transmitting method, receiving apparatus, receiving method, program, and broadcasting system
US8914835B2 (en) 2009-10-28 2014-12-16 Qualcomm Incorporated Streaming encoded video data
KR101179826B1 (en) 2010-02-26 2012-09-04 고려대학교 산학협력단 Electric furnace capable of controlling amount of heat and method for melting metal and silicon
US8817072B2 (en) 2010-03-12 2014-08-26 Sony Corporation Disparity data transport and signaling
KR20110105092A (en) * 2010-03-18 2011-09-26 삼성전자주식회사 Apparatus and method for providing service access information in mobile broadcasting system
US8572488B2 (en) 2010-03-29 2013-10-29 Avid Technology, Inc. Spot dialog editor
CN102918857B (en) 2010-04-02 2015-11-25 三星电子株式会社 For sending the method and apparatus of the digital broadcast content for providing two and three dimensions content and the method and apparatus for receiving digital broadcast content
US9307272B2 (en) * 2010-04-16 2016-04-05 Lg Electronics Inc. Purchase transaction method for IPTV product and IPTV receiver thereof
US20110289533A1 (en) 2010-05-18 2011-11-24 Rovi Technologies Corporation Caching data in a content system
WO2011146276A2 (en) 2010-05-19 2011-11-24 Google Inc. Television related searching
US8694533B2 (en) 2010-05-19 2014-04-08 Google Inc. Presenting mobile content based on programming context
US8898723B2 (en) * 2010-08-20 2014-11-25 Sony Corporation Virtual channel declarative script binding
CN102137298B (en) 2011-03-02 2015-12-09 华为技术有限公司 The acquisition methods of 3D form descriptor and device
US8670505B2 (en) * 2011-03-31 2014-03-11 Subrahmanya Kondageri Shankaraiah Early detection of segment type using BPSK and DBPSK modulated carriers in ISDB-T receivers
US9171549B2 (en) * 2011-04-08 2015-10-27 Dolby Laboratories Licensing Corporation Automatic configuration of metadata for use in mixing audio programs from two encoded bitstreams
JP2014519732A (en) 2011-05-01 2014-08-14 サムスン エレクトロニクス カンパニー リミテッド Method and apparatus for transmitting / receiving broadcast service in digital broadcasting system, and system thereof
TWI562560B (en) 2011-05-09 2016-12-11 Sony Corp Encoder and encoding method providing incremental redundancy
US9723362B2 (en) 2011-06-07 2017-08-01 Lg Electronics Inc. Method for transmitting and receiving broadcast service and receiving device thereof
US9584238B2 (en) * 2011-06-24 2017-02-28 Nokia Corporation Accessing service guide information in a digital video broadcast system
EP2768198B1 (en) 2011-10-13 2019-05-22 Samsung Electronics Co., Ltd. Apparatus and method for configuring control message in broadcasting system
US10498473B2 (en) 2011-10-13 2019-12-03 Samsung Electronics Co. Ltd Method and apparatus for transmitting and receiving multimedia service
US9106265B2 (en) * 2011-11-04 2015-08-11 Silicon Laboratories Inc. Receive data flow path using a single FEC decoder
US9113230B2 (en) * 2011-12-21 2015-08-18 Sony Corporation Method, computer program, and reception apparatus for delivery of supplemental content
EP2618532A1 (en) * 2012-01-19 2013-07-24 Panasonic Corporation Improved Component Interleaving for Rotated Constellations
US9936231B2 (en) 2012-03-21 2018-04-03 Saturn Licensing Llc Trigger compaction
KR101915128B1 (en) 2012-05-17 2018-11-05 LG Electronics Inc. Electronic device and method for information about service provider
US8806561B2 (en) 2012-08-07 2014-08-12 Lg Electronics Inc. Method and an apparatus for processing a broadcast signal including an interactive broadcast service
EA031744B1 (en) 2013-02-11 2019-02-28 Bang & Clean GmbH Method and device for cleaning interiors of tanks and systems
KR102061044B1 (en) * 2013-04-30 2020-01-02 Samsung Electronics Co., Ltd. Method and system for translating sign language and descriptive video service
KR101500135B1 (en) 2013-09-04 2015-03-06 POSCO Co., Ltd. Packing assembling apparatus of blow tube
BR112016006860B8 (en) 2013-09-13 2023-01-10 Arris Entpr Inc Apparatus and method for creating a single data stream of combined information for rendering on a customer computing device
CN103848358B (en) 2014-03-24 2017-03-01 Xie Guoxian Double-cylinder jacking mechanism for a tower crane tower body
CN106105249B (en) 2014-04-21 2019-07-26 夏普株式会社 Method for decoding service guide
CA2948131A1 (en) * 2014-05-22 2015-11-26 Sharp Kabushiki Kaisha Method for decoding

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030005454A1 (en) * 2001-06-29 2003-01-02 Rodriguez Arturo A. System and method for archiving multiple downloaded recordable media content
US20040017831A1 (en) * 2002-04-05 2004-01-29 Jian Shen System and method for processing SI data from multiple input transport streams
US20050060758A1 (en) * 2003-09-17 2005-03-17 LG Electronics Inc. Digital broadcast receiver and method for processing caption thereof
US20100021447A1 (en) * 2005-11-22 2010-01-28 Universität Leipzig Medicament for Treating Problems Relating to Fertility and Pregnancy, and Autoimmune Diseases, and for Inducing an Immunological Tolerance in Transplant Patients, and Method for Producing Said Medicament
US20100019563A1 (en) * 2007-01-23 2010-01-28 John Thomson Guide shoe for a roller-type loader and wear inserts for guide shoes
US20130006127A1 (en) * 2007-05-16 2013-01-03 Parlikar Tushar A Systems and methods for model-based estimation of cardiac output and total peripheral resistance
US20090034556A1 (en) * 2007-06-29 2009-02-05 Lg Electronics Inc. Digital broadcasting system and method of processing data
US20150373669A1 (en) * 2007-08-24 2015-12-24 Lg Electronics Inc. Digital broadcasting system and method of processing data in digital broadcasting system
US20110011970A1 (en) * 2008-03-25 2011-01-20 Erik Rydsmo Seat Belt Pretensioner
US20090249392A1 (en) * 2008-03-28 2009-10-01 Lg Electronics Inc. Digital broadcast receiver and method for processing caption thereof
US20110149036A1 (en) * 2008-12-02 2011-06-23 Jong-Yeul Suh Method for displaying 3d caption and 3d display apparatus for implementing the same
US20100162339A1 (en) * 2008-12-09 2010-06-24 LG Electronics Inc. Method for processing targeting descriptor in non-real-time receiver
US20100214474A1 (en) * 2009-02-26 2010-08-26 Funai Electric Co., Ltd. Video device
US20100299702A1 (en) * 2009-05-19 2010-11-25 Qualcomm Incorporated Delivery of selective content to client applications by mobile broadcast device with content filtering capability
US20100299708A1 (en) * 2009-05-20 2010-11-25 Samsung Electronics Co., Ltd. Method and apparatus for transmitting and receiving multi-format digital broadcasts
US20110119708A1 (en) * 2009-11-13 2011-05-19 Samsung Electronics Co., Ltd. Method and apparatus for generating multimedia stream for 3-dimensional reproduction of additional video reproduction information, and method and apparatus for receiving multimedia stream for 3-dimensional reproduction of additional video reproduction information
US20130061275A1 (en) * 2010-03-11 2013-03-07 Lg Electronics Inc. Non-real-time broadcast service processing system and processing method thereof
US20120033150A1 (en) * 2010-08-09 2012-02-09 Samsung Mobile Display Co., Ltd. Liquid crystal display panel and method of fabricating the same
US20130219435A1 (en) * 2012-02-17 2013-08-22 Echostar Technologies L.L.C. Channel tuning redirect
US20170013285A1 (en) * 2014-04-27 2017-01-12 Lg Electronics Inc. Apparatus for transmitting broadcast signal, apparatus for receiving broadcast signal, method for transmitting broadcast signal, and method for receiving broadcast signal

Also Published As

Publication number Publication date
US20200137430A1 (en) 2020-04-30
US10939147B2 (en) 2021-03-02
KR20180014223A (en) 2018-02-07
EP3139623A4 (en) 2017-10-11
US10567815B2 (en) 2020-02-18
KR20160102509A (en) 2016-08-30
CN106170983A (en) 2016-11-30
CN110177290A (en) 2019-08-27
US20190222870A1 (en) 2019-07-18
US10887635B2 (en) 2021-01-05
KR20160105835A (en) 2016-09-07
US10306278B2 (en) 2019-05-28
KR101865299B1 (en) 2018-06-07
CN105981374A (en) 2016-09-28
US11570494B2 (en) 2023-01-31
KR101801594B1 (en) 2017-11-27
CN106134213A (en) 2016-11-16
US20200245005A1 (en) 2020-07-30
CA3077488A1 (en) 2015-11-05
WO2015167190A1 (en) 2015-11-05
EP3139526A4 (en) 2017-10-11
CN106134112B (en) 2019-03-08
EP3139526A1 (en) 2017-03-08
MX2016010117A (en) 2016-10-07
JP2017514329A (en) 2017-06-01
EP3139618A4 (en) 2017-10-11
US20160269792A1 (en) 2016-09-15
US10666993B2 (en) 2020-05-26
EP3073730A1 (en) 2016-09-28
US9888271B2 (en) 2018-02-06
US20170180761A1 (en) 2017-06-22
CN105981374B (en) 2019-08-09
CN106134213B (en) 2019-07-12
CA2941597A1 (en) 2015-11-05
CN110267068A (en) 2019-09-20
WO2015167184A1 (en) 2015-11-05
JP6599864B2 (en) 2019-11-06
US20200329261A1 (en) 2020-10-15
US20190246150A1 (en) 2019-08-08
JP6360184B2 (en) 2018-08-01
WO2015167187A1 (en) 2015-11-05
KR101825000B1 (en) 2018-03-22
CN110267068B (en) 2021-11-02
US10284886B2 (en) 2019-05-07
EP3073730A4 (en) 2017-08-16
US10743044B2 (en) 2020-08-11
CA2941597C (en) 2020-05-05
US11070859B2 (en) 2021-07-20
CN110234021A (en) 2019-09-13
US20210105511A1 (en) 2021-04-08
WO2015167189A1 (en) 2015-11-05
KR101944834B1 (en) 2019-02-01
MX357843B (en) 2018-07-26
US20170013285A1 (en) 2017-01-12
KR20160071405A (en) 2016-06-21
US20180146220A1 (en) 2018-05-24
CN110234021B (en) 2021-07-23
US10848797B2 (en) 2020-11-24
CN106170983B (en) 2019-07-16
WO2015167186A1 (en) 2015-11-05
CA3077488C (en) 2022-08-16
CN106134112A (en) 2016-11-16
CA3077439A1 (en) 2015-11-05
US20190098342A1 (en) 2019-03-28
EP3139618A1 (en) 2017-03-08
KR20160106095A (en) 2016-09-09
JP2017507506A (en) 2017-03-16
CA3077439C (en) 2023-04-25
CN110177290B (en) 2021-10-26
KR101801592B1 (en) 2017-11-27
US20170171635A1 (en) 2017-06-15
EP3139623A1 (en) 2017-03-08
US10306277B2 (en) 2019-05-28
KR101801593B1 (en) 2017-11-27
KR20160142851A (en) 2016-12-13

Similar Documents

Publication Publication Date Title
US10939147B2 (en) Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, method for transmitting broadcast signal, and method for receiving broadcast signal
US11190846B2 (en) Service guide information transmission method, service guide information reception method, service guide information transmission device, and service guide information reception device
US20160227271A1 (en) Broadcast transmission device and operating method thereof, and broadcast reception device and operating method thereof
US20160337716A1 (en) Broadcast transmitting device and operating method thereof, and broadcast receiving device and operating method thereof

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION