WO2024029634A1 - Broadcast reception device, content protection method, processing method for adding reverberation sound, and control method for broadcast reception device - Google Patents


Info

Publication number
WO2024029634A1
WO2024029634A1 PCT/JP2023/029279
Authority
WO
WIPO (PCT)
Prior art keywords
broadcast
signal
broadcast receiving
receiving device
layer
Prior art date
Application number
PCT/JP2023/029279
Other languages
French (fr)
Japanese (ja)
Inventor
信夫 益岡
拓也 清水
康宣 橋本
和彦 吉澤
仁 秋山
展明 甲
Original Assignee
マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2022123799A external-priority patent/JP2024021157A/en
Priority claimed from JP2022151952A external-priority patent/JP2024046518A/en
Application filed by マクセル株式会社
Publication of WO2024029634A1 publication Critical patent/WO2024029634A1/en

Classifications

    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10KSOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
    • G10K15/00Acoustics not otherwise provided for
    • G10K15/02Synthesis of acoustic waves
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10KSOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
    • G10K15/00Acoustics not otherwise provided for
    • G10K15/08Arrangements for producing a reverberation or echo sound
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H20/00Arrangements for broadcast or for distribution combined with broadcast
    • H04H20/53Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers
    • H04H20/59Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers for emergency or urgency
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H40/00Arrangements specially adapted for receiving broadcast information
    • H04H40/18Arrangements characterised by circuits or components specially adapted for receiving
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/233Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/835Generation of protective data, e.g. certificates
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S3/00Systems employing more than two channels, e.g. quadraphonic
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control

Definitions

  • the present invention relates to a broadcast receiving device, a content protection method, a reverberation sound addition processing method, and a control method for a broadcast receiving device.
  • Digital broadcasting services began in various countries in the late 1990s to replace conventional analog broadcasting services.
  • Digital broadcasting services include improvements in broadcast quality using error correction technology, multi-channel and HD (High Definition) broadcasting using compression coding technology, and data broadcasting services using BML (Broadcast Markup Language) and HTML5 (Hyper Text Markup Language version 5).
  • As a technology for realizing UHD broadcasting in digital broadcasting services, there is the system described in Patent Document 1.
  • However, the system described in Patent Document 1 is intended to replace the current digital broadcasting, and does not take into consideration maintaining the viewing environment of the current digital broadcasting service.
  • An object of the present invention is to provide a technology for more appropriately transmitting or receiving advanced digital broadcasting services with higher functionality, taking into consideration compatibility with current digital broadcasting services.
  • For example, a broadcast receiving device may be used that is capable of receiving a signal for each sound source via broadcast waves, and that includes a broadcast receiving section that receives the broadcast waves, an audio output unit that has a plurality of speakers and outputs audio according to the signal for each sound source, and a control unit. The control unit determines a playback position of audio based on the signal for each sound source according to arrangement information of the plurality of speakers, generates a signal corresponding to the 22.2ch audio channels based on the signal for each sound source, and converts the signal corresponding to the 22.2ch audio channels into a signal for the audio output unit.
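The conversion from a 22.2ch signal to the installed speaker configuration described above amounts to a weighted downmix. The following is only an illustrative sketch, not the method claimed in this publication: it assumes a hypothetical coefficient of 1/√2 (a commonly used default; the actual default values are those shown in the downmix-coefficient figures referenced later in this document) and shows a simple 5.1ch-to-2ch downmix for a single audio frame.

```python
import math

# Hypothetical downmix coefficient (1/sqrt(2)); the real defaults are
# given in the figures of this document, not here.
K = 1.0 / math.sqrt(2.0)

def downmix_5_1_to_2ch(samples):
    """Downmix one frame of 5.1ch audio (L, R, C, LFE, Ls, Rs) to stereo.

    `samples` is a dict of per-channel sample values for a single frame.
    The LFE channel is omitted, as is common in simple stereo downmixes.
    """
    left = samples["L"] + K * samples["C"] + K * samples["Ls"]
    right = samples["R"] + K * samples["C"] + K * samples["Rs"]
    return left, right

# Example frame: center-heavy content with a little surround energy.
frame = {"L": 0.2, "R": 0.1, "C": 0.4, "LFE": 0.0, "Ls": 0.05, "Rs": 0.05}
l, r = downmix_5_1_to_2ch(frame)
```

A 22.2ch-to-5.1ch conversion would follow the same pattern with a larger coefficient matrix, cascading into this 5.1ch-to-2ch stage when only two speakers are available.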
  • FIG. 1 is a system configuration diagram of a broadcasting system according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 2 is a detailed block diagram of a first tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a detailed block diagram of a second tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a detailed block diagram of a third tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a detailed block diagram of a fourth tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 2 is a detailed block diagram of a first tuner/demodulator of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 1 is a software configuration diagram of a broadcast receiving device according to an embodiment of the present invention.
  • FIG. 1 is a configuration diagram of a broadcasting station server according to an embodiment of the present invention.
  • FIG. 1 is a configuration diagram of a service provider server according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a portable information terminal according to an embodiment of the present invention.
  • FIG. 1 is a software configuration diagram of a portable information terminal according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a segment configuration related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating layer allocation in layered transmission related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating the generation process of OFDM transmission waves related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating the basic configuration of a transmission line encoding unit related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating segment parameters of an OFDM system related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating transmission signal parameters related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating the arrangement of pilot signals of synchronous modulation segments in digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating the arrangement of pilot signals of differential modulation segments in digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating bit allocation of a TMCC carrier related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating bit allocation of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating transmission parameter information of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating system identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a carrier modulation mapping method for TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating frequency conversion processing identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating physical channel number identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating main signal identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating 4K signal transmission layer identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating additional layer transmission identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating identification of a coding rate of an inner code of TMCC information related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating bit allocation of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating configuration identification of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating seismic motion warning information of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating signal identification of seismic motion warning information of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating additional information regarding transmission control of a modulated wave of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating additional transmission parameter information of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an AC signal error correction method for digital broadcasting according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a constellation format of an AC signal related to digital broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a dual polarization transmission system according to an embodiment of the present invention.
  • FIG. 1 is a system configuration diagram of a broadcasting system using a dual polarization transmission system according to an embodiment of the present invention.
  • FIG. 1 is a system configuration diagram of a broadcasting system using a dual polarization transmission system according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating frequency conversion processing according to an embodiment of the present invention.
  • FIG. 1 is a diagram illustrating the configuration of a pass-through transmission method according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention.
  • FIG. 1 is a diagram illustrating the configuration of a pass-through transmission method according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a single polarization transmission method according to an embodiment of the present invention.
  • FIG. 1 is a system configuration diagram of a broadcasting system using a single polarization transmission method according to an embodiment of the present invention.
  • FIG. 1 is a system configuration diagram of a broadcasting system using a single polarization transmission method according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a layered division multiplexing transmission system according to an embodiment of the present invention.
  • FIG. 1 is a system configuration diagram of a broadcasting system using a layered division multiplexing transmission system according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating frequency conversion amplification processing according to an embodiment of the present invention.
  • FIG. 1 is a system configuration diagram of a broadcasting system using a layered division multiplexing transmission system according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a protocol stack of MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of tables used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of tables used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS.
  • FIG. 2 is a diagram illustrating a protocol stack in an MMT broadcast transmission path.
  • FIG. 2 is a diagram illustrating a protocol stack in an MMT communication line.
  • FIG. 2 is a diagram explaining the names and functions of tables used in TLV-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT TLV-SI.
  • FIG. 2 is a diagram illustrating the names and functions of messages used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of tables used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT.
  • FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT.
  • FIG. 3 is a diagram illustrating the relationship between MMT data transmission and each table.
  • FIG. 3 is an operation sequence diagram of channel setting processing of the broadcast receiving device according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating the data structure of a network information table.
  • FIG. 2 is a diagram illustrating the data structure of a ground distribution system descriptor.
  • FIG. 3 is a diagram illustrating the data structure of a service list descriptor.
  • FIG. 3 is a diagram illustrating the data structure of a TS information descriptor.
  • FIG. 1 is an external view of a remote controller according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a banner display when selecting a channel according to an embodiment of the present invention.
  • This is a diagram illustrating a speaker arrangement.
  • This is a diagram illustrating a speaker arrangement.
  • This is a diagram illustrating a speaker arrangement.
  • This is a diagram illustrating a speaker arrangement.
  • FIG. 3 is a diagram illustrating the positional relationship when headphones are used.
  • FIG. 3 is a diagram illustrating the positional relationship when headphones are used.
  • This is an example of the configuration of an audio decoder for an audio signal consisting of only channel-based signals.
  • This is an example of the configuration of an audio decoder for advanced audio signals.
  • This is an example of placement information of a speaker system.
  • These are the default values of the downmix coefficients from a 22.2ch signal to a 5.1ch signal.
  • These are the default values of the downmix coefficients from a 5.1ch signal to a 2ch signal.
  • This is an example of a screen for selecting a speaker system used for audio reproduction.
  • FIG. 3 is a diagram illustrating metadata of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • This is an example of a screen for selecting a playback position of an object-based signal.
  • This is an example of a screen for setting the playback position of an object-based signal.
  • Metadata specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • Stream data specifies the playback position of an object-based signal.
  • FIG. 3 is a diagram of a selection screen for selecting an audio signal for each output device.
  • FIG. 2 is a diagram illustrating audio playback in a cooperating device.
  • FIG. 3 is a diagram illustrating parameters describing the number of audio signals to be transmitted and the acquisition destination.
  • FIG. 3 is a diagram illustrating the data structure of an audio component descriptor.
  • FIG. 3 is a diagram illustrating audio component type data.
  • This is an example in which transmitted audio signals are displayed in an electronic program guide.
  • This is an example in which transmitted audio signals are displayed in an electronic program guide.
  • This is an example of displaying a signal source and an output device.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 6 is a diagram illustrating an example of a reverberation sound processing flow when using headphones.
  • FIG. 3 is a diagram showing an example of an audio output setting menu.
  • FIG. 7 is a diagram showing an example of a detailed menu for reverberation sound settings.
  • FIG. 3 is a diagram illustrating an example of a banner display indicating a reverberation processing state.
  • FIG. 7 is a diagram illustrating an example of a banner display in partially enlarged display of a broadcast image.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment.
  • FIG. 6 is a diagram for explaining viewing conditions before and after partially enlarged display of a broadcast image.
  • FIG. 5 is a flowchart illustrating a sound image localization mode switching operation according to the present embodiment.
  • FIG. 2 is a diagram showing an example of the appearance of the remote controller of the present embodiment.
  • FIG. 7 is a diagram showing an example of a banner display for setting the sound image localization mode of the present embodiment.
  • FIG. 12 is a flowchart showing an operation for correcting ⁇ when the sound image localization mode of the present modification is non-fixed.
  • FIG. 1 is a system configuration diagram showing an example of the configuration of a broadcasting system.
  • The broadcasting system includes, for example, a broadcast receiving device 100 and an antenna 200, a radio tower 300 and a broadcasting station server 400 of a broadcasting station, a service provider server 500, a mobile phone communication server 600 and a base station 600B of a mobile phone communication network, a mobile information terminal 700, a broadband network 800 such as the Internet and a router device 800R, an information server 900, headphones 910, and an HMD (Head Mounted Display) 920. Note that the broadcasting system may include only one of the headphones 910 and the HMD 920, or may include a speaker system (not shown) instead of the headphones 910 and the HMD 920. Further, various server devices and communication devices may be additionally connected to the Internet 800.
  • The broadcast receiving device 100 is a television receiver equipped with a reception function for advanced digital broadcasting services. The broadcast receiving device 100 may further include a reception function for existing digital broadcasting services. Furthermore, the broadcast receiving device 100 supports broadcast-communication cooperation systems in which digital broadcasting services (existing digital broadcasting services or advanced digital broadcasting services) are linked with functions that use broadband networks, such as acquisition of additional content via a broadband network, calculation processing on server devices, and presentation processing in cooperation with mobile terminal devices. The broadcast receiving device 100 receives digital broadcast waves transmitted from the radio tower 300 via the antenna 200. The digital broadcast waves may be transmitted directly from the radio tower 300 to the antenna 200, or may be transmitted via a broadcasting satellite, a communication satellite, or the like (not shown). A broadcast signal retransmitted by a cable television station may also be received via a cable line or the like. Further, the broadcast receiving device 100 can be connected to the Internet 800 via the router device 800R, and can transmit and receive data through communication with each server device on the Internet 800.
  • the router device 800R is connected to the Internet 800 by wireless or wired communication, and is also connected to the broadcast receiving device 100 by wired communication and to the mobile information terminal 700 by wireless communication. This enables each server device, broadcast receiving device 100, and portable information terminal 700 on the Internet 800 to mutually transmit and receive data via the router device 800R.
  • the router device 800R, the broadcast receiving device 100, and the mobile information terminal 700 constitute a LAN (Local Area Network). Note that communication between the broadcast receiving device 100 and the mobile information terminal 700 may be performed directly using a method such as Bluetooth (registered trademark) or NFC (Near Field Communication) without going through the router device 800R.
  • the radio tower 300 is a broadcasting facility of a broadcasting station, and transmits digital broadcast waves containing various control information related to digital broadcasting services, content data of broadcast programs (video content, audio content, etc.), and the like.
  • the broadcast station also includes a broadcast station server 400.
  • Broadcasting station server 400 stores content data of broadcast programs and metadata of each broadcast program, such as program title, program ID, program summary, performers, broadcast date and time, and the like.
  • the broadcasting station server 400 provides the content data and metadata to the service provider based on a contract. Content data and metadata are provided to the service provider through an API (Application Programming Interface) included in the broadcast station server 400.
  • the service provider server 500 is a server device prepared by a service provider to provide services using a broadcasting and communication cooperation system.
  • The service provider server 500 stores, manages, and distributes content data and metadata provided by the broadcasting station server 400, as well as content data and applications (operation programs and/or various data, etc.) created for the broadcast-communication cooperation system. It also has a function of searching for and providing a list of available applications in response to inquiries from television receivers. Note that the storage, management, distribution, etc. of the content data and metadata and the storage, management, distribution, etc. of the applications may be performed by different server devices.
  • the broadcasting station and the service provider may be the same or different providers.
  • a plurality of service provider servers 500 may be provided for different services.
  • the functions of the service provider server 500 may also be provided by the broadcasting station server 400.
  • Mobile telephone communication server 600 is connected to the Internet 800, and on the other hand, is connected to mobile information terminal 700 via base station 600B.
  • The mobile phone communication server 600 manages telephone communication (calls) and data transmission and reception via the mobile phone communication network for the mobile information terminal 700, and enables the mobile information terminal 700 to transmit and receive data through communication with each server device on the Internet 800.
  • communication between the mobile information terminal 700 and the broadcast receiving device 100 may be performed via the base station 600B, the mobile telephone communication server 600, the Internet 800, and the router device 800R.
  • the information server 900 is a server device that provides information such as the sound field environment of a concert hall or theater.
  • The information server 900 provides information to supplement sound field environment metadata when the broadcast content has no, or insufficient, sound field environment metadata. For example, if the name of the theater where a performance takes place is indicated by the title or metadata of the broadcast content, information about the sound field environment of that theater can be obtained by searching the information stored in the information server 900 based on the theater name.
  • the information server 900 may provide not only environmental information of the performance location but also a head-related transfer function that takes sound reproduction through headphones 910 into consideration.
  • the sound field environment and head-related transfer functions may be provided including transfer functions that take hearing-impaired people into consideration.
  • The information server 900 may have a function of acquiring a viewer's brain waves and preparing a sound field environment and head-related transfer function suited to the viewer.
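Sound field information of the kind provided by the information server 900 is typically applied by convolving the audio signal with an impulse response of the venue (and, for headphone playback, with a head-related transfer function). The sketch below is only an illustration of that principle, not the processing specified in this publication; practical receivers would use FFT-based (partitioned) convolution rather than this direct form, and the impulse response values here are toy numbers.

```python
def add_reverberation(dry, impulse_response):
    """Convolve a dry signal with a room impulse response (direct form).

    `dry` and `impulse_response` are lists of sample values. The result
    has length len(dry) + len(impulse_response) - 1, the usual length
    of a linear convolution.
    """
    n, m = len(dry), len(impulse_response)
    wet = [0.0] * (n + m - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            wet[i + j] += x * h
    return wet

# Toy impulse response: direct sound plus two decaying echoes.
ir = [1.0, 0.0, 0.5, 0.0, 0.25]
wet = add_reverberation([1.0, 0.0, 0.0], ir)
```

Feeding a unit impulse through the convolution, as above, simply reproduces the impulse response itself, which is a convenient sanity check for this kind of reverberation stage.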
  • the HMD 920 receives video and audio data from the broadcast receiving device 100, displays the video to the viewer, and reproduces the audio.
  • FIG. 2A is a block diagram showing an example of the internal configuration of broadcast receiving device 100.
  • The broadcast receiving apparatus 100 includes a main control section 101, a system bus 102, a ROM 103, a RAM 104, a storage section 110, a LAN communication section 121, an expansion interface section 124, a digital interface section 125, a first tuner/demodulation section 130C, and the like.
  • the main control section 101 is a microprocessor unit that controls the entire broadcast receiving apparatus 100 according to a predetermined operation program.
  • the system bus 102 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 101 and each operational block within the broadcast receiving apparatus 100.
  • the ROM (Read Only Memory) 103 is a non-volatile memory in which basic operation programs such as an operating system and other operation programs are stored. Further, the ROM 103 stores operation setting values and the like necessary for the operation of the broadcast receiving apparatus 100.
  • a RAM (Random Access Memory) 104 serves as a work area when basic operation programs and other operation programs are executed.
  • the ROM 103 and the RAM 104 may be integrated with the main control unit 101. Further, the ROM 103 may not have an independent configuration as shown in FIG. 2A, but may use a part of the storage area within the storage section 110.
  • the storage unit 110 stores operating programs and operating settings of the broadcast receiving device 100, personal information of the user of the broadcast receiving device 100, and the like. Further, it is possible to store operating programs downloaded via the Internet 800 and various data created using the operating programs. It is also possible to store content such as moving images, still images, audio, etc., obtained from broadcast waves or downloaded via the Internet 800. All or part of the functions of the ROM 103 may be replaced by a partial area of the storage section 110. Further, the storage unit 110 needs to hold stored information even when power is not supplied to the broadcast receiving apparatus 100 from the outside. Therefore, for example, devices such as semiconductor element memories such as flash ROMs and SSDs (Solid State Drives), and magnetic disk drives such as HDDs (Hard Disc Drives) are used.
  • each of the operating programs stored in the ROM 103 and the storage unit 110 can be added to, updated, and expanded in function by downloading from each server device on the Internet 800 or from broadcast waves.
  • the LAN communication unit 121 is connected to the Internet 800 via the router device 800R, and sends and receives data to and from each server device and other communication devices on the Internet 800. It also acquires the content data (or part of it) of the program transmitted via the communication line.
  • the connection to the router device 800R may be a wired connection or a wireless connection such as Wi-Fi (registered trademark).
  • the LAN communication unit 121 includes an encoding circuit, a decoding circuit, and the like.
  • the broadcast receiving device 100 may further include other communication units such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, an infrared communication unit, and the like.
  • the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B each receive broadcast waves of a digital broadcasting service and perform tuning (channel selection) processing to a predetermined service channel based on the control of the main control unit 101. Each further performs demodulation and waveform shaping of the modulated wave of the received signal, reconstruction of the frame structure and hierarchical structure, energy despreading, error correction decoding, and the like, and reproduces the packet stream. Each also extracts and decodes a transmission and multiplexing configuration control (TMCC) signal from the received signal.
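The chain of receive-side stages enumerated above (demodulation, waveform shaping, frame reconstruction, energy despreading, error correction decoding) can be modeled as an ordered sequence of processing stages. This is a structural sketch only; the stage functions are placeholders, not the disclosed implementation.

```python
# Structural sketch: a tuner/demodulator modelled as an ordered chain of
# stages. Each stage is a function applied to the output of the previous
# stage (e.g. demodulate, despread, correct_errors, rebuild_stream).
def make_pipeline(*stages):
    def run(signal):
        for stage in stages:
            signal = stage(signal)
        return signal
    return run

# Hypothetical usage with placeholder stages:
double_then_inc = make_pipeline(lambda x: x * 2, lambda x: x + 1)
```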
  • the first tuner/demodulator 130C can input digital broadcast waves of the current digital terrestrial broadcasting service received by the antenna 200C, which is the current digital terrestrial broadcast receiving antenna.
  • the first tuner/demodulator 130C can also input a broadcast signal of either the horizontal (H) polarization or the vertical (V) polarization of dual-polarization terrestrial digital broadcasting, which will be described later, and demodulate the segments of the layer that uses the same modulation method as the current digital terrestrial broadcasting service.
  • the first tuner/demodulator 130C can also input a broadcast signal of single-polarized digital terrestrial broadcasting, which will be described later, and demodulate the segments of the layer that uses the same modulation method as the current digital terrestrial broadcasting service.
  • the first tuner/demodulator 130C can also receive a broadcast signal of layer division multiplexed digital terrestrial broadcasting, which will be described later, and demodulate the segments of the layer that uses the same modulation method as the current digital terrestrial broadcasting service.
  • the second tuner/demodulator 130T inputs the digital broadcast wave of the advanced digital terrestrial broadcasting service received by the antenna 200T, which is a dual polarization digital terrestrial broadcast receiving antenna, via the converter 201T. Further, the second tuner/demodulator 130T may input digital broadcast waves of the advanced digital terrestrial broadcasting service received by a single-polarized digital terrestrial broadcast receiving antenna (not shown). When the second tuner/demodulator 130T inputs the digital broadcast wave of the advanced digital terrestrial broadcasting service from a single-polarized digital terrestrial broadcast receiving antenna (not shown), the converter 201T may not be used. Note that the antenna 200T that receives digital broadcast waves of dual-polarization terrestrial digital broadcasting includes an element that receives horizontally polarized signals and an element that receives vertically polarized signals.
  • a single-polarized terrestrial digital broadcast receiving antenna includes either an element for receiving a horizontally polarized signal or an element for receiving a vertically polarized signal.
  • the single-polarized digital terrestrial broadcast reception antenna may be used in common with the antenna 200C, which is the current antenna for terrestrial digital broadcast reception.
  • the third tuner/demodulator 130L inputs the digital broadcast wave of the advanced digital terrestrial broadcasting service received by the antenna 200L, which is a hierarchical division multiplexing digital terrestrial broadcast receiving antenna, via the converter 201L.
  • the fourth tuner/demodulator 130B inputs digital broadcast waves of an advanced BS (Broadcasting Satellite) digital broadcasting service or an advanced CS (Communication Satellite) digital broadcasting service, received by the antenna 200B, which is a shared BS/CS receiving antenna, via the converter 201B.
  • the antenna 200C, the antenna 200T, the antenna 200L, the antenna 200B, the converter 201T, the converter 201L, and the converter 201B do not constitute part of the broadcast receiving device 100; they belong to the equipment side, such as the building in which the broadcast receiving device 100 is installed.
  • the current terrestrial digital broadcasting described above is a broadcast signal of a terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels.
  • Dual-polarization terrestrial digital broadcasting is terrestrial digital broadcasting that uses both horizontal (H) and vertical (V) polarizations. In some of its segments, it transmits a terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels.
  • Single-polarized terrestrial digital broadcasting is terrestrial digital broadcasting that uses either horizontal (H) polarization or vertical (V) polarization; in some of its segments, a terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels is transmitted.
  • in the dual-polarization terrestrial digital broadcasting according to each embodiment of the present invention, it is possible to simultaneously transmit, in multiple segments with different polarizations, the current terrestrial digital broadcasting service, which transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, and a terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels.
  • single-polarized terrestrial digital broadcasting can transmit, in some of its divided segments, video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels using the same modulation method as the current digital terrestrial broadcasting described above.
  • in the single-polarized terrestrial digital broadcasting according to each embodiment of the present invention, it is possible to simultaneously transmit, in different segments, the current digital terrestrial broadcasting service, which transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, and a terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels.
  • layer division multiplex terrestrial digital broadcasting (advanced terrestrial digital broadcasting that employs the layer division multiplexing transmission method), which will be described later, is a broadcast signal of a terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels.
  • Hierarchical division multiplexing terrestrial digital broadcasting multiplexes a plurality of digital broadcasting signals with different signal levels. Note that digital broadcast signals with different signal levels mean that the power for transmitting the digital broadcast signals is different.
  • in the hierarchical division multiplexing terrestrial digital broadcasting according to each embodiment of the present invention, it is possible to hierarchically multiplex and transmit, as a plurality of digital broadcast signals with different signal levels in the frequency band of the same physical channel, the broadcast signal of the current terrestrial digital broadcasting service, which transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, and the broadcast signal of a terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels.
  • that is, it is possible to simultaneously transmit, in multiple layers with different signal levels, the current terrestrial digital broadcasting service, which transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, and a terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels.
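As a sketch of how two services at different signal levels can share one physical channel, a transmitter may inject the lower-layer waveform beneath the upper layer at reduced power. The -20 dB injection level below is an assumed illustrative value, not one specified by this document.

```python
# Illustrative layer division multiplexing at the transmitter: the lower
# layer is attenuated by an "injection level" (in dB) and added to the
# upper layer so that both share the same frequency band.
def ldm_combine(upper, lower, injection_db=-20.0):
    gain = 10.0 ** (injection_db / 20.0)   # -20 dB -> amplitude factor 0.1
    return [u + gain * l for u, l in zip(upper, lower)]
```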
  • the broadcast receiving device in each embodiment of the present invention may have any configuration as long as it can suitably receive advanced digital broadcasting; it is not essential to include all of the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B. For example, it is sufficient to include at least one of the second tuner/demodulator 130T and the third tuner/demodulator 130L. Further, in order to realize more advanced functions, one or more of the other tuners/demodulators may be provided in addition to the second tuner/demodulator 130T or the third tuner/demodulator 130L.
  • the antenna 200C, the antenna 200T, and the antenna 200L may be used in combination as appropriate. Further, among the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L, a plurality of tuners/demodulators may be used in common (or integrated) as appropriate.
  • to each of the first decoder section 140S and the second decoder section 140U, either a packet stream output from the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, or the fourth tuner/demodulator 130B, or a packet stream obtained from each server device on the Internet 800 via the LAN communication unit 121, is input.
  • the packet streams input to the first decoder unit 140S and the second decoder unit 140U are, for example, an MPEG (Moving Picture Experts Group)-2 TS (Transport Stream), an MPEG-2 PS (Program Stream), or a TLV (Type Length Value) stream including an MMT (MPEG Media Transport) packet stream.
  • the first decoder section 140S and the second decoder section 140U each perform conditional access (CA) processing; separation and extraction of video data, audio data, and various information data from the packet stream based on various control information included in the packet stream; decoding of video data and audio data; acquisition of program information and generation of an EPG (Electronic Program Guide); reproduction of data broadcasting screens and multimedia data; and the like. Each also performs a process of superimposing the generated EPG and reproduced multimedia data on the decoded video data and audio data.
  • the video selection unit 191 inputs the video data output from the first decoder unit 140S and the video data output from the second decoder unit 140U, and selects and/or superimposes the data as appropriate based on the control of the main control unit 101. Process. Further, the video selection unit 191 appropriately performs scaling processing, OSD (On Screen Display) data superimposition processing, and the like.
  • the monitor unit 192 is, for example, a display device such as a liquid crystal panel, and displays the video data selected and/or superimposed by the video selection unit 191, and provides the video data to the user of the broadcast receiving apparatus 100.
  • the video output unit 193 is a video output interface that outputs the video data selected and/or superimposed by the video selection unit 191 to the outside.
  • the video output interface is, for example, HDMI (High-Definition Multimedia Interface) (registered trademark).
  • the audio selection unit 194 inputs the audio data output from the first decoder unit 140S and the audio data output from the second decoder unit 140U, and selects and/or mixes the audio data as appropriate based on the control of the main control unit 101. Process.
  • the speaker unit 195 outputs the audio data selected and/or mixed by the audio selection unit 194 and provides the audio data to the user of the broadcast receiving device 100 .
  • the audio output unit 196 is an audio output interface that outputs the audio data selected and/or mixed by the audio selection unit 194 to the outside. Examples of the audio output interface include an analog headphone jack, an optical digital interface, Bluetooth, and an ARC (Audio Return Channel) assigned to an HDMI input terminal.
  • the digital interface unit 125 is an interface that outputs or inputs a packet stream containing encoded digital video data and/or digital audio data.
  • the digital interface section 125 can output, as is, the packet stream input to the first decoder section 140S or the second decoder section 140U from the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, or the fourth tuner/demodulator 130B. Further, a packet stream input from the outside via the digital interface section 125 may be input to the first decoder section 140S or the second decoder section 140U, or may be controlled to be stored in the storage section 110.
  • alternatively, the digital interface section 125 may output the video data and audio data separated and extracted by the first decoder section 140S and the second decoder section 140U. Further, video data and audio data input from the outside via the digital interface section 125 may be controlled to be input to the first decoder section 140S or the second decoder section 140U, or to be stored in the storage section 110.
  • the expansion interface unit 124 is a group of interfaces for expanding the functions of the broadcast receiving device 100, and includes a video/audio interface, a USB (Universal Serial Bus) interface, a memory interface, and the like.
  • the video/audio interface inputs video signals/audio signals from an external video/audio output device, outputs video signals/audio signals to the external video/audio input device, and so on.
  • examples of the video/audio interface include pin jacks and D terminals, which handle analog signals, and HDMI, which handles digital signals.
  • the USB interface is connected to a PC or the like to send and receive data. Broadcast programs and other content data may be recorded by connecting an HDD. Additionally, a keyboard or other USB devices may be connected.
  • the memory interface connects a memory card or other memory medium to send and receive data.
  • the operation input unit 180 is an instruction input unit for inputting operation instructions to the broadcast receiving apparatus 100, and consists of a remote control reception unit that receives commands transmitted from a remote controller (not shown) and operation keys in which button switches are arranged. Only one of these may be provided. Further, the operation input section 180 may be replaced with a touch panel or the like arranged over the monitor section 192, or with a keyboard or the like connected to the expansion interface unit 124.
  • the remote control can be replaced by a portable information terminal 700 equipped with a remote control command sending function. Note that any of the "keys" provided on the remote control described in the following embodiments may equally be expressed as "buttons".
  • when the broadcast receiving device 100 is a television receiver or the like, the video output section 193 and the audio output section 196 are not essential components. Further, the broadcast receiving device 100 may be an optical disk drive recorder such as a DVD (Digital Versatile Disc) recorder, a magnetic disk drive recorder such as an HDD recorder, an STB (Set Top Box), or the like, or may be a PC (Personal Computer), a tablet terminal, or the like equipped with a function of receiving a digital broadcasting service. When the broadcast receiving device 100 is a DVD recorder, an HDD recorder, an STB, or the like, the monitor section 192 and the speaker section 195 are not essential components; by connecting an external monitor and external speakers to the video output section 193 and the audio output section 196 or to the digital interface section 125, operations similar to those of a television receiver or the like are possible.
  • FIG. 2B is a block diagram showing an example of a detailed configuration of the first tuner/demodulator 130C.
  • the channel selection/detection unit 131C inputs the current digital broadcast wave received by the antenna 200C, and selects a channel based on the channel selection control signal.
  • the TMCC decoding section 132C extracts the TMCC signal from the output signal of the channel selection/detection section 131C and obtains various TMCC information.
  • the acquired TMCC information is used to control each subsequent process. Details of the TMCC signal and TMCC information will be described later.
  • based on TMCC information and the like, the demodulator 133C inputs a modulated wave modulated using a method such as QPSK (Quadrature Phase Shift Keying), DQPSK (Differential QPSK), or 16QAM (Quadrature Amplitude Modulation), and performs demodulation processing including frequency deinterleaving, time deinterleaving, carrier demapping, and the like.
  • the demodulation unit 133C may be capable of supporting a modulation method different from each of the above-mentioned modulation methods.
  • the stream playback unit 134C performs layer division processing, inner code error correction processing such as Viterbi decoding, energy despreading processing, stream playback processing, outer code error correction processing such as RS (Reed Solomon) decoding, and the like. Note that as the error correction process, a method different from each of the above-mentioned methods may be used. Further, the packet stream reproduced and output by the stream reproduction unit 134C is, for example, MPEG-2 TS. Other formats of packet streams may also be used.
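Energy despreading, mentioned above, is conventionally an XOR of the bit stream with a pseudo-random binary sequence (PRBS); applying the identical operation a second time restores the original bits. The sketch below uses the PRBS polynomial x^15 + x^14 + 1 and a seed as cited for ISDB-T, but the exact tap arithmetic here is an assumption for illustration.

```python
# Illustrative energy dispersal/despreading: XOR the bit stream with a
# PRBS generated by x^15 + x^14 + 1 (seed per ISDB-T conventions; the
# tap positions in this sketch are an assumption). The operation is an
# involution: applying it twice returns the original bits.
def energy_dispersal(bits, seed=0b100101010000000):
    state = seed
    out = []
    for b in bits:
        fb = ((state >> 14) ^ (state >> 13)) & 1   # taps at x^15 and x^14
        state = ((state << 1) | fb) & 0x7FFF       # 15-bit shift register
        out.append(b ^ fb)
    return out
```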
  • FIG. 2C is a block diagram showing an example of a detailed configuration of the second tuner/demodulator 130T.
  • the channel selection/detection unit 131H inputs the horizontal (H) polarized signal of the digital broadcast wave received by the antenna 200T, and selects a channel based on the channel selection control signal.
  • the channel selection/detection unit 131V inputs the vertical (V) polarized wave signal of the digital broadcast wave received by the antenna 200T, and selects a channel based on the channel selection control signal. Note that the operation of the channel selection process in the channel selection/detection section 131H and the operation of the channel selection process in the channel selection/detection section 131V may be controlled in conjunction with each other, or may be controlled independently.
  • by treating the channel selection/detection section 131H and the channel selection/detection section 131V as one channel selection/detection section, it is possible to perform control to select one channel of a digital broadcasting service transmitted using both horizontal and vertical polarization. By treating the channel selection/detection section 131H and the channel selection/detection section 131V as two independent channel selection/detection sections, it is also possible to perform control to select two different channels of a digital broadcasting service transmitted using only horizontally polarized waves (or only vertically polarized waves).
  • the horizontal (H) polarized signal and the vertical (V) polarized signal received by the second tuner/demodulator 130T of the broadcast receiving device in each embodiment of the present invention may be any pair of broadcast-wave polarization signals whose polarization directions differ by approximately 90 degrees, and the configurations described below relating to the horizontal (H) polarized signal, the vertical (V) polarized signal, and their reception may be reversed.
  • the TMCC decoding unit 132H extracts the TMCC signal from the output signal of the channel selection/detection unit 131H and obtains various TMCC information.
  • the TMCC decoding unit 132V extracts the TMCC signal from the output signal of the channel selection/detection unit 131V and obtains various TMCC information. Only one of the TMCC decoding section 132H and the TMCC decoding section 132V may be provided. The acquired TMCC information is used to control each subsequent process.
  • based on TMCC information and the like, the demodulation unit 133H and the demodulation unit 133V each input a modulated wave modulated using a method such as BPSK (Binary Phase Shift Keying), DBPSK (Differential BPSK), QPSK, DQPSK, 8PSK (8 Phase Shift Keying), 16APSK (16 Amplitude and Phase Shift Keying), 32APSK, 16QAM, 64QAM, 256QAM, or 1024QAM, and perform demodulation processing including frequency deinterleaving, time deinterleaving, carrier demapping, and the like.
  • the demodulating section 133H and the demodulating section 133V may be capable of supporting a modulation method different from each of the above-mentioned modulation methods.
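Carrier demapping, named in the demodulation processing above, turns received constellation points back into bits. A minimal hard-decision QPSK demapper is sketched below; the mapping convention (a non-negative component decodes to bit 0) is an assumption for illustration, as actual broadcast systems fix the mapping in their standards and typically use soft decisions for the error-correction decoder.

```python
# Hard-decision QPSK demapping sketch (assumed mapping: a non-negative
# I or Q component decodes to bit 0, a negative component to bit 1).
def qpsk_demap(symbols):
    bits = []
    for s in symbols:
        bits.append(0 if s.real >= 0 else 1)   # bit from in-phase axis
        bits.append(0 if s.imag >= 0 else 1)   # bit from quadrature axis
    return bits
```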
  • the stream playback unit 134H and the stream playback unit 134V perform layer division processing, inner code error correction processing such as Viterbi decoding and LDPC (Low Density Parity Check) decoding, energy despreading processing, stream playback processing, RS decoding, and BCH decoding, respectively. Performs outer code error correction processing, etc.
  • the packet stream reproduced and output by the stream reproduction unit 134H is, for example, MPEG-2 TS.
  • the packet stream reproduced and output by the stream reproduction unit 134V is, for example, a TLV including an MPEG-2 TS or an MMT packet stream. Each of these may be a packet stream in another format.
  • the channel selection/detection unit 131V, TMCC decoding unit 132V, and demodulation unit 133V may not be provided.
  • in this case, the signal of the segment that transmits the current digital terrestrial broadcasting service is stream-reproduced by the stream playback unit 134H.
  • the signal of the segment that transmits the advanced terrestrial digital broadcasting service is input to the stream playback unit 134V.
  • FIG. 2D is a block diagram showing an example of a detailed configuration of the third tuner/demodulator 130L.
  • the channel selection/detection unit 131L receives digital broadcast waves that have been subjected to layered division multiplexing (LDM) processing from the antenna 200L, and selects a channel based on a channel selection control signal.
  • digital broadcast waves that have been subjected to layer division multiplexing can be used to transmit different digital broadcasting services (or different channels of the same broadcasting service) in the modulated wave of the upper layer (UL) and the modulated wave of the lower layer (LL).
  • the modulated wave of the upper layer is output to the demodulator 133S, and the modulated wave of the lower layer is output to the demodulator 133L.
  • the TMCC decoding unit 132L inputs the upper layer modulated wave and the lower layer modulated wave output from the channel selection/detection unit 131L, extracts the TMCC signal, and obtains various TMCC information. Note that the signal input to the TMCC decoding unit 132L may be only the modulated wave of the upper layer or only the modulated wave of the lower layer.
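One common receiver strategy for separating the two layers (assumed here for illustration; the document does not prescribe the algorithm) is successive cancellation: detect the upper layer while treating the lower layer as noise, re-modulate the detected symbols, subtract them from the received signal, and rescale the residue by the injection level.

```python
# Successive-cancellation sketch for layer division multiplexing.
# detect_upper / remodulate are caller-supplied placeholders for the
# upper layer's symbol decision and re-modulation; the -20 dB injection
# level is an assumed illustrative value.
def ldm_split(received, detect_upper, remodulate, injection_db=-20.0):
    gain = 10.0 ** (injection_db / 20.0)
    upper = [detect_upper(r) for r in received]
    lower = [(r - remodulate(u)) / gain for r, u in zip(received, upper)]
    return upper, lower
```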
  • FIG. 2E is a block diagram showing an example of a detailed configuration of the fourth tuner/demodulator 130B.
  • the channel selection/detection unit 131B inputs the digital broadcast waves of the advanced BS digital broadcasting service and the advanced CS digital broadcasting service received by the antenna 200B, and selects a channel based on the channel selection control signal. Other operations are the same as those of the channel selection/detection section 131H and the channel selection/detection section 131V, so detailed explanation will be omitted.
  • the TMCC decoding unit 132B, the demodulation unit 133B, and the stream playback unit 134B also operate in the same manner as the TMCC decoding units 132H and 132V, the demodulation units 133H and 133V, and the stream playback unit 134V, respectively, so detailed explanation will be omitted.
  • FIG. 2F is a block diagram showing an example of a detailed configuration of the first decoder section 140S.
  • under the control of the main control unit 101, the selection unit 141S selects and outputs one of the packet stream input from the first tuner/demodulation unit 130C, the packet stream input from the second tuner/demodulation unit 130T, and the packet stream input from the third tuner/demodulation unit 130L.
  • the packet streams input from the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L are, for example, MPEG-2 TS.
  • the CA descrambler 142S performs processing for canceling a predetermined scrambling encryption algorithm based on various control information related to conditional access superimposed on the packet stream.
  • the demultiplexer 143S is a stream decoder, and separates and extracts video data, audio data, superimposed text data, subtitle data, program information data, etc. based on various control information included in the input packet stream.
  • the separated and extracted video data is distributed to the video decoder 145S
  • the separated and extracted audio data is distributed to the audio decoder 146S
  • the separated and extracted character superimposition data, subtitle data, program information data, etc. are distributed to the data decoder 144S.
  • a packet stream (eg, MPEG-2 PS, etc.) obtained from a server device on the Internet 800 via the LAN communication unit 121 may be input to the demultiplexing unit 143S.
  • the demultiplexer 143S can output the packet stream input from the first tuner/demodulator 130C, the second tuner/demodulator 130T, or the third tuner/demodulator 130L to the outside via the digital interface unit 125, and can input a packet stream obtained from the outside via the digital interface section 125.
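The separation performed by the demultiplexer 143S can be illustrated at the transport level: an MPEG-2 TS consists of 188-byte packets, each carrying a 13-bit PID that identifies its elementary stream (video, audio, subtitles, program information). The sketch below merely groups packets by PID; a real demultiplexer would additionally consult the PAT/PMT tables to learn which PID carries which stream.

```python
# Minimal MPEG-2 TS demultiplexing sketch: group 188-byte packets by PID.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def demux_by_pid(ts_bytes):
    streams = {}
    for i in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = ts_bytes[i:i + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            continue  # out of sync; a real demultiplexer would resynchronise
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # 13-bit packet identifier
        streams.setdefault(pid, []).append(pkt)
    return streams
```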
  • the video decoder 145S performs decoding processing on the compression-encoded video data input from the demultiplexer 143S, and performs colorimetry conversion processing, dynamic range conversion processing, and the like on the decoded video information.
  • further, resolution conversion (up/down conversion) processing based on the control of the main control unit 101 is performed as appropriate, and video data is output at a resolution of UHD (3840 horizontal pixels x 2160 vertical pixels), HD (1920 horizontal pixels x 1080 vertical pixels), or SD (720 horizontal pixels x 480 vertical pixels). Video data may be output at other resolutions.
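Up/down conversion between SD, HD, and UHD resolutions can be sketched with nearest-neighbour resampling. This method choice is an assumption for illustration only; actual receivers use higher-quality polyphase or filter-based scalers.

```python
# Nearest-neighbour resolution conversion sketch (illustrative only).
# frame: list of rows of pixel values; returns an out_h x out_w frame.
def scale_nearest(frame, out_w, out_h):
    in_h, in_w = len(frame), len(frame[0])
    return [[frame[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]
```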
  • the audio decoder 146S performs decoding processing of compressed and encoded audio information.
  • a plurality of video decoders 145S and audio decoders 146S may be provided in order to simultaneously perform decoding processing of video data and audio data.
  • the data decoder 144S performs processes such as generating an EPG based on program information data, generating a data broadcasting screen based on BML data, and controlling a cooperative application based on a broadcast communication cooperative function.
  • the data decoder 144S has a BML browser function that executes a BML document, and data broadcasting screen generation processing is executed by the BML browser function.
  • the data decoder 144S also performs processes such as decoding character superimposition data to generate character superimposition information and decoding subtitle data to generate subtitle information.
  • the superimposing section 147S, the superimposing section 148S, and the superimposing section 149S perform superimposition processing on the video data output from the video decoder 145S and the EPG, data broadcast screen, etc. output from the data decoder 144S, respectively.
  • the synthesis unit 151S performs a process of synthesizing the audio data output from the audio decoder 146S and the audio data reproduced by the data decoder 144S.
  • the selection unit 150S selects the resolution of video data based on the control of the main control unit 101. Note that the functions of the superimposing section 147S, the superimposing section 148S, the superimposing section 149S, and the selection section 150S may be integrated with the video selection section 191. The functions of the synthesis section 151S may be integrated with the voice selection section 194.
  • FIG. 2G is a block diagram showing an example of a detailed configuration of the second decoder section 140U.
  • the selection unit 141U selects and outputs one of the packet streams input from the second tuner/demodulation unit 130T, the third tuner/demodulation unit 130L, and the fourth tuner/demodulation unit 130B.
  • the packet streams input from the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B may be, for example, MMT packet streams or TLV packet streams including MMT packet streams, or may be MPEG-2 TS format packet streams that use HEVC (High Efficiency Video Coding) or the like as the video compression method.
  • the CA descrambler 142U performs descrambling processing that cancels encryption applied with a predetermined scrambling algorithm, based on various control information related to conditional access superimposed on the packet stream.
  • the demultiplexer 143U is a stream decoder, and separates and extracts video data, audio data, superimposed text data, subtitle data, program information data, etc. based on various control information included in the input packet stream.
  • the separated and extracted video data is distributed to the video decoder 145U
  • the separated and extracted audio data is distributed to the audio decoder 146U
  • the separated and extracted character superimposition data, subtitle data, program information data, etc. are distributed to the multimedia decoder 144U.
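The distribution of separated data described above can be sketched as a simple dispatch by payload type. The packet representation and route names are assumptions for illustration and do not reflect the actual MMT or MPEG-2 TS formats:

```python
# Illustrative sketch of the demultiplexing step: each packet is routed to the
# video decoder, audio decoder, or multimedia decoder by its payload type.
# Packet shape (type, payload) and the type names are hypothetical.

from collections import defaultdict

ROUTES = {
    "video": "video_decoder_145U",
    "audio": "audio_decoder_146U",
    "superimpose": "multimedia_decoder_144U",
    "subtitle": "multimedia_decoder_144U",
    "program_info": "multimedia_decoder_144U",
}

def demultiplex(packets):
    """Group payloads into per-decoder queues according to ROUTES."""
    queues = defaultdict(list)
    for ptype, payload in packets:
        queues[ROUTES[ptype]].append(payload)
    return queues

stream = [("video", b"v0"), ("audio", b"a0"), ("subtitle", b"s0")]
q = demultiplex(stream)
print(sorted(q))  # ['audio_decoder_146U', 'multimedia_decoder_144U', 'video_decoder_145U']
```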
  • a packet stream (e.g., an MPEG-2 PS, an MMT packet stream, etc.) acquired from a server device on the Internet 800 via the LAN communication unit 121 may be input to the demultiplexing unit 143U.
  • the demultiplexing unit 143U can output the packet stream input from the second tuner/demodulator 130T, the third tuner/demodulator 130L, or the fourth tuner/demodulator 130B to the outside via the digital interface unit 125, and can also receive as input a packet stream obtained from the outside via the digital interface unit 125.
  • the multimedia decoder 144U performs a process of generating an EPG based on program information data, a process of generating a multimedia screen based on multimedia data, a process of controlling a cooperative application based on a broadcast communication cooperative function, and the like.
  • the multimedia decoder 144U has an HTML browser function that executes HTML documents, and multimedia screen generation processing is executed by the HTML browser function.
  • the video decoder 145U, the audio decoder 146U, the superimposing sections 147U, 148U, and 149U, the synthesis section 151U, and the selection section 150U are components having the same functions as the video decoder 145S, the audio decoder 146S, the superimposing sections 147S, 148S, and 149S, the synthesis section 151S, and the selection section 150S, respectively. Their operation can be understood by replacing the reference numerals in the earlier description of those components of the first decoder section 140S, so a separate detailed explanation is omitted.
  • FIG. 2H is a software configuration diagram of the broadcast receiving apparatus 100, and shows an example of the software configuration in the storage unit 110 (or ROM 103, hereinafter the same) and RAM 104.
  • the storage unit 110 stores a basic operation program 1001, a reception function program 1002, a browser program 1003, a content management program 1004, and other operation programs 1009.
  • the storage unit 110 also includes a content storage area 1011 that stores content data such as video, still images, and audio; an authentication information storage area 1012 that stores authentication information used for communication and cooperation with external mobile terminal devices, server devices, etc.; and a various information storage area 1019 that stores various other information.
  • the basic operation program 1001 stored in the storage unit 110 is expanded into the RAM 104, and the main control unit 101 further executes the expanded basic operation program to configure the basic operation control unit 1101. Further, the reception function program 1002, browser program 1003, and content management program 1004 stored in the storage unit 110 are each expanded into the RAM 104, and the main control unit 101 executes each of the expanded operation programs. This configures a reception function control unit 1102, a browser engine 1103, and a content management unit 1104. Furthermore, the RAM 104 is provided with a temporary storage area 1200 that temporarily holds data created when each operating program is executed, as needed.
  • In the following, for simplicity, the process in which the main control unit 101 controls each operation block by expanding the basic operation program 1001 stored in the storage unit 110 into the RAM 104 and executing it is described as the basic operation control unit 1101 controlling each operation block. Similar descriptions are used for the other operating programs.
  • the reception function control unit 1102 performs basic control of the broadcast reception function, broadcast communication cooperation function, etc. of the broadcast reception device 100.
  • the channel selection/demodulation section 1102a mainly controls processes such as channel selection processing and TMCC information acquisition processing in the first tuner/demodulation unit 130C, the second tuner/demodulation unit 130T, the third tuner/demodulation unit 130L, the fourth tuner/demodulation unit 130B, and the like.
  • the stream playback control unit 1102b mainly controls processes such as layer division processing, error correction decoding processing, and energy dispersal removal processing in the first tuner/demodulation unit 130C, the second tuner/demodulation unit 130T, the third tuner/demodulation unit 130L, the fourth tuner/demodulation unit 130B, and the like.
  • the AV decoder section 1102c mainly controls demultiplexing processing (stream decoding processing), video data decoding processing, audio data decoding processing, etc. in the first decoder section 140S, the second decoder section 140U, and the like.
  • the multimedia (MM) data playback unit 1102d mainly controls processes such as BML data playback processing, character superimposition data decoding processing, subtitle data decoding processing, and communication cooperation application control processing in the first decoder unit 140S, and HTML data playback processing, multimedia screen generation processing, and communication cooperation application control processing in the second decoder unit 140U.
  • the EPG generation unit 1102e mainly controls EPG generation processing and display processing of the generated EPG in the first decoder unit 140S and the second decoder unit 140U.
  • the presentation processing unit 1102f controls colorimetry conversion processing, dynamic range conversion processing, resolution conversion processing, audio downmix processing, etc. in the first decoder unit 140S and the second decoder unit 140U, and also controls the video selection unit 191, the audio selection unit 194, and the like.
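One of the tasks named above, audio downmix processing, can be illustrated with a small sketch. The ITU-R BS.775-style coefficients used here are an assumption for illustration; the embodiment does not specify a downmix formula:

```python
# Illustrative sketch of downmixing one 5.1-channel audio frame to stereo.
# The 1/sqrt(2) centre/surround coefficient is an assumed, BS.775-style
# choice, not taken from the patent; the LFE channel is simply discarded.

import math

K = 1 / math.sqrt(2)  # ≈ 0.707, centre/surround mix coefficient

def downmix_5_1(l, r, c, lfe, ls, rs):
    """Return (left, right) stereo samples from one 5.1 frame."""
    return (l + K * c + K * ls, r + K * c + K * rs)

left, right = downmix_5_1(0.2, 0.1, 0.4, 0.0, 0.1, 0.3)
```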
  • the BML browser 1103a and the HTML browser 1103b of the browser engine 1103 interpret BML documents and HTML documents during the aforementioned BML data playback processing and HTML data playback processing, and perform data broadcasting screen generation processing and multimedia screen generation processing.
  • the content management unit 1104 manages time schedules and execution control for recording reservations and viewing reservations of broadcast programs, performs copyright management when broadcast programs, recorded programs, etc. are output from the digital interface unit 125, the LAN communication unit 121, etc., and performs expiration date management of cooperative applications acquired based on the broadcast communication cooperation function.
  • Each of the operating programs may be stored in advance in the storage unit 110 and/or the ROM 103 at the time of product shipment.
  • the programs may be obtained from a server device on the Internet 800 via the LAN communication unit 121 or the like after the product is shipped.
  • each of the operating programs stored in a memory card, an optical disk, or the like may be acquired via the expansion interface unit 124 or the like. It may be newly acquired or updated via broadcast waves.
  • FIG. 3A is an example of an internal configuration of the broadcast station server 400.
  • the broadcast station server 400 includes a main control section 401, a system bus 402, a RAM 404, a storage section 410, a LAN communication section 421, and a digital broadcast signal transmission section 460 as components.
  • the main control unit 401 is a microprocessor unit that controls the entire broadcasting station server 400 according to a predetermined operating program.
  • the system bus 402 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 401 and each operational block within the broadcast station server 400.
  • the RAM 404 serves as a work area when each operating program is executed.
  • the storage unit 410 stores a basic operation program 4001, a content management/distribution program 4002, and a content transmission program 4003, and further includes a content data storage area 4011 and a metadata storage area 4012.
  • the content data storage area 4011 stores content data of each broadcast program broadcast by a broadcast station.
  • the metadata storage area 4012 stores metadata such as the program title, program ID, program summary, performers, broadcast date and time of each of the broadcast programs.
  • the basic operation program 4001, content management/distribution program 4002, and content transmission program 4003 stored in the storage unit 410 are each expanded into the RAM 404, and the main control unit 401 executes the expanded programs, thereby configuring a basic operation control section 4101, a content management/distribution control section 4102, and a content transmission control section 4103.
  • the content management/distribution control unit 4102 manages the content data, metadata, etc. stored in the content data storage area 4011 and the metadata storage area 4012, and controls the provision of the content data, metadata, etc. to service providers based on contracts. Furthermore, when providing content data, metadata, etc. to a service provider, the content management/distribution control unit 4102 also performs authentication processing of the service provider server 500 as necessary.
  • the content transmission control unit 4103 performs time schedule management and the like when transmitting, via the digital broadcast signal transmitting section 460, a stream composed of the content data of each broadcast program stored in the content data storage area 4011 and the program title, program ID, copy control information of the program content, etc. stored in the metadata storage area 4012.
  • the LAN communication unit 421 is connected to the Internet 800 and communicates with the service provider server 500 and other communication devices on the Internet 800.
  • the LAN communication unit 421 includes an encoding circuit, a decoding circuit, and the like.
  • the digital broadcast signal transmitting unit 460 performs processing such as modulation on the stream composed of the content data and program information data of each broadcast program stored in the content data storage area 4011, and sends the stream out as a digital broadcast wave via the radio tower 300.
  • FIG. 3B is an example of the internal configuration of the service provider server 500.
  • the service provider server 500 includes a main control section 501, a system bus 502, a RAM 504, a storage section 510, and a LAN communication section 521.
  • the main control unit 501 is a microprocessor unit that controls the entire service provider server 500 according to a predetermined operating program.
  • the system bus 502 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 501 and each operation block in the service provider server 500.
  • the RAM 504 serves as a work area when each operating program is executed.
  • the storage unit 510 stores a basic operation program 5001, a content management/distribution program 5002, and an application management/distribution program 5003, and further includes a content data storage area 5011, a metadata storage area 5012, and an application storage area 5013.
  • the content data storage area 5011 and the metadata storage area 5012 store content data and metadata provided by the broadcasting station server 400, content produced by a service provider, metadata related to the content, and the like.
  • the application storage area 5013 stores applications (operating programs and/or various data, etc.) that are necessary for realizing each service of the broadcasting and communication cooperation system and are distributed in response to requests from each television receiver.
  • the basic operation program 5001, content management/distribution program 5002, and application management/distribution program 5003 stored in the storage unit 510 are each expanded into the RAM 504, and the main control unit 501 executes the expanded programs, thereby configuring a basic operation control section 5101, a content management/distribution control section 5102, and an application management/distribution control section 5103.
  • the content management/distribution control unit 5102 acquires content data, metadata, etc. from the broadcasting station server 400, manages the content data, metadata, etc. stored in the content data storage area 5011 and the metadata storage area 5012, and controls the distribution of the content data, metadata, etc. to each television receiver. Further, the application management/distribution control unit 5103 manages each application stored in the application storage area 5013 and controls the distribution of each application in response to requests from each television receiver. Furthermore, when distributing each application to a television receiver, the application management/distribution control unit 5103 also performs authentication processing of the television receiver, etc., as necessary.
  • the LAN communication unit 521 is connected to the Internet 800 and communicates with the broadcasting station server 400 and other communication devices on the Internet 800. It also communicates with the broadcast receiving device 100 and the mobile information terminal 700 via the router device 800R.
  • the LAN communication unit 521 includes an encoding circuit, a decoding circuit, and the like.
  • FIG. 3C is a block diagram showing an example of the internal configuration of mobile information terminal 700.
  • the mobile information terminal 700 includes a main control section 701, a system bus 702, a ROM 703, a RAM 704, a storage section 710, a communication processing section 720, an expansion interface section 724, an operation section 730, an image processing section 740, an audio processing section 750, and a sensor section 760.
  • the main control unit 701 is a microprocessor unit that controls the entire mobile information terminal 700 according to a predetermined operating program.
  • the system bus 702 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 701 and each operational block within the mobile information terminal 700.
  • the ROM 703 is a nonvolatile memory in which basic operating programs such as an operating system and other operating programs are stored, and for example, a rewritable ROM such as an EEPROM or a flash ROM is used. Further, the ROM 703 stores operation setting values and the like necessary for the operation of the mobile information terminal 700.
  • the RAM 704 serves as a work area when executing the basic operation program and other operation programs.
  • the ROM 703 and the RAM 704 may be integrated with the main control unit 701. Further, the ROM 703 may not have an independent configuration as shown in FIG. 3C, but may use a part of the storage area within the storage unit 710.
  • the storage unit 710 stores the operating program and operation setting values of the portable information terminal 700, the personal information of the user of the portable information terminal 700, and the like. Further, it is possible to store operating programs downloaded via the Internet 800 and various data created using the operating programs. Further, content downloaded via the Internet 800, such as moving images, still images, and audio, can also be stored. All or part of the functions of the ROM 703 may be replaced by a partial area of the storage section 710. Furthermore, the storage unit 710 needs to retain stored information even when power is not supplied to the portable information terminal 700 from the outside. Therefore, for example, devices such as semiconductor element memories such as flash ROMs and SSDs, and magnetic disk drives such as HDDs are used.
  • each of the operating programs stored in the ROM 703 and the storage unit 710 can be added, updated, and expanded in functionality by downloading from each server device on the Internet 800.
  • the communication processing unit 720 includes a LAN communication unit 721, a mobile telephone network communication unit 722, and an NFC communication unit 723.
  • the LAN communication unit 721 is connected to the Internet 800 via the router device 800R, and sends and receives data to and from each server device and other communication devices on the Internet 800. Connection with the router device 800R is performed by wireless connection such as Wi-Fi (registered trademark).
  • the mobile telephone network communication unit 722 performs telephone communication (call) and data transmission/reception through wireless communication with the base station 600B of the mobile telephone communication network.
  • the NFC communication unit 723 performs wireless communication when in close proximity to a corresponding reader/writer.
  • the LAN communication section 721, mobile telephone network communication section 722, and NFC communication section 723 each include an encoding circuit, a decoding circuit, an antenna, and the like. Further, the communication processing unit 720 may further include other communication units such as a Bluetooth (registered trademark) communication unit and an infrared communication unit.
  • the expansion interface unit 724 is a group of interfaces for expanding the functions of the mobile information terminal 700, and in this embodiment, it is assumed to be composed of a video/audio interface, a USB interface, a memory interface, etc.
  • the video/audio interface inputs video signals/audio signals from an external video/audio output device, outputs video signals/audio signals to the external video/audio input device, and so on.
  • the USB interface is connected to a PC or the like to send and receive data. Additionally, a keyboard or other USB devices may be connected.
  • the memory interface connects a memory card or other memory medium to send and receive data.
  • the operation unit 730 is an instruction input unit for inputting operation instructions to the mobile information terminal 700, and in this embodiment it is composed of a touch panel 730T arranged over the display unit 741 and operation keys 730K in which button switches are arranged. It may be only one of them.
  • the portable information terminal 700 may be operated using a keyboard or the like connected to the expansion interface section 724.
  • the portable information terminal 700 may be operated using a separate terminal device connected by wired communication or wireless communication. That is, the portable information terminal 700 may be operated from the broadcast receiving device 100. Further, the touch panel function may be provided in the display section 741.
  • the image processing section 740 includes a display section 741, an image signal processing section 742, a first image input section 743, and a second image input section 744.
  • the display unit 741 is, for example, a display device such as a liquid crystal panel, and provides image data processed by the image signal processing unit 742 to the user of the mobile information terminal 700.
  • the image signal processing section 742 includes a video RAM (not shown), and the display section 741 is driven based on image data input to the video RAM. Further, the image signal processing unit 742 has a function of performing format conversion, superimposition processing of menus and other OSD (On Screen Display) signals, etc. as necessary.
  • the first image input unit 743 and the second image input unit 744 are camera units that convert light input through a lens into an electrical signal using an electronic device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and input image data of the surroundings and objects.
  • the audio processing section 750 includes an audio output section 751, an audio signal processing section 752, and an audio input section 753.
  • the audio output unit 751 is a speaker, and provides the user of the mobile information terminal 700 with an audio signal processed by the audio signal processing unit 752.
  • the voice input unit 753 is a microphone, and converts the user's voice into voice data and inputs the voice data.
  • the sensor unit 760 is a group of sensors for detecting the state of the mobile information terminal 700, and in this embodiment it includes a GPS receiving unit 761, a gyro sensor 762, a geomagnetic sensor 763, an acceleration sensor 764, an illuminance sensor 765, and a proximity sensor 766. These sensors make it possible to detect the position, inclination, direction, movement, surrounding brightness, proximity of surrounding objects, etc. of the mobile information terminal 700. The mobile information terminal 700 may further include other sensors such as an atmospheric pressure sensor.
  • the mobile information terminal 700 may be a mobile phone, a smart phone, a tablet terminal, or the like. It may be a PDA (Personal Digital Assistant) or a notebook PC. Further, it may be a digital still camera, a video camera capable of shooting moving images, a portable game machine, a navigation device, or other portable digital equipment.
  • the configuration example of the mobile information terminal 700 shown in FIG. 3C includes many components that are not essential to this embodiment, such as the sensor section 760, and the effectiveness of this embodiment is not impaired even in a configuration that does not include them. Furthermore, components not shown, such as a digital broadcast reception function and an electronic money payment function, may be added.
  • FIG. 3D is a software configuration diagram of the mobile information terminal 700, and shows an example of the software configuration in the ROM 703, RAM 704, and storage unit 710.
  • the ROM 703 stores a basic operation program 7001 and other operation programs.
  • the storage unit 710 stores a cooperation control program 7002 and other operating programs.
  • the storage unit 710 also includes a content storage area 7200 that stores content data such as video, still images, and audio; an authentication information storage area 7300 that stores authentication information necessary for accessing the television receiver and each server device; and a various information storage area that stores various other information.
  • the basic operation program 7001 stored in the ROM 703 is expanded to the RAM 704, and the main control unit 701 further executes the expanded basic operation program to configure the basic operation execution unit 7101.
  • the cooperation control program 7002 stored in the storage unit 710 is similarly expanded to the RAM 704, and further, the main control unit 701 configures the cooperation control execution unit 7102 by executing the expanded cooperation control program.
  • the RAM 704 is provided with a temporary storage area that temporarily holds data created when each operating program is executed, as needed.
  • the cooperation control execution unit 7102 manages device authentication and connection, transmission and reception of each data, etc. when the mobile information terminal 700 performs a cooperation operation with the television receiver. Further, the cooperation control execution unit 7102 is provided with a browser engine function for executing an application that works in conjunction with the television receiver.
  • Each of the operating programs may be stored in advance in the ROM 703 and/or the storage unit 710 at the time of product shipment. The programs may also be obtained from a server device on the Internet 800 via the LAN communication section 721 or the mobile telephone network communication section 722 after the product is shipped. Further, each of the operating programs stored in a memory card, an optical disk, etc. may be acquired via the expansion interface unit 724 or the like.
  • the broadcast receiving device 100 is capable of receiving terrestrial digital broadcasting services that share at least some specifications with the ISDB-T (Integrated Services Digital Broadcasting) system.
  • dual-polarization terrestrial digital broadcasting and single-polarization terrestrial digital broadcasting that can be received by the second tuner/demodulator 130T are advanced terrestrial digital broadcasting that shares some specifications with the ISDB-T system.
  • the hierarchical division multiplexing terrestrial digital broadcasting that can be received by the third tuner/demodulator 130L is an advanced terrestrial digital broadcasting that shares some specifications with the ISDB-T system.
  • the current terrestrial digital broadcasting that can be received by the first tuner/demodulator 130C is ISDB-T type terrestrial digital broadcasting.
  • advanced BS digital broadcasting and advanced CS digital broadcasting that can be received by the fourth tuner/demodulator 130B are digital broadcasting that is different from the ISDB-T system.
  • the dual-polarization terrestrial digital broadcasting, the single-polarization terrestrial digital broadcasting, and the hierarchical division multiplexing terrestrial digital broadcasting according to this embodiment use OFDM (Orthogonal Frequency Division Multiplexing), which is one of the multicarrier systems, as a transmission method, similar to the ISDB-T system.
  • In OFDM, the symbol length is long, and it is effective to add a redundant portion in the time axis direction called a guard interval; the effects of multipath can be reduced as long as the delay falls within the range of the guard interval. Therefore, an SFN (Single Frequency Network) can be realized, and effective use of frequencies is possible.
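The relationship between guard interval and multipath tolerance is simple arithmetic, sketched below. The symbol length and guard ratio used are assumed, ISDB-T mode-3-like example values, not parameters specified by this embodiment:

```python
# Rough arithmetic sketch of why a guard interval enables SFN operation:
# a multipath echo is harmless if its delay fits inside the guard interval.
# The 1.008 ms symbol length and 1/8 guard ratio are assumed example values.

C = 299_792_458  # speed of light, m/s

def guard_tolerance(symbol_len_s, guard_ratio):
    """Return (max tolerable echo delay in s, equivalent path difference in km)."""
    delay = symbol_len_s * guard_ratio
    return delay, C * delay / 1000.0

delay, km = guard_tolerance(1.008e-3, 1 / 8)
print(round(delay * 1e6), round(km, 1))  # 126 (µs) 37.8 (km)
```

Echoes from transmitters up to roughly this path difference apart therefore add constructively instead of causing inter-symbol interference, which is what makes a single-frequency network feasible.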
  • the OFDM carriers are divided into groups called segments, as in the ISDB-T system. As shown in FIG. 4A, one channel bandwidth of the digital broadcasting service is composed of 13 segments. The center segment of the band is designated segment 0, and segment numbers (0 to 12) are sequentially assigned above and below it.
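The segment numbering just described can be sketched as follows. The convention that odd numbers extend below the center and even numbers above is the ISDB-T physical arrangement, assumed here for illustration:

```python
# Sketch of the segment numbering described above: segment 0 sits at the band
# centre and numbers 1..12 are assigned alternately below and above it.
# Returns the segment numbers in low-to-high frequency order.

def segment_order(n_segments=13):
    below, above = [], []
    for seg in range(1, n_segments):          # segments 1..12
        (below if seg % 2 else above).append(seg)
    # odd numbers extend downward in frequency, even numbers upward
    return list(reversed(below)) + [0] + above

print(segment_order())  # [11, 9, 7, 5, 3, 1, 0, 2, 4, 6, 8, 10, 12]
```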
  • Transmission path encoding of dual-polarization terrestrial digital broadcasting, single-polarization terrestrial digital broadcasting, and hierarchical division multiplexing terrestrial digital broadcasting according to this embodiment is performed in units of OFDM segments.
  • each layer is composed of one or more OFDM segments, and parameters such as carrier modulation method, inner code coding rate, time interleave length, etc. can be set for each layer.
  • the number of layers may be set arbitrarily; for example, it may be set to a maximum of three layers.
  • FIG. 4B shows an example of layer allocation of OFDM segments when the number of layers is 3 or 2. In the example of FIG. 4B(1), the number of layers is three, and there are an A layer, a B layer, and a C layer.
  • the A layer consists of one segment (segment 0), the B layer consists of seven segments (segments 1 to 7), and the C layer consists of five segments (segments 8 to 12).
  • In another example, the number of layers is also three, with an A layer, a B layer, and a C layer.
  • the A layer consists of one segment (segment 0), the B layer consists of five segments (segments 1 to 5), and the C layer consists of seven segments (segments 6 to 12).
  • In a further example, the number of layers is two, with an A layer and a B layer.
  • the A layer consists of one segment (segment 0), and the B layer consists of 12 segments (segments 1 to 12).
  • the number of OFDM segments, transmission path coding parameters, etc. of each layer are determined according to configuration information, and are transmitted by a TMCC signal, which is control information for assisting the operation of the receiver.
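The per-layer parameters conveyed by the TMCC signal can be sketched as a small table. The concrete modulation, coding-rate, and interleave values below are assumptions chosen for illustration; only the segment counts follow the FIG. 4B(1) example (A=1, B=7, C=5):

```python
# Illustrative sketch of per-layer transmission parameters of the kind the
# TMCC signal conveys to the receiver. Field values are assumed examples.

from dataclasses import dataclass

@dataclass
class LayerParams:
    segments: int     # number of OFDM segments in the layer
    modulation: str   # carrier modulation method
    code_rate: str    # inner code coding rate
    interleave: int   # time interleave length

layers = {
    "A": LayerParams(segments=1, modulation="QPSK",   code_rate="2/3", interleave=4),
    "B": LayerParams(segments=7, modulation="64QAM",  code_rate="3/4", interleave=2),
    "C": LayerParams(segments=5, modulation="256QAM", code_rate="5/6", interleave=1),
}

# A consistency check a receiver could apply: layers must cover all 13 segments.
assert sum(p.segments for p in layers.values()) == 13
```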
  • the layer assignment in FIG. 4B(1) can be used in the dual-polarization terrestrial digital broadcasting according to this embodiment, and the same segment layer assignment may be used for both horizontal and vertical polarization. Specifically, it is sufficient to transmit the current mobile reception service of digital terrestrial broadcasting in the above-mentioned one segment of horizontally polarized waves as layer A. (In addition, the same service may be transmitted using the above-mentioned one segment of vertically polarized waves; in this case, this is also treated as layer A.)
  • As layer B, it is sufficient to transmit, using the above-mentioned seven segments of horizontally polarized waves, a terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, which is the current digital terrestrial broadcasting.
  • the terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels may transmit the same service using the above-mentioned seven segments of vertically polarized waves. In this case, this is also treated as the B layer.
  • As layer C, it may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels, using the above-mentioned five segments of both horizontally polarized waves and vertically polarized waves, a total of 10 segments. Details of the transmission will be described later.
  • a transmission wave with this segment layer assignment can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100.
  • the hierarchy assignment in FIG. 4B(1) can be used in the single-polarized digital terrestrial broadcasting according to this embodiment.
  • the current mobile reception service of digital terrestrial broadcasting may be transmitted in the above-mentioned one segment as layer A.
  • as layer B, the above seven segments may be used to transmit a terrestrial digital broadcasting service that transmits video having a maximum resolution of 1920 pixels horizontally by 1080 pixels vertically, which is the current digital terrestrial broadcasting.
  • the C layer may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels in the five segments described above.
  • the C layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting. Details of the transmission will be described later.
  • the transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100.
  • one segment of the A layer transmits the current mobile reception service of the terrestrial digital broadcasting
  • the eight segments of the B layer may transmit the current terrestrial digital broadcasting service, which transmits broadcast video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically, and the four segments of the C layer may be configured to transmit an advanced digital terrestrial broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 pixels horizontally x 1080 pixels vertically.
  • the C layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting. Details of the transmission will be described later.
  • the transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100.
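The layer allocations described above can be checked with a short Python sketch; the layer names and segment counts follow the text, and everything else (dictionary keys, function name) is illustrative:

```python
# Layer-to-segment allocations from the text; each physical channel of
# terrestrial digital broadcasting carries 13 OFDM segments in total.
ALLOCATIONS = {
    "A1-B7-C5": {"A": 1, "B": 7, "C": 5},   # FIG. 4B(1)-style split
    "A1-B8-C4": {"A": 1, "B": 8, "C": 4},   # alternative split above
}

def total_segments(allocation):
    # Sum the segments assigned to each layer.
    return sum(allocation.values())
```

Either allocation must account for all 13 segments of the channel.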
  • the hierarchy allocation shown in FIG. 4B(2) can be used in the dual-polarization terrestrial digital broadcasting according to this embodiment as an example different from that shown in FIG. 4B(1). Specifically, as layer A, it is sufficient to transmit the current mobile reception service of digital terrestrial broadcasting in the above-mentioned one segment of horizontally polarized waves. (The same service may also be transmitted using the above-mentioned one segment of vertically polarized waves; in this case, that segment is likewise treated as layer A.) Furthermore, as layer B, it may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels in the above-mentioned five segments of both horizontally and vertically polarized waves, a total of ten segments.
  • as the C layer, it is sufficient to transmit a terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally by 1080 pixels vertically, which is the current digital terrestrial broadcasting, using the above-mentioned seven segments of horizontally polarized waves.
  • the same terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically may also be transmitted using the above-mentioned seven segments of vertically polarized waves; in this case, those segments are likewise treated as the C layer.
  • the details of this transmission will be described later.
  • the transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100 of this embodiment.
  • the hierarchy assignment in FIG. 4B(2) can be used as an example different from FIG. 4B(1) in the single-polarized digital terrestrial broadcasting according to this embodiment.
  • the current mobile reception service of digital terrestrial broadcasting may be transmitted in the above-mentioned one segment as layer A.
  • the B layer may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels in the five segments.
  • the B layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting.
  • the transmitted wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100 of this embodiment.
  • the layer allocation shown in FIG. 4B (3) can be used in the layer division multiplexing terrestrial digital broadcasting according to this embodiment and the current terrestrial digital broadcasting.
  • as layer B, it may be configured to transmit, in the twelve segments in the figure, both an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 pixels horizontally x 1080 pixels vertically and the current digital terrestrial broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically.
  • the transmitted wave assigned to the segment hierarchy can be received by, for example, the third tuner/demodulator 130L of the broadcast receiving apparatus 100 of this embodiment.
  • when used in current terrestrial digital broadcasting, it is sufficient to transmit the current mobile reception service of terrestrial digital broadcasting in the one segment in the figure as the A layer, and to transmit, in the twelve segments in the figure as the B layer, a terrestrial digital broadcasting service that transmits video having a maximum resolution of 1920 pixels horizontally by 1080 pixels vertically.
  • the transmitted wave assigned to the segment hierarchy can be received by, for example, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 of this embodiment.
  • FIG. 4C shows a system on the broadcasting station side that realizes generation processing of OFDM transmission waves, which are digital broadcast waves for dual-polarized terrestrial digital broadcasting, single-polarized terrestrial digital broadcasting, and hierarchical division multiplexed terrestrial digital broadcasting according to this embodiment.
  • the information source encoding unit 411 encodes video/audio/various data, etc., respectively.
  • the multiplexing unit/conditional reception processing unit 415 multiplexes the video/audio/various data etc. encoded by the information source encoding unit 411, performs appropriate processing corresponding to conditional access, and outputs the result as a packet stream.
  • a plurality of information source encoding units 411 and multiplexing units/conditional access processing units 415 can exist in parallel, and generate a plurality of packet streams.
  • the transmission path encoding unit 416 re-multiplexes the plurality of packet streams into one packet stream, performs transmission path encoding processing, and outputs it as an OFDM transmission wave.
  • the configuration shown in FIG. 4C is the same as the ISDB-T system as a configuration for realizing OFDM transmission wave generation processing, although the details of the information source encoding and transmission path encoding methods are different.
  • some of the plurality of information source encoding units 411 and multiplexing units/conditional access processing units 415 are configured for ISDB-T digital terrestrial broadcasting services, and some are configured for advanced digital terrestrial broadcasting services.
  • the transmission path encoding unit 416 may multiplex packet streams of a plurality of different digital terrestrial broadcasting services.
  • when the multiplexing unit/conditional access processing unit 415 is configured for the current digital terrestrial broadcasting service, it is sufficient to generate an MPEG-2 TS, which is a TSP (Transport Stream Packet) stream defined by MPEG-2 Systems.
  • when the multiplexing unit/conditional access processing unit 415 is configured for advanced terrestrial digital broadcasting services, it is sufficient to generate an MMT packet stream, a TLV stream containing MMT packets, or a TSP stream specified by another system.
  • all of the multiple information source encoding units 411 and multiplexing units/conditional access processing units 415 may be configured for advanced terrestrial digital broadcasting services, so that all packet streams multiplexed by the transmission path encoding unit 416 are packet streams for advanced digital terrestrial broadcasting services.
  • FIG. 4D shows an example of the configuration of the transmission path encoding section 416.
  • FIG. 4D (1) shows the configuration of the transmission path encoding unit 416 when generating only OFDM transmission waves for digital broadcasting of the current digital terrestrial broadcasting service.
  • the OFDM transmission wave transmitted with this configuration has, for example, the segment configuration shown in FIG. 4B (3).
  • the packet stream input from the multiplexing unit/conditional access processing unit 415 and subjected to re-multiplexing processing has redundancy for error correction added, and then undergoes various interleaving processes such as byte interleaving, bit interleaving, time interleaving, and frequency interleaving.
  • the signal is processed by IFFT (Inverse Fast Fourier Transform) together with the pilot signal, TMCC signal, and AC signal, and after a guard interval is added, it becomes an OFDM transmission wave through orthogonal modulation.
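The guard interval addition mentioned above prepends a cyclic copy of the tail of the effective symbol. A minimal Python sketch, using toy data rather than an actual IFFT output:

```python
def add_guard_interval(ofdm_symbol, guard_ratio):
    """Prepend a cyclic prefix: a copy of the tail of the effective
    symbol, with length guard_ratio * (effective symbol length)."""
    g = int(len(ofdm_symbol) * guard_ratio)
    return ofdm_symbol[-g:] + ofdm_symbol

sym = list(range(8))              # toy effective symbol (IFFT output)
tx = add_guard_interval(sym, 1/4)  # guard interval ratio of 1/4
```

Because the prefix is a cyclic copy, multipath echoes shorter than the guard interval do not cause inter-symbol interference.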
  • outer code processing, power spreading processing, byte interleaving, inner code processing, bit interleaving processing, and mapping processing are configured so that they can be processed separately for each layer such as the A layer and the B layer.
  • Figure 4D (1) shows an example of three layers.
  • the mapping process is the carrier modulation process performed on each carrier.
  • the packet stream inputted from the multiplexing unit/conditional access processing unit 415 may be multiplexed with information such as TMCC information, mode, guard interval ratio, and the like.
  • the packet stream input to the transmission path encoding unit 416 may be a TSP stream defined by MPEG-2 Systems, as described above.
  • the OFDM transmission wave generated with the configuration of FIG. 4D(1) can be received by, for example, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 of this embodiment.
  • FIG. 4D (2) shows the configuration of the transmission path encoding unit 416 when generating OFDM transmission waves for dual-polarization terrestrial digital broadcasting according to this embodiment.
  • the OFDM transmission wave transmitted with this configuration has, for example, the segment configuration shown in FIG. 4B (1) or (2).
  • the packet stream input from the multiplexing unit/conditional access processing unit 415 and subjected to re-multiplexing processing has redundancy for error correction added, and then undergoes various interleaving processes such as byte interleaving, bit interleaving, time interleaving, and frequency interleaving. Thereafter, it is processed by IFFT together with the pilot signal, TMCC signal, and AC signal, subjected to guard interval addition processing, and then subjected to orthogonal modulation to become an OFDM transmission wave.
  • outer code processing, power spreading processing, byte interleaving, inner code processing, bit interleaving processing, mapping processing, and time interleaving are configured so that they can be processed separately for each layer, such as layer A, layer B, and layer C.
  • in FIG. 4D(2), not only a horizontally polarized (H) OFDM transmission wave but also a vertically polarized (V) OFDM transmission wave is generated, so the processing flow branches into two systems.
  • for the outer code, inner code, mapping, and similar processing shown in the configuration of FIG. 4D(2), more advanced processing not adopted in the configuration of FIG. 4D(1) can be used in addition to processing compatible with that configuration.
  • in the part where processing is performed for each layer, the layers that transmit the current terrestrial digital broadcasting mobile reception service and video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically perform processing compatible with the configuration shown in FIG. 4D(1) for the outer code, inner code, mapping, and so on.
  • the layer that transmits an advanced digital terrestrial broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels may be configured to use more advanced processing for the outer code, inner code, mapping, and so on, not adopted in the corresponding processes of the configuration of FIG. 4D(1).
  • since the allocation of layers to the transmitted terrestrial digital broadcasting services can be switched using the TMCC information described later, it is desirable that processing such as coding and mapping also be configured to be switchable using TMCC information.
  • for byte interleaving, bit interleaving, and time interleaving, processing compatible with the current terrestrial digital broadcasting service may be performed, or different, more advanced processing may be performed. Alternatively, for layers that transmit advanced digital terrestrial broadcasting services, some interleaving may be omitted.
  • among the packet streams input to the transmission path encoding unit 416, the source input stream may be a TSP stream defined by MPEG-2 Systems, which is currently used in digital terrestrial broadcasting.
  • the input stream that is the source of the layer transmitting the advanced digital terrestrial broadcasting service in this configuration may be a stream defined by a system other than the TSP stream defined by MPEG-2 Systems, such as a TLV stream containing MMT packets, or a TSP stream defined by MPEG-2 Systems may be adopted.
  • in the layers that transmit the current mobile reception service of digital terrestrial broadcasting and video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically, stream formats and processing compatible with current digital terrestrial broadcasting are maintained.
  • therefore, by receiving either the horizontally polarized or the vertically polarized OFDM transmission wave generated in the configuration shown in FIG. 4D(2), an existing receiving device of the current digital terrestrial broadcasting service can correctly receive and demodulate the broadcast signals of that service.
  • at the same time, an advanced digital terrestrial broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 pixels horizontally x 1080 pixels vertically can be transmitted, and the broadcast signal of that advanced service can be received and demodulated by the broadcast receiving apparatus 100 according to the embodiment of the present invention.
  • that is, digital broadcast waves can be generated that both a broadcast receiving device compatible with advanced terrestrial digital broadcasting services and an existing receiving device for current terrestrial digital broadcasting services can favorably receive and demodulate.
  • when generating OFDM transmission waves for single-polarized terrestrial digital broadcasting according to this embodiment, the transmission path encoding unit 416 shown in FIG. 4D(2) need only consist of one of the two systems: the system that generates the horizontally polarized (H) OFDM transmission wave or the system that generates the vertically polarized (V) OFDM transmission wave.
  • the OFDM transmission wave transmitted with this configuration has, for example, the segment configuration shown in FIG. 4B(1) or (2); unlike the dual-polarization case, only one of the horizontally polarized OFDM transmission wave and the vertically polarized OFDM transmission wave is transmitted.
  • Other configurations, operations, etc. are the same as in the case of generating OFDM transmission waves for dual-polarization terrestrial digital broadcasting described above.
  • FIG. 4D (3) shows the configuration of the transmission path encoding unit 416 when generating OFDM transmission waves for hierarchical division multiplexing digital terrestrial broadcasting according to this embodiment.
  • the packet stream input from the multiplexing unit/conditional access processing unit 415 and subjected to re-multiplexing processing has redundancy for error correction added, and then undergoes various interleaving processes such as byte interleaving, bit interleaving, time interleaving, and frequency interleaving.
  • thereafter, the signal is processed by IFFT together with the pilot signal, TMCC signal, and AC signal, and after a guard interval is added, it undergoes orthogonal modulation to become an OFDM transmission wave.
  • a modulated wave transmitted in the upper layer and a modulated wave transmitted in the lower layer are respectively generated, and after multiplexing, an OFDM transmission wave that is a digital broadcast wave is generated.
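The multiplexing of the upper-layer and lower-layer modulated waves can be sketched as a weighted superposition, as commonly done in layer division multiplexing; the injection level value below is a hypothetical illustration, not a value taken from this embodiment:

```python
def layer_division_multiplex(upper, lower, injection_level_db=-19.0):
    """Superpose the lower-layer modulated wave onto the upper-layer
    modulated wave at a (hypothetical) injection level in dB."""
    g = 10 ** (injection_level_db / 20)  # amplitude scaling of lower layer
    return [u + g * l for u, l in zip(upper, lower)]
```

Because the lower layer is injected at reduced power, a receiver for the upper layer can treat it as low-level noise, while an advanced receiver cancels the demodulated upper layer to recover the lower layer.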
  • the processing system shown in the upper part of the configuration of FIG. 4D(3) generates the modulated wave transmitted in the upper layer, and the processing system shown in the lower part generates the modulated wave transmitted in the lower layer. The data transmitted by the upper-layer processing system is for the current terrestrial digital broadcasting mobile reception service and for the current service that transmits video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically.
  • the modulated wave transmitted in the upper layer of FIG. 4D(3) has, for example, the segment configuration of FIG. 4B(3) similarly to the transmitted wave of FIG. 4D(1). Therefore, the modulated waves transmitted in the upper layer of FIG. 4D (3) are used for the current mobile reception service of digital terrestrial broadcasting and for the current digital terrestrial broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically.
  • the modulated wave transmitted in the lower layer of FIG. 4D(3) may, for example, be allocated entirely, with all 13 segments as the A layer, to an advanced terrestrial digital broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 horizontal pixels x 1080 vertical pixels. Alternatively, with the segment configuration shown in FIG. 4B(3), the current mobile reception service of digital terrestrial broadcasting may be transmitted in the one-segment A layer, and an advanced digital terrestrial broadcasting service capable of transmitting video with a maximum resolution exceeding 1920 pixels horizontally x 1080 pixels vertically may be transmitted in the twelve-segment B layer. In the latter case, as in FIG. 4D(2), the configuration may be such that the processing from outer code processing to time interleaving processing can be switched for each layer, such as the A layer and the B layer. In the layer that transmits the mobile reception service of current digital terrestrial broadcasting, processing compatible with current digital terrestrial broadcasting must be maintained, as explained for FIG. 4D(2).
  • since the technology for separating the modulated wave transmitted in the upper layer from the OFDM transmission wave, which is a terrestrial digital broadcast wave, is also installed in existing receiving devices of the current digital terrestrial broadcasting service, the broadcast signals of the current terrestrial digital broadcasting mobile reception service and of the current service that transmits images with a maximum resolution of 1920 pixels horizontally x 1080 pixels vertically can be correctly received and demodulated by those existing receiving devices.
  • the broadcast signals of advanced digital terrestrial broadcasting services that can transmit images with a maximum resolution of more than 1920 horizontal pixels x 1080 vertical pixels, which are included in the modulated waves transmitted in the lower layer, are It becomes possible to receive and demodulate the broadcast receiving apparatus 100 according to the embodiment of the invention.
  • that is, digital broadcast waves can be generated that both a broadcast receiving device compatible with advanced terrestrial digital broadcasting services and an existing receiving device for current terrestrial digital broadcasting services can suitably receive and demodulate.
  • in the configuration of FIG. 4D(3), unlike the configuration of FIG. 4D(2), there is no need to use a plurality of polarized waves, and it is possible to generate an OFDM transmission wave that can be received more easily.
  • in the OFDM transmission wave generation processing according to FIGS. 4D(1), 4D(2), and 4D(3) of this embodiment, three types of modes with different numbers of carriers are prepared in consideration of the distance between SFN stations and resistance to Doppler shift in mobile reception. Note that further modes with different numbers of carriers may also be prepared. In a mode with a large number of carriers, the effective symbol length becomes longer, and with the same guard interval ratio (guard interval length / effective symbol length), the guard interval length becomes longer, making it possible to provide resistance to multipath with long delay time differences.
  • conversely, in a mode with a small number of carriers, the carrier spacing becomes wide, making the signal less susceptible to inter-carrier interference due to Doppler shift, which occurs in mobile reception and the like.
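The relationship described above between carrier count, carrier spacing, effective symbol length, and guard interval length can be sketched as follows; the carrier spacing values are hypothetical, chosen only to show the scaling:

```python
def guard_interval_length(carrier_spacing_hz, guard_ratio):
    # In OFDM the effective symbol length is the reciprocal of the
    # carrier spacing, so narrower spacing means a longer symbol.
    effective_symbol_s = 1.0 / carrier_spacing_hz
    return effective_symbol_s * guard_ratio

# More carriers in the same bandwidth -> narrower spacing -> longer
# effective symbol -> longer guard interval at a fixed guard ratio.
g_few  = guard_interval_length(4000.0, 1/8)  # fewer carriers, wide spacing
g_many = guard_interval_length(1000.0, 1/8)  # 4x carriers, narrow spacing
```

With four times the carriers the guard interval is four times longer, giving tolerance to multipath with longer delay differences, at the cost of narrower spacing and hence more sensitivity to Doppler-induced inter-carrier interference.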
  • FIG. 4E shows an example of transmission parameters for each segment of OFDM segments identified by the mode of the system according to the present embodiment.
  • the carrier modulation method in the figure refers to the modulation method of the "data" carrier.
  • the SP signal, CP signal, TMCC signal, and AC signal employ a different modulation method than that of the "data" carrier.
  • these signals are signals in which noise immunity is more important than the amount of information, so a modulation method that maps to a constellation with fewer states (BPSK or DBPSK, i.e., two states) is adopted to improve resistance to noise.
  • in the figure, the value on the left side of the diagonal line applies when QPSK, 16QAM, 64QAM, etc. is set as the carrier modulation method, and the value on the right side of the diagonal line applies when DQPSK is set as the carrier modulation method.
  • the underlined parameters are incompatible with the current mobile reception service of digital terrestrial broadcasting.
  • the "data" carrier modulation methods 256QAM, 1024QAM, and 4096QAM are not adopted in current digital terrestrial broadcasting services. Therefore, in the processing of the layers that require compatibility with the current digital terrestrial broadcasting service in the OFDM broadcast wave generation processing according to FIGS. 4D(1), 4D(2), and 4D(3), 256QAM, 1024QAM, and 4096QAM are not used as "data" carrier modulation methods.
  • QPSK (number of states: 4) and 16QAM (number of states: 16) are compatible with current digital terrestrial broadcasting services.
  • further multi-level modulation methods such as 256QAM (number of states: 256), 1024QAM (number of states: 1024), and 4096QAM (number of states: 4096) may be applied.
  • a modulation method different from these modulation methods may be employed.
  • for pilot symbols such as SP and CP, BPSK (number of states: 2) or DBPSK (number of states: 2) is used.
  • the LDPC code is not adopted in current digital terrestrial broadcasting services. Therefore, in the processing in the layer that requires compatibility with the current digital terrestrial broadcasting service in the OFDM broadcast wave generation processing according to FIG. 4D(1), FIG. 4D(2), and FIG. 4D(3) of this embodiment, LDPC code is not used.
  • An LDPC code may be applied as an inner code to data transmitted in a layer corresponding to advanced digital terrestrial broadcasting services.
  • the BCH code is not adopted in current digital terrestrial broadcasting services. Therefore, in the processing in the layer that requires compatibility with the current digital terrestrial broadcasting service in the OFDM broadcast wave generation processing according to FIG. 4D(1), FIG. 4D(2), and FIG. 4D(3) of this embodiment, BCH code is not used.
  • the BCH code may be applied as an outer code to data transmitted in a layer corresponding to advanced terrestrial digital broadcasting services.
  • FIG. 4F shows an example of the transmission signal parameters for each physical channel (6 MHz bandwidth) of the OFDM broadcast wave generation processing according to FIGS. 4D(1), 4D(2), and 4D(3) of this embodiment.
  • the parameters shown in FIG. 4F are compatible with current digital terrestrial broadcasting services.
  • when all segments of the modulated wave transmitted in the lower layer of FIG. 4D(3) are assigned to advanced digital terrestrial broadcasting services, it is not necessary to maintain compatibility with the current digital terrestrial broadcasting services in that modulated wave. Therefore, in this case, parameters other than those shown in FIG. 4F may be used for the modulated wave transmitted in the lower layer of FIG. 4D(3).
  • the carriers of the OFDM transmission wave according to this embodiment include carriers for transmitting data such as video and audio, carriers for transmitting pilot signals (SP, CP, AC1, AC2) that serve as demodulation references, and carriers for transmitting the TMCC signal, which carries information such as the carrier modulation format and convolutional coding rate. For these latter transmissions, a number of carriers corresponding to 1/9 of the number of carriers of each segment is used.
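Using the per-segment carrier counts given elsewhere in the text for modes 1 to 3 (carrier numbers 0-107, 0-215, and 0-431, i.e. 108, 216, and 432 carriers per segment), the 1/9 overhead share can be computed as a sketch:

```python
# Per-segment carrier counts for modes 1-3, taken from the text.
CARRIERS_PER_SEGMENT = {1: 108, 2: 216, 3: 432}

def overhead_carriers(mode):
    """Carriers per segment used for pilot, TMCC, and AC transmission,
    stated in the text to amount to 1/9 of the segment's carriers."""
    return CARRIERS_PER_SEGMENT[mode] // 9
```

The remaining 8/9 of the carriers in each segment are available for the "data" payload.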
  • for error correction, a concatenated code is used, with a shortened Reed-Solomon (204,188) code as the outer code and a punctured convolutional code with a constraint length of 7 and a mother-code coding rate of 1/2 as the inner code. Encoding different from the above may be used for both the outer code and the inner code.
  • the information rate varies depending on parameters such as carrier modulation format, convolutional coding rate, and guard interval ratio.
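As a rough sketch of how the information rate depends on these parameters, the following computes an approximate payload rate for one layer; all input values in the example call are hypothetical, not taken from the figures:

```python
def information_rate_bps(data_carriers, bits_per_carrier, conv_rate,
                         guard_ratio, effective_symbol_s,
                         rs_rate=188.0 / 204.0):
    """Approximate payload bit rate of one layer.

    The guard interval stretches each symbol by (1 + guard_ratio); the
    convolutional (inner) and Reed-Solomon (204,188) (outer) code rates
    then scale the raw carrier throughput down to payload bits.
    """
    symbol_s = effective_symbol_s * (1.0 + guard_ratio)
    raw_bps = data_carriers * bits_per_carrier / symbol_s
    return raw_bps * conv_rate * rs_rate
```

A higher-order carrier modulation or a higher convolutional rate raises the rate, while a larger guard interval ratio lowers it.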
  • 204 symbols constitute one frame, and one frame includes an integer number of TSPs. Transmission parameter switching is performed at this frame boundary.
  • Pilot signals that serve as demodulation standards include SP (Scattered Pilot), CP (Continual Pilot), AC (Auxiliary Channel) 1, and AC2.
  • FIG. 4G shows an example of how pilot signals and the like are arranged within a segment in the case of synchronous modulation (QPSK, 16QAM, 64QAM, 256QAM, 1024QAM, 4096QAM, etc.).
  • the SP is inserted into a synchronous modulation segment and is transmitted once every 12 carriers in the carrier number (frequency axis) direction and once every 4 symbols in the OFDM symbol number (time axis) direction. Since the amplitude and phase of SP are known, they can be used as a reference for synchronous demodulation.
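The SP placement just described (one SP every 12 carriers in the frequency direction and every 4 symbols in the time direction) can be sketched as follows; the per-symbol frequency offset of 3 x (symbol mod 4) is an assumed staggering pattern, since the text states only the two periodicities:

```python
def is_sp_position(carrier, symbol):
    """True if (carrier, symbol) carries a Scattered Pilot, assuming a
    stagger of 3 carriers per OFDM symbol (assumption, see lead-in)."""
    return carrier % 12 == 3 * (symbol % 4)

# Within one mode-3 segment (432 carriers), one SP every 12 carriers:
row = [k for k in range(432) if is_sp_position(k, 0)]
```

Because the pattern shifts each symbol and repeats every 4 symbols, every 3rd carrier eventually carries an SP, letting the receiver interpolate the channel response in both time and frequency.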
  • FIG. 4H shows an example of how pilot signals and the like are arranged within a segment in the case of differential modulation (DQPSK, etc.).
  • CP is a continuous signal inserted at the left end of a differential modulation segment and is used for demodulation.
  • AC1 and AC2 carry information on the CP, and in addition to playing the role of pilot signals, they are also used to transmit information for broadcasters. AC1 and AC2 may also be used to transmit other information.
  • FIGS. 4G and 4H are examples for mode 3, in which carrier numbers range from 0 to 431; in mode 1 and mode 2, carrier numbers range from 0 to 107 and 0 to 215, respectively. Furthermore, the carriers for transmitting AC1, AC2, and TMCC may be determined in advance for each segment. Note that the carriers for transmitting AC1, AC2, and TMCC are randomly arranged in the frequency direction in order to reduce the influence of periodic dips in transmission path characteristics due to multipath.
  • the TMCC signal transmits information (TMCC information) related to the demodulation operation of the receiver, such as the layer configuration and the transmission parameters of the OFDM segment.
  • the TMCC signal is transmitted on a carrier for TMCC transmission defined within each segment.
  • FIG. 5A shows an example of bit allocation for TMCC carriers.
  • the TMCC carrier consists of 204 bits (B0 to B203).
  • B0 is the demodulation reference signal for the TMCC symbol and has predetermined amplitude and phase references.
  • B1 to B16 are synchronization signals, each consisting of a 16-bit word. Two types of synchronization signals, w0 and w1, are defined, and w0 and w1 are sent out alternately for each frame.
  • B17 to B19 are used to identify the segment type, and identify whether each segment is a differential modulation section or a synchronous modulation section.
  • TMCC information is written in B20 to B121.
  • B122 to B203 are parity bits.
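The 204-bit TMCC carrier layout described above (B0 demodulation reference, B1-B16 synchronization word, B17-B19 segment type, B20-B121 TMCC information, B122-B203 parity) can be sketched as a simple field slicer:

```python
def parse_tmcc_carrier(bits):
    """Slice a 204-bit TMCC carrier into the fields described above.
    `bits` is any 204-element sequence of 0/1 values."""
    assert len(bits) == 204
    return {
        "demod_reference": bits[0:1],      # B0
        "sync_word":       bits[1:17],     # B1-B16 (w0/w1, alternating per frame)
        "segment_type":    bits[17:20],    # B17-B19 (differential / synchronous)
        "tmcc_info":       bits[20:122],   # B20-B121
        "parity":          bits[122:204],  # B122-B203
    }
```

The field boundaries sum to exactly 204 bits, matching the stated layout.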
  • the TMCC information of the OFDM transmission wave includes, for example, system identification, transmission parameter switching index, activation control signal (startup flag for emergency warning broadcasting), current information, next information, frequency conversion process identification, It may be configured to include information to assist demodulation and decoding operations of the receiver, such as physical channel number identification, main signal identification, 4K signal transmission layer identification, and additional layer transmission identification.
  • current information indicates the current hierarchical configuration and transmission parameters, while next information indicates the hierarchical configuration and transmission parameters after switching. Transmission parameter switching is performed on a frame-by-frame basis.
  • FIG. 5B shows an example of bit allocation of TMCC information.
  • FIG. 5C shows an example of the configuration of transmission parameter information included in the current information/next information.
  • the connected transmission phase correction amount is control information used in the case of terrestrial digital audio broadcasting ISDB-TSB (ISDB for Terrestrial Sound Broadcasting), etc., which uses a common transmission method.
  • a detailed explanation of the connected transmission phase correction amount is therefore omitted.
  • FIG. 5D shows an example of bit allocation for system identification. Two bits are allocated to the system identification signal.
  • depending on the broadcasting system, "00", "01", or "10" is set.
  • the advanced digital terrestrial television broadcasting system can simultaneously transmit, within the same service, 2K broadcast programs (video of 1920 horizontal pixels x 1080 vertical pixels) and 4K broadcast programs (broadcast programs with video exceeding 1920 pixels horizontally x 1080 pixels vertically, not limited to those with video exceeding 3840 pixels horizontally x 2160 pixels vertically).
  • The transmission parameter switching index notifies the receiver of switching timing by counting down when transmission parameters are switched. The index normally has the value "1111"; when parameters are to be switched, it is decremented by 1 per frame starting 15 frames before the switch. The switching timing is the frame synchronization following the frame in which "0000" is sent, after which the index returns to "1111". The countdown is performed when any of the parameters shown in FIG. 5B is switched, such as the system identification, the transmission parameter information included in the current information/next information, the frequency conversion processing identification, the main signal identification, the 4K signal transmission layer identification, or the additional layer transmission identification. When only the activation control signal of the TMCC information is switched, no countdown is performed.
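The countdown behaviour described above can be sketched as follows. This is an illustrative model only, not an implementation of any broadcast standard; the helper name and list representation are assumptions.

```python
# Idle value of the transmission parameter switching index ("1111").
IDLE = 0b1111

def switching_index_sequence(frames_before_switch: int = 15) -> list[int]:
    """Index values sent in the frames leading up to a parameter switch.

    The index is decremented by 1 per frame, starting 15 frames before the
    switch, and reaches 0b0000 in the last frame before the new parameters
    take effect at the next frame synchronization. Afterwards it returns to
    the idle value IDLE (0b1111).
    """
    return [frames_before_switch - i for i in range(frames_before_switch + 1)]

# The default countdown runs 15, 14, ..., 1, 0 over 16 frames.
assert switching_index_sequence() == list(range(15, -1, -1))
```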
  • The activation control signal (activation flag for emergency warning broadcasting) is set to "1" while activation control of receivers is being performed for emergency warning broadcasting, and to "0" when activation control is not being performed.
  • The partial reception flag in each current information/next information is set to "1" when the segment at the center of the transmission band is configured for partial reception, and to "0" otherwise. When segment 0 is configured for partial reception, its layer is defined as layer A. If next information does not exist, the partial reception flag is set to "1".
  • FIG. 5E shows an example of bit allocation for the carrier modulation mapping method (data carrier modulation method) in each layer transmission parameter for each current information/next information.
  • When this parameter is "000", the modulation method is DQPSK; "001" indicates QPSK; "010" indicates 16QAM; "011" indicates 64QAM; "100" indicates 256QAM; "101" indicates 1024QAM; and "110" indicates 4096QAM. For an unused layer, or when next information does not exist, "111" is set in this parameter.
  • Each of these parameters may be set according to the organization information of each layer in each current information/next information. The number of segments indicates, as a 4-bit value, the number of segments in each layer; for an unused layer, or when next information does not exist, "1111" is set. Note that settings such as the mode and guard interval ratio are detected independently on the receiver side and therefore need not be transmitted in the TMCC information.
  • FIG. 5F shows an example of bit allocation for frequency conversion processing identification.
  • The frequency conversion processing identification indicates whether the frequency conversion processing (in the case of the dual-polarization transmission method) or the frequency conversion amplification processing (in the case of the layer division multiplexing transmission method), both described later, has been performed by the conversion unit 201T or the conversion unit 201L in FIG. 2A. "0" is set if such processing has been performed, and "1" if it has not. This parameter is set to "1" when the wave is transmitted from the broadcasting station, and the conversion unit 201T or conversion unit 201L may be configured to rewrite it to "0" when executing the frequency conversion processing or frequency conversion amplification processing. In this way, when the second tuner/demodulator 130T or the third tuner/demodulator 130L of the broadcast receiving device 100 receives a frequency conversion processing identification bit of "0", it can identify that frequency conversion processing or the like was performed after the OFDM transmission wave was sent out from the broadcasting station.
  • The frequency conversion processing identification bit may be set or rewritten for each of a plurality of polarized waves. For example, if neither of the polarized waves is frequency-converted by the conversion unit 201T in FIG. 2A, the frequency conversion processing identification bits in both OFDM transmission waves may be left at "1". If only one of the polarized waves is frequency-converted by the conversion unit 201T, the conversion unit 201T rewrites to "0" only the identification bit in the OFDM transmission wave of the frequency-converted polarized wave. If both polarized waves are frequency-converted, the conversion unit 201T rewrites the identification bits in the OFDM transmission waves of both polarized waves to "0". In this way, the broadcast receiving apparatus 100 can identify, for each of the plurality of polarized waves, whether frequency conversion has been performed.
  • The frequency conversion processing identification bit is not defined in the current digital terrestrial broadcasting, so it is ignored by the digital terrestrial broadcasting receiving devices already in use. The bit may therefore also be introduced into a new terrestrial digital broadcasting service, improved from the current one, that transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels. In that case, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 may be configured as a first tuner/demodulator compatible with the new terrestrial digital broadcasting service, and the bit may be set to "0" when the conversion unit 201T or conversion unit 201L in FIG. 2A performs frequency conversion processing or frequency conversion amplification processing on the OFDM transmission wave. Note that if the received broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter may be configured to be set to "1".
  • FIG. 5G shows an example of bit allocation for physical channel number identification.
  • the physical channel number identification consists of a 6-bit code, and identifies the physical channel number (13 to 52 ch) of the received broadcast wave. If the received broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter is set to "111111".
  • The physical channel number identification bit is not defined in the current digital terrestrial broadcasting, and current digital terrestrial broadcasting receiving devices therefore could not acquire, from the TMCC signal, AC signal, or the like, the physical channel number that the broadcasting station specified for a broadcast wave. In contrast, by using the physical channel number identification bit of a received OFDM transmission wave, the broadcast receiving apparatus 100 can grasp the physical channel number that the broadcasting station set for that OFDM transmission wave. The physical channels 13ch to 52ch each have a bandwidth of 6 MHz and are assigned in advance to the 470 to 710 MHz frequency band. Therefore, being able to grasp the physical channel number of an OFDM transmission wave from the physical channel number identification bit means that the broadcast receiving device 100 can also grasp the frequency band in which that OFDM transmission wave was transmitted over the air as a terrestrial digital broadcast wave.
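As a rough illustration of the relation stated above between the physical channel number and the frequency band (6 MHz per channel, channels 13 to 52 assigned to 470 to 710 MHz), the band edges could be computed as follows. The helper name is a hypothetical choice for this sketch.

```python
def channel_band_mhz(ch: int) -> tuple[float, float]:
    """Return the (lower, upper) band edges in MHz for physical channel ch.

    Channels 13-52 each occupy 6 MHz, starting at 470 MHz, per the text.
    """
    if not 13 <= ch <= 52:
        raise ValueError("physical channel number must be 13-52")
    lower = 470.0 + (ch - 13) * 6.0  # 6 MHz per channel from 470 MHz
    return lower, lower + 6.0

assert channel_band_mhz(13) == (470.0, 476.0)
assert channel_band_mhz(52) == (704.0, 710.0)  # top channel ends at 710 MHz
```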
  • It is sufficient to arrange the physical channel number identification bits so that both of the pair of polarized waves that originally constitute one physical channel band are given the same physical channel number. As described above, the converter 201T in FIG. 2A may convert the frequency of only one of the plurality of polarized waves; even if the frequencies of the polarized waves received by the broadcast receiving apparatus 100 consequently differ from each other, the polarized waves with different frequencies can still be recognized as an original pair. Without such recognition, the broadcast receiving device would not be able to demodulate advanced digital terrestrial broadcasting using both polarizations of dual-polarization digital terrestrial broadcasting. When a plurality of received transmission waves exhibit the same physical channel number identification value, they can be identified as a polarized-wave pair that constituted one physical channel at the broadcasting station, which makes it possible to demodulate advanced dual-polarization terrestrial digital broadcasting using those transmission waves.
  • FIG. 5H shows an example of bit allocation for main signal identification.
  • the main signal identification bit is placed in bit B117.
  • When the OFDM transmission wave to be transmitted is a transmission wave of dual-polarization terrestrial digital broadcasting, this parameter is set to "1" in the TMCC information of the transmission wave transmitted with the main polarization, and to "0" in the TMCC information of the transmission wave transmitted with the secondary polarization.
  • A transmission wave transmitted with the main polarization refers to a vertically or horizontally polarized signal whose polarization direction is the same as the polarization direction used for transmission of the current digital terrestrial broadcasting service. If the current service is transmitted with horizontal polarization, horizontal polarization is the main polarization and vertical polarization is the secondary polarization in the dual-polarization terrestrial digital broadcasting service; if the current service is transmitted with vertical polarization, vertical polarization is the main polarization and horizontal polarization is the secondary polarization.
  • In the broadcast receiving device 100 that receives transmission waves of the dual-polarization terrestrial digital broadcasting according to the embodiment of the present invention, the main signal identification bit makes it possible to identify whether a received transmission wave was transmitted with the main polarization or with the secondary polarization. For example, using this identification of main and secondary polarization, the initial scan described later can first scan the transmission waves transmitted with the main polarization and, after that scan is completed, perform an initial scan of the transmission waves transmitted with the secondary polarization. With such processing, the initial scan of the advanced digital terrestrial broadcasting service can be performed after the initial scan of the current digital terrestrial broadcasting service is completed, which is suitable because the settings made by the initial scan of the current service can be reflected in the settings based on the initial scan of the advanced service.
  • the meanings of the main signal identification bits "1" and "0" may be defined in the opposite way to the above explanation.
  • A polarization direction identification bit may also be used as one parameter of the TMCC information. Specifically, the broadcasting station sets the polarization direction identification bit to "1" for transmission waves transmitted with horizontal polarization and to "0" for transmission waves transmitted with vertical polarization. In the broadcast receiving apparatus 100 that receives transmission waves of the dual-polarization terrestrial digital broadcasting according to the embodiment of the present invention, the polarization direction identification bit makes it possible to determine with which polarization direction a received transmission wave was transmitted.
  • A first signal/second signal identification bit may also be used as one parameter of the TMCC information. In this case, one of the horizontally and vertically polarized waves is defined as the first polarized wave and the broadcast signal of the transmission wave transmitted with it as the first signal, for which the broadcasting station sets the first signal/second signal identification bit to "1"; the other polarized wave is defined as the second polarized wave and the broadcast signal of the transmission wave transmitted with it as the second signal, for which the broadcasting station sets the bit to "0". In the broadcast receiving apparatus 100 that receives transmission waves of the dual-polarization terrestrial digital broadcasting according to the embodiment of the present invention, the first signal/second signal identification bit makes it possible to identify with which polarized wave a received transmission wave was transmitted.
  • The first signal/second signal identification bit differs from the concepts of "main polarization" and "secondary polarization" used in the definition of the main signal identification bit described above. However, the processing and effects in the broadcast receiving apparatus 100 are the same as those already described for the main signal identification bit, reading "main polarization" as "first polarized wave" and "secondary polarization" as "second polarized wave", so the explanation is not repeated.
  • When the broadcast wave is the single-polarization digital terrestrial broadcasting service according to this embodiment, or is not an advanced digital terrestrial broadcasting service, the main signal identification, polarization direction identification, and first signal/second signal identification are not required, and the parameter may be set to "1".
  • Instead of the main signal identification bit described above, an upper/lower layer identification bit may be used as one parameter of the TMCC information. For a modulated wave transmitted in the upper layer of layer-division-multiplexed terrestrial digital broadcasting, the upper/lower layer identification bit is set to "1"; for a modulated wave transmitted in the lower layer, it is set to "0". If the broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter may be set to "1". When the broadcast receiving apparatus 100 receives transmission waves of layer-division-multiplexed terrestrial digital broadcasting, the upper/lower layer identification bit makes it possible to identify whether a modulated wave was originally transmitted in the upper layer or in the lower layer. With this identification, the initial scan of the advanced digital terrestrial broadcasting service transmitted in the lower layer can be performed after the initial scan of the current digital terrestrial broadcasting service transmitted in the upper layer is completed, and the settings made by the initial scan of the current service can be reflected in the settings made by the initial scan of the advanced service. In the third tuner/demodulator 130L of the broadcast receiving apparatus 100, the identification result can also be used to switch the processing of the demodulator 133S and the demodulator 133L.
  • FIG. 5I shows an example of bit allocation for 4K signal transmission layer identification.
  • When the broadcast wave is the dual-polarization terrestrial digital broadcasting service of this embodiment, it is sufficient that the 4K signal transmission layer identification bits indicate, for each of the B layer and the C layer, whether a 4K broadcast program is transmitted using both the horizontally polarized signal and the vertically polarized signal. One bit is assigned to each of the B layer and C layer settings. For example, when the 4K signal transmission layer identification bit for a layer is "0", it indicates that a 4K broadcast program is transmitted in that layer using both the horizontally polarized signal and the vertically polarized signal; when the bit is "1", it indicates that no such 4K broadcast program is transmitted in that layer. In this way, the broadcast receiving apparatus 100 can identify, for each of the B layer and the C layer, whether a 4K broadcast program using both the horizontally polarized signal and the vertically polarized signal is transmitted.
  • Alternatively, it is sufficient that the bits of the 4K signal transmission layer identification indicate, for each of the B layer and the C layer, whether a 4K broadcast program is transmitted. One bit is assigned to each of the B layer and C layer settings. For example, when the 4K signal transmission layer identification bit for a layer is "0", it may indicate that a 4K broadcast program is transmitted in that layer; when the bit is "1", it may indicate that no 4K broadcast program is transmitted in that layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bits to identify, for each of the B layer and the C layer, whether a 4K broadcast program is transmitted.
  • When the broadcast wave is the layer-division-multiplexed terrestrial digital broadcasting service of this embodiment, it is sufficient that the 4K signal transmission layer identification bit indicates whether a 4K broadcast program is transmitted in the lower layer. When parameter B119 is "0", a 4K broadcast program is transmitted in the lower layer; when B119 is "1", no 4K broadcast program is transmitted in the lower layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bit to identify whether a 4K broadcast program is transmitted in the lower layer. In this case, parameter B118 may be left undefined. If the broadcast wave is not an advanced terrestrial digital broadcasting service, each of these parameters may be set to "1".
  • FIG. 5J shows an example of bit allocation for additional layer transmission identification.
  • When the broadcast wave to be transmitted is the dual-polarization terrestrial digital broadcasting service of this embodiment, it is sufficient that the bits of the additional layer transmission identification indicate, for each of the B layer and C layer of the transmission wave transmitted with the secondary polarization, whether that layer is used as the virtual D layer or the virtual E layer. The bit placed in B120 is the D layer transmission identification bit: when this parameter is "0", the B layer transmitted with the secondary polarization is used as the virtual D layer; when it is "1", that B layer is not used as the virtual D layer but as the B layer. The bit placed in B121 is the E layer transmission identification bit: when this parameter is "0", the C layer transmitted with the secondary polarization is used as the virtual E layer; when it is "1", that C layer is not used as the virtual E layer but as the C layer.
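A receiver-side interpretation of the additional layer transmission identification bits described above could be sketched as follows. Per the text, B120 = "0" means the secondary-polarization B layer is used as the virtual D layer, and B121 = "0" means the secondary-polarization C layer is used as the virtual E layer; the function and label names are illustrative assumptions.

```python
def effective_layers(b120: str, b121: str) -> dict[str, str]:
    """Map the secondary-polarization B and C layers to their effective names.

    b120: D layer transmission identification bit ("0" or "1").
    b121: E layer transmission identification bit ("0" or "1").
    """
    return {
        "B": "virtual D" if b120 == "0" else "B",
        "C": "virtual E" if b121 == "0" else "C",
    }

assert effective_layers("0", "1") == {"B": "virtual D", "C": "C"}
assert effective_layers("1", "0") == {"B": "B", "C": "virtual E"}
```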
  • When the virtual D layer and virtual E layer are used, parameters such as the carrier modulation mapping method, coding rate, and time interleaving length shown in FIG. 5C can be made different between the virtual D layer/virtual E layer and the B layer/C layer. In this case, if the current/next information of parameters such as the carrier modulation mapping method, convolutional coding rate, and time interleaving length for the virtual D layer/virtual E layer is transmitted using AC information (for example, AC1) or the like, the broadcast receiving apparatus 100 can grasp those parameters for the virtual D layer/virtual E layer. Alternatively, the meaning of the B layer and/or C layer transmission parameters in the current information/next information of the TMCC information transmitted with the secondary polarization may be switched to the transmission parameters of the virtual D layer and/or virtual E layer. That is, when the virtual D layer and/or virtual E layer is used, the A layer, B layer, and C layer are used in the main polarization and their transmission parameters are transmitted in the current information/next information of the TMCC information transmitted with the main polarization, while in the secondary polarization the A layer, D layer, and E layer are used and their transmission parameters are transmitted in the current information/next information of the TMCC information transmitted with the secondary polarization. Either way, the broadcast receiving apparatus 100 can grasp parameters such as the carrier modulation mapping method, convolutional coding rate, and time interleaving length for the virtual D layer/virtual E layer.
  • If the broadcast wave is not the dual-polarization terrestrial digital broadcasting service of this embodiment, this parameter may be configured to be set to "1". The parameter for additional layer transmission identification may be stored in both the TMCC information of the main polarization and that of the secondary polarization, but if it is stored at least in the TMCC information of the secondary polarization, both of the processes described above are possible.
  • When the 4K signal transmission layer identification parameter indicates that a 4K broadcast program is transmitted in the B layer while the D layer transmission identification bit indicates that the B layer is used as the virtual D layer, the broadcast receiving apparatus 100 may ignore the D layer transmission identification bit. Similarly, when the 4K signal transmission layer identification parameter indicates that a 4K broadcast program is transmitted in the C layer while the E layer transmission identification bit indicates that the C layer is used as the virtual E layer, the broadcast receiving apparatus 100 may be configured to ignore the E layer transmission identification bit.
  • When the system identification parameter is not "10", all of these bits are set to "1". If the system identification parameter is not "10" but, due to some problem, the frequency conversion processing identification bit, physical channel number identification bit, main signal identification bit, 4K signal transmission layer identification bit, or additional layer transmission identification bit is not "1", the broadcast receiving device 100 may be configured to ignore the bit that is not "1" and treat all of these bits as "1".
  • FIG. 5K shows an example of the "coding rate" bits shown in FIG. 5C, that is, bit allocation for error correction coding rate identification.
  • As described above, the advanced digital terrestrial broadcasting service of 4K broadcasting can be broadcast together with the digital terrestrial broadcasting service of 2K broadcasting, and an LDPC code can be used as the inner code. Therefore, the error correction coding rate identification bits according to this embodiment shown in FIG. 5K are not dedicated to convolutional codes but are configured to also correspond to LDPC codes. The coding rate can be set independently depending on whether the inner code of the target digital terrestrial broadcasting service is a convolutional code or an LDPC code, so a group of coding rate options suitable for each coding method can be adopted in the digital broadcasting system.
  • When the identification bit is "000", the coding rate is 1/2 if the inner code is a convolutional code, and 2/3 if the inner code is an LDPC code. When the identification bit is "001", the coding rate is 2/3 for a convolutional code and 3/4 for an LDPC code. When "010", the coding rate is 3/4 for a convolutional code and 5/6 for an LDPC code. When "011", the coding rate is 5/6 for a convolutional code and 2/16 for an LDPC code. When "100", the coding rate is 7/8 for a convolutional code and 6/16 for an LDPC code. When "101", the value is undefined for a convolutional code, and the coding rate is 10/16 for an LDPC code. When "110", the value is undefined for a convolutional code, and the coding rate is 14/16 for an LDPC code. For an unused layer, or when next information does not exist, this parameter is set to "111".
  • Note that the coding rate 2/3 above may be replaced by the coding rate 81/120, the coding rate 3/4 by 89/120, and the coding rate 5/6 by 101/120. Coding rates such as 8/16 and 12/16 may also be assigned.
  • Whether the inner code of the target digital terrestrial broadcasting service is a convolutional code or an LDPC code may be identified from the result of identifying whether the service is a current digital terrestrial broadcasting service or an advanced digital terrestrial broadcasting service, for example using the identification bits described with reference to FIG. 5D or FIG. 5I. If the target service is a current digital terrestrial broadcasting service, the inner code can be identified as a convolutional code; if it is an advanced digital terrestrial broadcasting service, the inner code can be identified as an LDPC code. As another example, the identification may be based on the error correction method identification bits described later with reference to FIG. 6I.
  • In the advanced digital terrestrial broadcasting service using the dual-polarization transmission method, the TMCC information of the transmission wave transmitted with horizontal polarization and that of the transmission wave transmitted with vertical polarization may be the same or may differ. Likewise, in the advanced digital terrestrial broadcasting service using the layer division multiplexing transmission method, the TMCC information of the transmission wave transmitted in the upper layer and that of the transmission wave transmitted in the lower layer may be the same or may differ. The frequency conversion processing identification parameter, main signal identification parameter, additional layer transmission identification, and the like described above may be described only in the TMCC information of the transmission wave transmitted with the secondary polarization or of the transmission wave transmitted in the lower layer.
  • The above explanation assumed that the frequency conversion processing identification parameter, main signal identification parameter, polarization direction identification parameter, first signal/second signal identification parameter, upper/lower layer identification parameter, 4K signal transmission layer identification parameter, and additional layer transmission identification parameter are included in the TMCC signal (TMCC carrier) and transmitted. However, these parameters may instead be included in an AC signal (AC carrier) and transmitted. That is, these parameters may be transmitted on a carrier (TMCC carrier, AC carrier, or the like) modulated with a modulation method that maps to fewer states than the data carrier modulation method. The AC signal is an additional information signal related to broadcasting, such as additional information on the transmission control of modulated waves or seismic motion warning information, and additional information on transmission control can be transmitted on any AC carrier.
  • FIG. 6A shows an example of bit allocation for an AC signal.
  • the AC signal consists of 204 bits (B0 to B203).
  • B0 is the demodulation reference signal for the AC symbol and has predetermined amplitude and phase references.
  • B1 to B3 are signals for identifying the configuration of the AC signal.
  • B4 to B203 are used to transmit additional information related to modulated wave transmission control or seismic motion warning information.
  • FIG. 6B shows an example of bit allocation for AC signal configuration identification.
  • When seismic motion warning information is transmitted, this parameter is set to "001" or "110". The configuration identification parameter ("001" or "110") has the same code as the first 3 bits (B1 to B3) of the synchronization signal of the TMCC signal, and the two values are transmitted alternately frame by frame at the same timing as the TMCC signal. When this parameter has a value other than the above, it indicates that additional information on the transmission control of modulated waves is transmitted using B4 to B203 of the AC signal; in this case, the AC signal configuration identification parameters are "000" and "111", "010" and "101", or "011" and "100", transmitted alternately frame by frame. In either case, B4 to B203 of the AC signal are used to transmit the additional information related to modulated wave transmission control or the seismic motion warning information.
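The configuration identification values described above come in frame-alternating pairs, with "001"/"110" indicating seismic motion warning information and the remaining pairs indicating transmission control information. A sketch of how a receiver might classify a received code follows; the dictionary and function names are illustrative assumptions.

```python
# Frame-alternating configuration identification pairs per the text.
CONFIG_ID_PAIRS = {
    ("001", "110"): "seismic motion warning information",
    ("000", "111"): "transmission control information",
    ("010", "101"): "transmission control information",
    ("011", "100"): "transmission control information",
}

def payload_type(code: str) -> str:
    """Classify the AC signal payload from a 3-bit configuration code."""
    for pair, meaning in CONFIG_ID_PAIRS.items():
        if code in pair:
            return meaning
    raise ValueError(f"unknown configuration identification {code!r}")

assert payload_type("110") == "seismic motion warning information"
assert payload_type("101") == "transmission control information"
```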
  • Additional information on modulated wave transmission control may be transmitted with various bit configurations. For example, bits may be assigned, instead of or in addition to the TMCC signal, to the frequency conversion processing identification, physical channel number identification, main signal identification, 4K signal transmission layer identification, additional layer transmission identification, and the like mentioned in the explanation of the TMCC signal; the broadcast receiving apparatus 100 can then use these parameters to perform the various identification processes already described for the TMCC signal. Bits may also be assigned to the current/next information of the transmission parameters for the virtual D layer/virtual E layer; the broadcast receiving apparatus 100 can then acquire the transmission parameters of each layer from these parameters and control the demodulation processing of each layer.
  • the seismic motion warning information includes a synchronization signal, start/end flag, update flag, signal identification, seismic motion warning detailed information, CRC, parity bit, and the like.
  • The synchronization signal is composed of a 13-bit code, the same code as the 13 bits (B4 to B16) of the synchronization signal of the TMCC signal excluding its first three bits. As a result, the 16-bit code combining the configuration identification and the synchronization signal becomes the same 16-bit synchronization word as the TMCC synchronization signal.
  • the start/end flag is composed of a 2-bit code as a flag for the start timing/end timing of the seismic motion warning information.
  • The start/end flag is changed from "11" to "00" at the start of sending out seismic motion warning information, and from "00" to "11" at the end of sending out seismic motion warning information.
  • The update flag consists of a 2-bit code. Starting from an initial value of "00", it is incremented by 1 each time the contents of the series of seismic motion warning detailed information transmitted while the start/end flag is "00" change; after "11" it returns to "00". When the start/end flag is "11", the update flag is also "11".
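For illustration (not part of the original specification), the wraparound behaviour of the 2-bit update flag described above can be sketched in Python as follows; the function name is hypothetical:

```python
def next_update_flag(current: str) -> str:
    """Advance the 2-bit update flag by 1 when the detailed warning
    information changes: "00" -> "01" -> "10" -> "11" -> "00"."""
    return format((int(current, 2) + 1) % 4, "02b")
```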
  • FIG. 6D shows an example of bit allocation for signal identification.
  • the signal identification consists of a 3-bit code, and is used to identify the type of detailed seismic motion warning information.
  • When this parameter is "000", it means "seismic motion warning detailed information (applicable area exists)".
  • When this parameter is "001", it means "seismic motion warning detailed information (no applicable area)".
  • When this parameter is "010", it means "test signal of seismic motion warning detailed information (applicable area exists)".
  • When this parameter is "011", it means "test signal of seismic motion warning detailed information (no applicable area)".
  • When this parameter is "111", it means "no seismic motion warning detailed information".
  • When the start/end flag is "00", the signal identification is "000", "001", "010", or "011".
  • When the start/end flag is "11", the signal identification is "111".
  • the seismic motion warning detailed information is composed of an 88-bit code.
  • The seismic motion warning detailed information transmits, for example, the current time at which the seismic motion warning information is sent, the area targeted by the seismic motion warning, and the latitude/longitude of the epicenter and the intensity of the earthquake that is the subject of the warning.
  • FIG. 6E shows an example of bit allocation of the seismic motion warning detailed information when the signal identification is "000", “001", “010", or "011".
  • FIG. 6F shows an example of bit allocation of the seismic motion warning detailed information when the signal identification is "111".
  • CRC is a code generated using a predetermined generating polynomial for B21 to B111 of the seismic motion warning information.
  • the parity bit is a code generated by the shortened code (187, 105) of the difference set cyclic code (273, 191) for B17 to B121 of the seismic motion warning information.
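The CRC computation described above is ordinary polynomial long division over the indicated bit range. The sketch below (Python) illustrates the mechanism only; the actual generator polynomial of the seismic motion warning CRC is not given here, so the polynomial used in the test is a placeholder:

```python
def crc_remainder(bits, poly, width):
    """Long division of a bit sequence (MSB first) by a generator
    polynomial; `poly` includes the x^width term. Returns the
    `width`-bit remainder."""
    reg = 0
    for b in bits + [0] * width:   # append `width` zero bits to the message
        reg = (reg << 1) | b
        if (reg >> width) & 1:     # top bit set: subtract (XOR) the polynomial
            reg ^= poly
    return reg & ((1 << width) - 1)
```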
  • In the broadcast receiving device 100, various controls for dealing with an emergency can be performed using the parameters related to the seismic motion warning described in FIGS. 6C, 6D, 6E, and 6F: for example, control to present information related to the seismic motion warning, control to switch low-priority display content to the warning-related display, and control to terminate an application display and switch to the warning-related display or the broadcast program video.
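As an illustrative sketch of how a receiver might gate such controls (Python; the function and table names are hypothetical, while the bit values follow the description above):

```python
# 3-bit signal identification values that carry (test) warning details.
ACTIVE_IDS = {"000", "001", "010", "011"}

def warning_active(start_end_flag: str, signal_id: str) -> bool:
    """Seismic motion warning information is being sent while the
    start/end flag is "00"; when the flag is "11", the signal
    identification is "111" (no warning details)."""
    return start_end_flag == "00" and signal_id in ACTIVE_IDS
```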
  • FIG. 6G shows an example of bit allocation of additional information regarding modulated wave transmission control.
  • Additional information regarding transmission control of modulated waves includes a synchronization signal, current information, next information, parity bit, and the like.
  • the synchronization signal is composed of a 13-bit code, and is the same code as the 13 bits (B4 to B16) excluding the first three bits of the synchronization signal of the TMCC signal.
  • the synchronization signal does not have to have the same code as the 13 bits (B4 to B16) excluding the first three bits of the synchronization signal of the TMCC signal.
  • The 16-bit code that combines the configuration identification and the synchronization signal becomes a 16-bit synchronization word conforming to the TMCC synchronization signal. It may also be a 16-bit synchronization word different from the TMCC synchronization signal.
  • The current information indicates the current values of the transmission parameter additional information when a 4K broadcast program is transmitted in the B layer or C layer, and of the transmission parameters regarding the virtual D layer or the virtual E layer.
  • The next information indicates the post-switching values of the transmission parameter additional information when a 4K broadcast program is transmitted in the B layer or C layer, and of the transmission parameters regarding the virtual D layer or the virtual E layer.
  • B18 to B30 of the current information are the B-layer transmission parameter additional information, indicating the current values of the transmission parameter additional information when a 4K broadcast program is transmitted in the B layer.
  • B31 to B43 of the current information are the C-layer transmission parameter additional information, indicating the current values of the transmission parameter additional information when a 4K broadcast program is transmitted in the C layer.
  • B70 to B82 of the next information are the B-layer transmission parameter additional information after switching, that is, the post-switching values of the transmission parameter additional information when a 4K broadcast program is transmitted in the B layer.
  • B83 to B95 of the next information are the C-layer transmission parameter additional information after switching, that is, the post-switching values of the transmission parameter additional information when a 4K broadcast program is transmitted in the C layer.
  • The transmission parameter additional information is a set of modulation-related transmission parameters that extends the specification of the transmission parameters of the TMCC information shown in FIG. 5C. Its specific contents will be described later.
  • B44 to B56 of the current information are the current values of the transmission parameters for the virtual D layer when the virtual D layer is operated.
  • B57 to B69 of the current information are the current values of the transmission parameters for the virtual E layer when the virtual E layer is operated.
  • B96 to B108 of the next information are information after the transmission parameters for the virtual D layer are switched when the virtual D layer is operated.
  • B109 to B121 of the next information are the information after the transmission parameters for the virtual E layer are switched when the virtual E layer is operated.
  • the parameters stored in the transmission parameters for the virtual D layer and the transmission parameters for the virtual E layer may be the same as those shown in FIG. 5C.
  • the virtual D layer and the virtual E layer are layers that do not exist in current digital terrestrial broadcasting.
  • the TMCC information in FIG. 5B needs to maintain compatibility with current terrestrial digital broadcasting, so it is not easy to increase the number of bits. Therefore, in the embodiment of the present invention, the transmission parameters for the virtual D layer and the virtual E layer are stored in the AC information, as shown in FIG. 6G, instead of in the TMCC information.
  • the broadcast receiving device 100 may be configured to ignore any value contained in the transmission parameters shown in FIG. 6G for the unused virtual D layer or virtual E layer.
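The current/next layout listed above can be summarized as a table of 13-bit fields. The following sketch (Python; names are hypothetical) indexes an AC payload bit string so that position n corresponds to bit Bn:

```python
# Inclusive bit ranges of the 13-bit current/next fields listed above.
AC_FIELDS = {
    "current_B":     (18, 30),  "current_C":     (31, 43),
    "current_virtD": (44, 56),  "current_virtE": (57, 69),
    "next_B":        (70, 82),  "next_C":        (83, 95),
    "next_virtD":    (96, 108), "next_virtE":    (109, 121),
}

def extract_field(payload: str, name: str) -> str:
    """`payload` is a bit string indexed so that payload[n] is bit Bn."""
    lo, hi = AC_FIELDS[name]
    return payload[lo:hi + 1]
```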
  • FIG. 6H shows a specific example of transmission parameter additional information.
  • the transmission parameter additional information can include error correction method parameters, constellation format parameters, and the like.
  • The error correction method parameter indicates which encoding methods are used as the inner code and the outer code for error correction when a 4K broadcast program (advanced terrestrial digital broadcasting service) is transmitted in the B layer or C layer.
  • FIG. 6I shows an example of bit allocation for the error correction method.
  • When this parameter is "000", a convolutional code is used as the inner code and a shortened RS code is used as the outer code when transmitting a 4K broadcast program in the B layer or the C layer.
  • When this parameter is "001", an LDPC code is used as the inner code and a BCH code is used as the outer code.
  • other combinations may be set and selected.
  • FIG. 6J shows an example of bit allocation in a constellation format.
  • When this parameter is "000", the carrier modulation mapping method selected by the transmission parameter of the TMCC information is applied with a uniform constellation.
  • When this parameter is one of "001" to "111", the carrier modulation mapping method selected by the transmission parameter of the TMCC information is applied with a non-uniform constellation. Note that when a non-uniform constellation is applied, its optimal form differs depending on the type of error correction method, its coding rate, and so on.
  • Therefore, the broadcast receiving apparatus 100 of this embodiment may determine the non-uniform constellation to be used in the demodulation processing not only from the carrier modulation mapping parameter but also based on the error correction method parameter and its coding rate. This determination may be made by referring to a predetermined table stored in advance in the broadcast receiving apparatus 100.
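A minimal sketch of such a table lookup (Python); the table entries and names are placeholders, not standardized values, and only the "000" = uniform rule follows from the description above:

```python
# Hypothetical lookup: optimal non-uniform constellation keyed by the
# error correction method and its coding rate (placeholder values).
NUC_TABLE = {
    ("LDPC", "2/3"): "NUC-A",
    ("LDPC", "3/4"): "NUC-B",
}

def select_constellation(format_bits: str, fec: str, rate: str) -> str:
    """"000" keeps the uniform constellation selected by the TMCC
    transmission parameter; other values pick a non-uniform
    constellation from the stored table."""
    if format_bits == "000":
        return "uniform"
    return NUC_TABLE[(fec, rate)]
```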
  • The dual-polarization transmission system is a system that shares some specifications with the current digital terrestrial broadcasting system. For example, the approximately 6 MHz band corresponding to one physical channel is divided into 13 segments, of which 7 segments are used to transmit a 2K (1920 x 1080 pixels) broadcast program and 5 segments are used to transmit a 4K broadcast program.
  • One segment is allocated to mobile reception (so-called one-segment broadcasting). Furthermore, the five segments for 4K broadcasting use not only horizontally polarized signals but also vertically polarized signals, securing a total transmission capacity of 10 segments using MIMO (Multiple-Input Multiple-Output) technology.
  • 2K broadcast programs maintain image quality through optimization of MPEG-2 Video compression technology so that they can be received by current television receivers, while 4K broadcast programs ensure image quality through HEVC compression technology, which is more efficient than MPEG-2 Video, together with optimization of the modulation and higher-order constellations. Note that the number of segments allocated to each broadcast may differ from the above.
  • FIG. 7A shows an example of a dual-polarization transmission system in an advanced terrestrial digital broadcasting service according to an embodiment of the present invention.
  • a frequency band of 470 MHz to 710 MHz is used to transmit broadcast waves for digital terrestrial broadcasting services.
  • The number of physical channels in the frequency band of 470 MHz to 710 MHz is 40, numbered 13 to 52, and each physical channel has a bandwidth of 6 MHz.
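The channel numbering above implies a simple mapping from physical channel number to band edges, sketched here in Python for illustration (the function name is hypothetical):

```python
def physical_channel_band_mhz(ch: int) -> tuple:
    """Band edges of a 6 MHz physical channel: channels 13 to 52 tile
    470-710 MHz, so channel 13 occupies 470-476 MHz."""
    if not 13 <= ch <= 52:
        raise ValueError("physical channel must be 13 to 52")
    low = 470.0 + (ch - 13) * 6.0
    return (low, low + 6.0)
```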
  • a dual polarization transmission system according to an embodiment of the present invention uses both horizontally polarized signals and vertically polarized signals within one physical channel.
  • FIG. 7A shows two examples (1) and (2) regarding the allocation example of 13 segments.
  • a 2K broadcast program is transmitted using segments 1 to 7 (B layer) of the horizontally polarized signal.
  • a 4K broadcast program is transmitted using a total of 10 segments: horizontally polarized signal segments 8 to 12 (C layer) and vertically polarized signal segments 8 to 12 (C layer).
  • Segments 1 to 7 (layer B) of the vertically polarized signal may be used to transmit the same broadcast program as the 2K broadcast program transmitted by segments 1 to 7 (layer B) of the horizontally polarized signal.
  • the vertically polarized signal segments 1 to 7 (B layer) may be used to transmit a broadcast program different from the 2K broadcast program transmitted in the horizontally polarized signal segments 1 to 7 (B layer).
  • segments 1 to 7 (layer B) of the vertically polarized signal may be used for other data transmission or may be left unused.
  • Identification information on how segments 1 to 7 (B layer) of the vertically polarized signal are used can be transmitted using the parameters of the 4K signal transmission layer identification and the additional layer transmission identification of the TMCC signal, which have already been explained. Using these parameters, the broadcast receiving apparatus 100 can identify how to handle segments 1 to 7 (B layer) of the vertically polarized signal.
  • Note that the 2K broadcast program transmitted using the B layer of the horizontally polarized signal and the 4K broadcast program transmitted using the C layer of both the horizontally and vertically polarized signals may be a simulcast in which programs with the same content are transmitted at different resolutions, or may be programs with different content. Segment 0 of both the horizontally and vertically polarized signals transmits the same one-segment broadcast program.
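Allocation example (1) above can be summarized as a segment-to-layer mapping; the following Python sketch is illustrative only (the function name is hypothetical):

```python
def layer_of_segment(seg: int) -> str:
    """Allocation example (1): segment 0 is the one-segment service,
    segments 1-7 form the B layer (2K program), and segments 8-12 form
    the C layer (4K program, via MIMO on both polarizations)."""
    if seg == 0:
        return "one-seg"
    if 1 <= seg <= 7:
        return "B"
    if 8 <= seg <= 12:
        return "C"
    raise ValueError("segment must be 0 to 12")
```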
  • the example (2) in FIG. 7A is a modification different from (1).
  • a 4K broadcast program is transmitted using a total of 10 segments, segments 1 to 5 of horizontally polarized signals (layer B) and segments 1 to 5 of vertically polarized signals (layer B).
  • a 2K broadcast program is transmitted using segments 6 to 12 (layer C) of horizontally polarized signals.
  • Segments 6 to 12 (C layer) of the vertically polarized signal may be used to transmit the same 2K broadcast program as that transmitted in segments 6 to 12 (C layer) of the horizontally polarized signal, may be used to transmit a different broadcast program, may be used for other data transmission, or may be left unused. The identification information is the same as in example (1) and will not be explained again.
  • FIG. 7B shows an example of the configuration of a broadcasting system for an advanced terrestrial digital broadcasting service using a dual-polarization transmission system according to an embodiment of the present invention.
  • This shows both the transmitting side system and the receiving side system of an advanced terrestrial digital broadcasting service using a dual-polarization transmission system.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the dual-polarization transmission method is basically the same as the configuration of the broadcasting system shown in FIG. 1, except that the radio tower 300T, which is equipment of the broadcasting station, becomes a polarization-shared transmitting antenna that can simultaneously transmit a horizontally polarized signal and a vertically polarized signal.
  • The horizontally polarized signal sent from the radio tower 300T is received by the horizontally polarized receiving element of the antenna 200T, which is a dual-polarization receiving antenna, and is input from the connector section 100F1 to the tuning/detection section 131H via the coaxial cable 202T1.
  • the vertically polarized signal transmitted from the radio tower 300T is received by the vertically polarized wave receiving element of the antenna 200T, and is inputted from the connector section 100F2 to the channel selection/detection section 131V via the coaxial cable 202T2.
  • An F-type connector is generally used for the connector section that connects an antenna (coaxial cable) to a television receiver. To prevent the horizontally polarized signal and the vertically polarized signal from being connected the wrong way around, one of the two connections, for example the coaxial cable 202T1 carrying the horizontally polarized signal and the connector section 100F1, may use a connector with a shape different from the F-type connector.
  • The channel selection/detection section 131H and the channel selection/detection section 131V may identify whether the input broadcast signal is a horizontally polarized signal or a vertically polarized signal by referring to the main signal identification of the TMCC information of each input signal, and control processing accordingly.
  • antenna 200T and broadcast receiving apparatus 100 may be connected by one multicore coaxial cable.
  • FIG. 7C shows an example of a configuration different from the above-mentioned configuration of a broadcasting system for an advanced terrestrial digital broadcasting service using a dual-polarization transmission system according to an embodiment of the present invention.
  • The configuration shown in FIG. 7B, in which the broadcast receiving device 100 includes two broadcast signal input connector sections and two coaxial cables connect the antenna 200T and the broadcast receiving device 100, is not necessarily advantageous in terms of equipment cost, handling during cable wiring, and so on. Therefore, in the configuration shown in FIG. 7C, a conversion unit (converter) 201T is provided, and the converter 201T and the broadcast receiving device 100 are connected by a single coaxial cable 202T3.
  • the broadcast signal input from the connector section 100F3 is demultiplexed and input to the channel selection/detection section 131H and the channel selection/detection section 131V.
  • the connector section 100F3 may have a function of supplying operating power to the conversion section 201T.
  • the conversion unit 201T may belong to equipment in an environment (for example, an apartment complex, etc.) in which the broadcast receiving device 100 is installed. Alternatively, it may be configured as a device integrated with the antenna 200T and installed in a house or the like.
  • The conversion unit 201T performs frequency conversion processing on either the horizontally polarized signal received by the horizontally polarized receiving element of the antenna 200T or the vertically polarized signal received by the vertically polarized receiving element. Through this processing, the horizontally and vertically polarized signals, which were transmitted from the radio tower 300T to the antenna 200T in the same frequency band, are separated into different frequency bands and can be transmitted simultaneously to the broadcast receiving apparatus 100 over the single coaxial cable 202T3. If necessary, frequency conversion processing may be performed on both the horizontally polarized signal and the vertically polarized signal, but in this case too, the frequency bands of the two signals after conversion must differ from each other. Furthermore, the broadcast receiving apparatus 100 then only needs to include one broadcast signal input connector section 100F3.
  • FIG. 7D shows an example of frequency conversion processing.
  • frequency conversion processing is performed on the vertically polarized signal.
  • Specifically, the frequency band of the vertically polarized signal is converted from the 470 MHz to 710 MHz band into the 770 MHz to 1010 MHz band.
  • As a result, signals transmitted using horizontally polarized waves and vertically polarized waves in the same frequency band can be transmitted simultaneously to the broadcast receiving apparatus 100 over a single coaxial cable 202T3 without mutual interference. Note that the frequency conversion processing may instead be performed on the horizontally polarized signal.
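The conversion in this example is a fixed 300 MHz shift of one polarization, sketched below in Python for illustration (the function name is hypothetical):

```python
def convert_secondary_mhz(f_mhz: float, offset_mhz: float = 300.0) -> float:
    """Shift one polarization out of the shared 470-710 MHz band; with
    the 300 MHz offset of this example the result lies in 770-1010 MHz."""
    if not 470.0 <= f_mhz <= 710.0:
        raise ValueError("expected a frequency in the 470-710 MHz band")
    return f_mhz + offset_mhz
```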
  • It is preferable that the frequency conversion processing be performed on the signal transmitted with the secondary polarization, in accordance with the result of referring to the main signal identification of the TMCC information.
  • The signal transmitted using the main polarization is more likely than the signal transmitted using the secondary polarization to include the current digital terrestrial broadcasting service. Therefore, to better maintain compatibility with current digital terrestrial broadcasting services, it is suitable to frequency-convert the signal transmitted with the secondary polarization while leaving the signal transmitted with the main polarization unconverted.
  • It is also desirable that, after conversion, the frequency band of the signal transmitted with the secondary polarization be higher than that of the signal transmitted with the main polarization. Then, if the initial scan of the broadcast receiving device 100 starts from the low-frequency side and advances toward the high-frequency side, the signal transmitted with the main polarization is scanned before the signal transmitted with the secondary polarization. As a result, the process of reflecting settings based on the initial scan of the current digital terrestrial broadcasting service into settings based on the initial scan of the advanced digital terrestrial broadcasting service can be performed more appropriately.
  • Frequency conversion processing may be performed on all physical channels used in advanced terrestrial digital broadcasting services, or only on physical channels that use signal transmission by the dual-polarization transmission method.
  • The frequency band after conversion by the frequency conversion processing is preferably between 710 MHz and 1032 MHz. This is because, when attempting to receive a terrestrial digital broadcasting service and a BS/CS digital broadcasting service at the same time, it is conceivable to mix the broadcast signal of the terrestrial digital broadcasting service received by the antenna 200T with the broadcast signal of the BS/CS digital broadcasting service received by the antenna 200B and transmit them to the broadcast receiving apparatus 100 via a single coaxial cable.
  • Since the BS/CS-IF signal uses a frequency band of approximately 1032 MHz to 2150 MHz, setting the band after frequency conversion between 710 MHz and 1032 MHz makes it possible to avoid interference between the horizontally polarized signal and the vertically polarized signal while also avoiding interference between the broadcast signal of the terrestrial digital broadcast service and that of the BS/CS digital broadcast service.
  • Furthermore, since the frequency band of 770 MHz or less (corresponding to UHF channel 62 or below) is used for TV broadcast distribution by cable television stations, it is even more preferable that the frequency band after frequency conversion processing be between 770 MHz and 1032 MHz, above the band corresponding to UHF channel 62.
  • It is preferable to set the bandwidth of the region between the frequency band before conversion and the frequency band after conversion (part (a) in the figure) to an integral multiple of the bandwidth of one physical channel (6 MHz).
  • In this way, frequency setting control can be performed easily when, for example, frequency scanning is performed on both the broadcast signals in the frequency band before conversion and those in the frequency band after conversion.
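The "integral multiple of 6 MHz" condition above can be checked mechanically; the following Python sketch is illustrative (the function name is hypothetical):

```python
def gap_is_channel_multiple(band_top_mhz: float, converted_low_mhz: float,
                            ch_bw_mhz: float = 6.0) -> bool:
    """True when the gap between the pre-conversion band top and the
    converted band bottom (part (a)) is a non-negative integral
    multiple of one physical channel bandwidth (6 MHz)."""
    gap = converted_low_mhz - band_top_mhz
    n = round(gap / ch_bw_mhz)
    return gap >= 0 and abs(gap - n * ch_bw_mhz) < 1e-9
```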
  • In the dual-polarization transmission system, both horizontally polarized and vertically polarized signals are used to transmit a 4K broadcast program. To reproduce a 4K broadcast program correctly, the receiving side therefore needs to correctly grasp which physical channels of the horizontally polarized broadcast signal and the vertically polarized broadcast signal belong together. Even when frequency conversion processing has been performed and the broadcast signals transmitted with horizontal polarization and vertical polarization on the same physical channel are input to the receiving device as signals in different frequency bands, the broadcast receiving apparatus 100 of this embodiment can identify this combination by appropriately referring to the parameters of the TMCC information already described (for example, the main signal identification and the physical channel number identification), and can thus suitably receive, demodulate, and reproduce the 4K broadcast program.
  • Although FIGS. 7B, 7C, and 7D illustrate cases in which horizontal polarization is the main polarization, horizontal polarization and vertical polarization may be reversed depending on the operation.
  • As described above, the broadcast waves of digital terrestrial broadcasting transmitted using the dual-polarization transmission method can be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving apparatus 100, and can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100.
  • When the first tuner/demodulator 130C receives the broadcast wave of the digital terrestrial broadcast, the broadcast signals transmitted in the layers of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layers of the current terrestrial digital broadcasting service are played back.
  • Broadcast receiving device 100 is capable of receiving signals transmitted using a pass-through transmission method.
  • the pass-through transmission method is a method in which a broadcast signal received by a cable television station or the like is transmitted to a CATV distribution system using the same signal method or after frequency conversion.
  • The pass-through method includes two methods: (1) extracting the transmission signal band of each terrestrial digital broadcasting signal output from the terrestrial receiving antenna, adjusting its level, and transmitting it to the CATV facility at the same frequency as the received signal; and (2) performing the same band extraction and level adjustment and then transmitting after frequency conversion.
  • The device comprising the receiving amplifier that performs the signal processing of the first method, or the receiving amplifier and frequency converter that perform the signal processing of the second method, is called an OFDM signal processor (OFDM-SP).
  • FIG. 7E shows an example of a system configuration when the first method of the pass-through transmission method is applied to the advanced terrestrial digital broadcasting service of the dual polarization transmission method.
  • FIG. 7E shows head end equipment 400C of a cable television station and broadcast receiving apparatus 100.
  • FIG. 7F shows an example of frequency conversion processing at that time.
  • The notation (H・V) in FIG. 7F indicates the state in which both the broadcast signal transmitted with horizontal polarization and the broadcast signal transmitted with vertical polarization exist in the same frequency band; (H) indicates a broadcast signal transmitted with horizontal polarization, and (V) indicates a broadcast signal transmitted with vertical polarization.
  • the notations in FIGS. 7H and 7I below also have the same meaning.
  • When applying the first method of pass-through transmission to the advanced terrestrial digital broadcasting service of the dual-polarization transmission method according to the embodiment of the present invention, the head end equipment 400C of the cable television station performs signal band extraction and level adjustment, and sends the signal out at the same frequency as the received signal frequency.
  • That is, signal band extraction and level adjustment are performed at the head end equipment 400C of the cable television station, and frequency conversion processing similar to that already explained converts the broadcast signal into a frequency band higher than 470 MHz to 770 MHz, the band corresponding to UHF channels 13 to 62, before transmission.
  • This processing prevents the frequency bands of the horizontally polarized and vertically polarized broadcast signals from overlapping, allowing signal transmission over a single coaxial cable (or optical fiber cable).
  • the transmitted signal can be received by the broadcast receiving device 100 of this embodiment.
  • The process by which the broadcast receiving apparatus 100 of this embodiment receives and demodulates the horizontally polarized and vertically polarized broadcast signals contained in the transmitted signal is similar to that explained for FIG. 7D, so further explanation is omitted.
  • FIG. 7G shows an example of a system configuration when the second method of the pass-through transmission method is applied to the advanced terrestrial digital broadcasting service of the dual polarization transmission method.
  • FIG. 7G shows head end equipment 400C of a cable television station and broadcast receiving apparatus 100.
  • FIG. 7H shows an example of frequency conversion processing at that time.
  • When applying the second method of pass-through transmission to the advanced terrestrial digital broadcasting service of the dual-polarization transmission method according to the embodiment of the present invention, the head end equipment 400C of the cable television station performs signal band extraction and level adjustment, and transmits after frequency conversion processing to the frequency set by the CATV facility manager.
  • That is, signal band extraction and level adjustment are performed at the head end equipment 400C of the cable television station, and frequency conversion processing similar to that already explained converts the broadcast signal into a frequency band higher than 470 MHz to 770 MHz, the band corresponding to UHF channels 13 to 62, before transmission.
  • The frequency conversion processing shown in FIG. 7H differs from that in FIG. 7F in that the broadcast signal transmitted with horizontal polarization is not confined to the 470 MHz to 770 MHz band (UHF channels 13 to 62): the usable range is widened toward lower frequency bands, and the frequencies are rearranged within the range of 90 MHz to 770 MHz.
  • This processing prevents the frequency bands of broadcast signals transmitted with horizontal polarization and broadcast signals transmitted with vertical polarization from overlapping, allowing signal transmission over a single coaxial cable (or optical fiber cable).
  • the transmitted signal can be received by the broadcast receiving device 100 of this embodiment.
  • The process by which the broadcast receiving apparatus 100 of this embodiment receives and demodulates the broadcast signal transmitted with horizontal polarization and the broadcast signal transmitted with vertical polarization included in this signal is similar to that explained for FIG. 7D, so further explanation is omitted.
  • the broadcast signal at the time of pass-through output after frequency conversion may be changed from FIG. 7H to the state shown in FIG. 7I.
  • For both the horizontally polarized broadcast signal and the vertically polarized broadcast signal, signal band extraction and level adjustment are performed, and transmission may be performed after frequency conversion processing to the frequency set by the CATV facility manager.
  • In this case, frequency conversion is performed so that both the broadcast signal transmitted with horizontal polarization and the broadcast signal transmitted with vertical polarization are rearranged within the range of 90 MHz to 770 MHz (VHF channel 1 to UHF channel 62). Since this conversion does not use any frequency band exceeding UHF channel 62, the frequency band usage efficiency of the broadcast signal is higher than in FIG. 7H.
  • Since the frequency band into which the broadcast signals are rearranged is wider than the 470 MHz to 710 MHz band (UHF channels 13 to 52) used at the time of antenna reception, it is also possible, as shown in the example of the figure, to alternately rearrange the horizontally polarized broadcast signals and the vertically polarized broadcast signals.
  • If the pairs consisting of a broadcast signal transmitted with horizontal polarization and a broadcast signal transmitted with vertical polarization that belonged to the same physical channel at the time of antenna reception are rearranged alternately in channel order, then when the broadcast receiving apparatus 100 of this embodiment performs an initial scan from the low-frequency side, initial settings can be performed sequentially on each such pair in units of the originally identical physical channel, and the initial scan can be performed efficiently.
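A minimal sketch of the alternating H/V rearrangement described above is shown below. The packing of 6 MHz slots upward from 90 MHz and the helper names are illustrative assumptions, not part of any broadcast specification.

```python
# Illustrative sketch: pack the H/V pair of each original physical
# channel into adjacent 6 MHz slots starting at 90 MHz (assumption).
SLOT_BW = 6.0  # MHz, bandwidth of one physical channel

def interleave_polarizations(physical_channels, start_mhz=90.0):
    """Place the H and V signals of each original physical channel into
    adjacent 6 MHz slots, so that an initial scan from the low-frequency
    side encounters each originally-paired channel together."""
    plan = []
    freq = start_mhz
    for ch in physical_channels:
        for pol in ("H", "V"):
            plan.append((ch, pol, freq, freq + SLOT_BW))
            freq += SLOT_BW
    return plan

plan = interleave_polarizations([13, 14, 15])
# Channel 13's H and V signals now occupy the two lowest adjacent slots.
```

A scan from the low-frequency side then visits each H/V pair consecutively, which is the efficiency gain the text describes.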
  • The broadcast waves of terrestrial digital broadcasting using the dual-polarization transmission method with the pass-through transmission described above can be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving apparatus 100, as described above, but can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100.
  • When the first tuner/demodulator 130C receives the broadcast wave of the terrestrial digital broadcast, among the broadcast signals of the terrestrial digital broadcast, the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer of the current terrestrial digital broadcasting service are reproduced.
  • The single-polarization transmission system according to the embodiment of the present invention is a system that shares some specifications with the current terrestrial digital broadcasting system, and transmits data by SISO (Single-Input Single-Output) technology using either a horizontally polarized signal or a vertically polarized signal.
  • FIG. 7J shows an example of a single polarization transmission system in the advanced digital terrestrial broadcasting service according to the embodiment of the present invention.
  • a frequency band of 470 MHz to 710 MHz is used to transmit broadcast waves for digital terrestrial broadcasting services.
  • The number of physical channels in this frequency band is 40 (channels 13 to 52), and each physical channel has a bandwidth of 6 MHz.
  • 2K broadcasting service and 4K broadcasting service are transmitted simultaneously within one physical channel.
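As a rough illustration of this channel plan, the band edges of each physical channel follow directly from the 6 MHz spacing over 470 MHz to 710 MHz. The helper below is a sketch under that assumption, not a quotation of any channel table.

```python
# Illustrative sketch of the UHF channel plan described above:
# 40 physical channels (13-52), 6 MHz each, filling 470-710 MHz.
CH_BW = 6.0  # MHz, bandwidth of one physical channel

def channel_band(ch):
    """Return the (low, high) band edges in MHz of a physical channel."""
    if not 13 <= ch <= 52:
        raise ValueError("physical channel out of range 13-52")
    low = 470.0 + CH_BW * (ch - 13)
    return low, low + CH_BW

lo13, hi13 = channel_band(13)  # lowest channel, at the bottom of the band
lo52, hi52 = channel_band(52)  # highest channel, at the top of the band
```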
  • FIG. 7J shows two examples (1) and (2) regarding the allocation example of 13 segments.
  • segments 1 to 4 (layer B) are used to transmit a 4K broadcast program.
  • 2K broadcast programs are transmitted using segments 5 to 12 (C layer).
  • The 4K broadcast program transmitted using the B layer and the 2K broadcast program transmitted using the C layer may be a simulcast in which the same content is transmitted at different resolutions, or they may be broadcast programs with different content.
  • Example (2) is a modification different from (1).
  • segments 1 to 8 (layer B) are used to transmit a 2K broadcast program.
  • 4K broadcast programs are transmitted using segments 9 to 12 (C layer).
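The two 13-segment allocation examples can be summarized as a small lookup. Treating segment 0 as the layer-A partial-reception segment is an assumption carried over from the current service, and the function is purely illustrative.

```python
# Illustrative sketch of the two segment allocations above.
# Assumption: segment 0 is the layer-A partial-reception segment,
# as in the current terrestrial digital broadcasting service.
def layer_of(segment, example):
    """Return the layer ('A', 'B', or 'C') carrying the given segment."""
    if not 0 <= segment <= 12:
        raise ValueError("segment must be 0-12")
    if segment == 0:
        return "A"  # partial-reception segment (assumed)
    if example == 1:
        return "B" if 1 <= segment <= 4 else "C"   # B: 4K, C: 2K
    if example == 2:
        return "B" if 1 <= segment <= 8 else "C"   # B: 2K, C: 4K
    raise ValueError("unknown example")
```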
  • FIG. 7K shows an example of the configuration of a broadcasting system for advanced digital terrestrial broadcasting service using a single polarization transmission method according to an embodiment of the present invention.
  • This shows both the transmitting side system and the receiving side system of an advanced digital terrestrial broadcasting service using a single polarization transmission method.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the single-polarization transmission method is basically the same as the configuration of the broadcasting system shown in FIG. 1, except that the radio tower 300S, which is equipment of the broadcasting station, is a single-polarization transmitting antenna that transmits either a horizontally polarized signal or a vertically polarized signal.
  • As for the broadcast receiving apparatus 100, only the channel selection/detection section 131H of the second tuner/demodulation section 130T is excerpted and described, and other operating sections are omitted.
  • the single-polarized signal transmitted from the radio tower 300S is received by the antenna 200S, which is a single-polarized receiving antenna, and is input to the channel selection/detection section 131H from the connector section 100F3 via the coaxial cable 202S.
  • An F-type connector is generally used for a connector section that connects an antenna (coaxial cable) and a television receiver.
  • The broadcast waves of terrestrial digital broadcasting transmitted by the single-polarization transmission method described above can be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving apparatus 100, as described above, but can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100.
  • When the first tuner/demodulator 130C receives the broadcast wave of the terrestrial digital broadcast, among the broadcast signals of the terrestrial digital broadcast, the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer of the current terrestrial digital broadcasting service are reproduced.
  • Since part of the broadcast waves of terrestrial digital broadcasting transmitted using the single-polarization transmission method is transmitted in the layer of the current terrestrial digital broadcasting service (the layer that transmits 2K broadcasting in FIG. 7J), that broadcast signal can also be received by the first tuner/demodulator 130C. Therefore, by adopting a double-tuner configuration in which the second tuner/demodulator 130T and the first tuner/demodulator 130C are used simultaneously, it becomes possible to simultaneously receive and reproduce the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service and the broadcast signals transmitted in the layer of the current terrestrial digital broadcasting service.
  • FIG. 7L shows an example of the configuration of a broadcasting system for an advanced digital terrestrial broadcasting service using a single-polarization transmission method according to an embodiment of the present invention, which provides the aforementioned double tuner.
  • This shows both the transmitting side system and the receiving side system of an advanced digital terrestrial broadcasting service using a single polarization transmission method.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the single-polarization transmission method is basically the same as the configuration of the broadcasting system shown in FIG. 1, except that the radio tower 300S, which is equipment of the broadcasting station, is a single-polarization transmitting antenna that transmits either a horizontally polarized signal or a vertically polarized signal.
  • the single-polarized signal transmitted from the radio tower 300S is received by the antenna 200S, which is a single-polarized receiving antenna, and is input to the broadcast receiving device 100 from the connector section 100F3 via the coaxial cable 202S.
  • the single polarized wave signal input to the broadcast receiving apparatus 100 is demultiplexed and input to the channel selection/detection section 131C and the channel selection/detection section 131H, respectively.
  • The tuning/detection unit 131C performs tuning/detection processing on the broadcast waves of the current terrestrial digital broadcasting service, and the tuning/detection unit 131H performs tuning/detection processing on the broadcast waves of the advanced terrestrial digital broadcasting service.
  • With this configuration, when both the current terrestrial digital broadcasting service and the advanced terrestrial digital broadcasting service are provided, it becomes possible to receive both services simultaneously. In particular, efficient processing becomes possible in the channel setting section and the like.
  • Note that the current terrestrial digital broadcasting service and the advanced terrestrial digital broadcasting service may be transmitted using the same physical channel or using different physical channels. Further, the two services may or may not be a simulcast pair.
  • Although FIG. 7L is an example of receiving an advanced terrestrial digital broadcasting service using the single-polarization transmission method, a similar configuration can also be applied to receiving an advanced terrestrial digital broadcasting service using the dual-polarization transmission method.
  • In that case, the dual-polarization signal received by the antenna 200T, which is a dual-polarization receiving antenna, and input to the broadcast receiving apparatus 100 from the connector 100F3 via the converter 201T is demultiplexed and input to the respective channel selection/detection sections. The tuning/detection unit 131C performs tuning/detection processing on the broadcast waves of the current terrestrial digital broadcasting service transmitted by either the horizontally polarized signal or the vertically polarized signal, while the tuning/detection unit 131H and the tuning/detection unit 131V perform tuning/detection processing on the broadcast waves of the advanced terrestrial digital broadcasting service transmitted as horizontally polarized and vertically polarized signals.
  • the hierarchical division multiplex transmission system according to the embodiment of the present invention is a system that shares some specifications with the current terrestrial digital broadcasting system.
  • the broadcast waves of a 4K broadcast service with a low signal level are multiplexed and transmitted on the same channel as the broadcast waves of the current 2K broadcast service.
  • 2K broadcasting is received as before, because the reception level of the 4K broadcasting is suppressed below the required C/N of the 2K broadcasting.
  • For 4K broadcasting, transmission capacity is expanded through techniques such as multi-level modulation, and reception is performed by canceling the 2K broadcast waves and receiving the remaining 4K broadcast waves using reception technology compatible with LDM (layer division multiplexing) technology.
  • FIG. 8A shows an example of a hierarchical division multiplexing transmission system in the advanced digital terrestrial broadcasting service according to the embodiment of the present invention.
  • the upper layer is made up of modulated waves of current 2K broadcasting
  • the lower layer is made up of modulated waves of 4K broadcasting
  • the upper layer and lower layer are multiplexed and output as a composite wave in the same frequency band.
  • the upper layer may use 64QAM or the like as a modulation method
  • the lower layer may use 256QAM or the like as a modulation method.
  • The 2K broadcast program transmitted using the upper layer and the 4K broadcast program transmitted using the lower layer may be a simulcast that transmits the same content at different resolutions, or they may be broadcast programs with different content.
  • the upper layer is transmitted with high power
  • the lower layer is transmitted with low power.
  • the difference (difference in power) between the modulated wave level of the upper layer and the modulated wave level of the lower layer is called an injection level (IL).
  • the injection level is a value set by the broadcasting station.
  • The injection level is generally expressed as a relative ratio (dB) of the difference in modulated wave level (difference in power), in logarithmic form.
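As a worked example of this definition, the injection level can be computed as the logarithmic power ratio of the two layers; the wattage figures below are illustrative, not values from the specification.

```python
import math

# Illustrative sketch of the injection-level definition above: IL is the
# power ratio between the upper-layer (2K) and lower-layer (4K)
# modulated waves, expressed in dB.
def injection_level_db(upper_power_w, lower_power_w):
    """Return the injection level in dB for the given layer powers."""
    return 10.0 * math.log10(upper_power_w / lower_power_w)

# Example: a lower layer transmitted 20 dB below the upper layer.
il = injection_level_db(10.0, 0.1)
```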
  • FIG. 8B shows an example of the configuration of a broadcasting system for an advanced digital terrestrial broadcasting service using a hierarchical division multiplex transmission system according to an embodiment of the present invention.
  • The configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the hierarchical division multiplex transmission method is basically the same as the configuration of the broadcasting system shown in FIG. 1, except that the radio tower 300L, which is equipment of the broadcasting station, is a transmitting antenna that sends out a broadcast signal in which the 2K broadcasting of the upper layer and the 4K broadcasting of the lower layer are multiplexed.
  • As for the broadcast receiving apparatus 100, only the channel selection/detection section 131L of the third tuner/demodulation section 130L is extracted and described, and the description of other operating sections is omitted.
  • the broadcast signal received by the antenna 200L is input to the channel selection/detection unit 131L from the connector unit 100F4 via the converter 201L and the coaxial cable 202L.
  • The converter 201L may perform frequency conversion amplification processing on the broadcast signal.
  • When the antenna 200L is installed on the roof of an apartment building or the like and the broadcast signal is transmitted to the broadcast receiving device 100 in each room through the long coaxial cable 202L, the broadcast signal is attenuated, and there is a possibility that the channel selection/detection section 131L cannot correctly receive the 4K broadcast waves of the lower layer.
  • the conversion unit 201L performs frequency conversion and amplification processing on the 4K broadcast signal of the lower layer.
  • Frequency conversion amplification processing changes the frequency band of the lower-layer 4K broadcast signal from, for example, the 470 to 710 MHz band (corresponding to UHF channels 13 to 52) to the 770 to 1010 MHz band, which exceeds the band corresponding to UHF channel 62.
  • processing is performed to amplify the 4K broadcast signal in the lower hierarchy to a signal level where the influence of cable attenuation is not a problem.
  • In this case, the tuning/detection section included in the third tuner/demodulation section 130L of the broadcast receiving apparatus 100 may be configured such that the tuning/detection section 131L1 performs processing such as tuning/detection on the modulated wave of the upper layer (2K broadcasting), and the tuning/detection section 131L2 performs processing such as tuning/detection on the modulated wave of the lower layer (4K broadcasting).
  • It is preferable to set the frequency conversion so that the frequency band after conversion by the frequency conversion amplification processing is in the 710 to 1032 MHz range, which exceeds the band corresponding to UHF channel 52, or in the 770 to 1032 MHz range, which exceeds the band corresponding to UHF channel 62 (in consideration of retransmission by cable television stations and the like), and so that the gap between the frequency band before conversion and the frequency band after conversion is an integral multiple of the bandwidth of one physical channel (6 MHz). Since this is the same as explained elsewhere in this embodiment, repeated explanation is omitted.
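The alignment condition described above can be checked mechanically. The function below is a sketch of that check, with only the 6 MHz channel bandwidth taken from the text; the function name and interface are illustrative.

```python
# Illustrative sketch: the gap between the original band and the
# converted band should be a positive integer multiple of one physical
# channel (6 MHz), so that channel boundaries stay aligned.
CH_BW_MHZ = 6

def valid_conversion(orig_low_mhz, conv_low_mhz):
    """Check the channel-alignment constraint on a frequency conversion."""
    offset = conv_low_mhz - orig_low_mhz
    return offset > 0 and offset % CH_BW_MHZ == 0

# 470 MHz -> 770 MHz: an offset of 300 MHz = 50 channels, so aligned.
ok = valid_conversion(470, 770)
```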
  • The broadcast receiving apparatus 100 of this embodiment can identify whether a received broadcast signal is a broadcast signal transmitted in the lower layer or a broadcast signal transmitted in the upper layer by using the upper/lower layer identification bit of the TMCC information explained in FIG. 5H. Furthermore, the broadcast receiving apparatus 100 of this embodiment can identify whether a received broadcast signal has undergone frequency conversion processing by using the frequency conversion process identification bit of the TMCC information. Furthermore, the broadcast receiving apparatus 100 of this embodiment can identify whether a received broadcast signal transmits a 4K program in the lower layer by using the 4K signal transmission layer identification bit of the TMCC information described in FIG. 5I.
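A sketch of this identification logic is shown below. The field names and bit values are hypothetical placeholders, since the actual TMCC bit layout is given in FIGs. 5H and 5I of the specification.

```python
# Illustrative sketch of interpreting three TMCC identification bits.
# Field names ("layer_id_bit", etc.) are hypothetical placeholders.
def classify_signal(tmcc):
    """Classify a received broadcast signal from its TMCC bits."""
    return {
        "layer": "lower" if tmcc["layer_id_bit"] else "upper",
        "frequency_converted": bool(tmcc["freq_conv_bit"]),
        "carries_4k": bool(tmcc["tx_4k_bit"]),
    }

# Example: a lower-layer signal, not frequency-converted, carrying 4K.
info = classify_signal({"layer_id_bit": 1, "freq_conv_bit": 0, "tx_4k_bit": 1})
```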
  • the channel selection/detection section 131L of the third tuner/demodulation section 130L of the broadcast receiving apparatus 100 has a reception function compatible with LDM (layer division multiplexing) technology. Therefore, the conversion unit 201L shown in FIG. 8B is not necessarily required between the antenna 200L and the broadcast receiving apparatus 100.
  • The broadcast waves of terrestrial digital broadcasting transmitted by the hierarchical division multiplexing transmission method described above can be received and reproduced by the third tuner/demodulator 130L of the broadcast receiving apparatus 100, as described above, but can also be received by the first tuner/demodulator 130C of the broadcast receiving apparatus 100.
  • When the first tuner/demodulator 130C receives the broadcast wave of the terrestrial digital broadcast, among the broadcast signals of the terrestrial digital broadcast, the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer of the current terrestrial digital broadcasting service are reproduced.
  • the broadcasting system of this embodiment is compatible with MPEG-2 TS, which is used in current digital terrestrial broadcasting services, as a media transport method for transmitting data such as video and audio.
  • the stream format transmitted by the OFDM transmission wave in Figure 4D (1) is MPEG-2 TS
  • the stream format transmitted in the layer where digital broadcasting services are transmitted is MPEG-2 TS.
  • the stream system obtained by demodulating the transmission wave in the first tuner/demodulator 130C of the broadcast receiving apparatus 100 in FIG. 2 is MPEG-2 TS.
  • the stream format corresponding to the layer in which the current digital terrestrial broadcasting service is transmitted is MPEG-2 TS.
  • MPEG-2 TS is characterized by multiplexing components such as video and audio that make up a program into one packet stream along with control signals and clocks.
  • MPEG-2 TS handles these, including the clock, as one packet stream, making it suitable for transmitting one content over one transmission path with guaranteed transmission quality, and it is used in many current digital broadcasting systems.
  • In a broadcasting system using MPEG-2 TS, it is also possible to realize two-way communication via a two-way network such as a fixed network/mobile network.
  • This makes MPEG-2 TS compatible with broadcast-communication cooperation systems in which digital broadcasting services are linked with functions that utilize a broadband network, such as acquisition of additional content via the broadband network, arithmetic processing in a server device, and presentation processing in cooperation with a mobile terminal device.
  • FIG. 9A shows an example of a protocol stack of a transmission signal in a broadcasting system using MPEG-2 TS.
  • In the MPEG-2 TS system, PSI, SI, and other control signals are transmitted in section format.
  • The control information of the MPEG-2 TS system mainly includes tables used for program sequence information and tables used for purposes other than program sequence information. Tables are transmitted in section format, and descriptors are placed within the tables.
  • FIG. 9B shows a list of tables used in the program sequence information of the MPEG-2 TS broadcasting system.
  • the table shown below is used as the table used in the program sequence information.
  • FIG. 9C shows a list of tables used for purposes other than program sequence information in the MPEG-2 TS broadcasting system.
  • the table shown below is used for purposes other than program sequence information.
  • ECM: Entitlement Control Message
  • EMM: Entitlement Management Message
  • DCT: Download Control Table
  • DLT: Download Table
  • DIT: Discontinuity Information Table
  • SIT: Selection Information Table
  • SDTT: Software Download Trigger Table
  • CDT: Common Data Table
  • DSM-CC: DSM-CC section
  • AIT: Application Information Table
  • DCM: Download Control Message
  • DMM: Download Management Message
  • FIGS. 9D, 9E, and 9F show a list of descriptors used in the program sequence information of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used in the program sequence information.
  • (1) Conditional Access Descriptor (2) Copyright Descriptor (3) Network Name Descriptor (4) Service List Descriptor (5) Stuffing Descriptor (6) Satellite Delivery System Descriptor (7) Terrestrial Delivery System Descriptor (8) Bouquet Name Descriptor (9) Service Descriptor (10) Country Availability Descriptor
  • (11) Linkage Descriptor (12) NVOD Reference Descriptor (13) Time Shifted Service Descriptor (14) Short Event Descriptor (15) Extended Event Descriptor (16) Time Shifted Event Descriptor (17) Component Descriptor (18) Mosaic Descriptor (19) Stream Identifier Descriptor (20) CA Identifier Descriptor
  • (31) Hyperlink Descriptor (32) Data Content Descriptor (33) Video Decode Control Descriptor (34) Basic Local Event Descriptor (35) Reference Descriptor (36) Node Relation Descriptor (37) Short Node Information Descriptor (38) STC Reference Descriptor (39) Partial Reception Descriptor (40) Series Descriptor
  • (41) Event Group Descriptor (42) SI Transmission Parameter Descriptor (43) Broadcaster Name Descriptor (44) Component Group Descriptor (45) SI Prime TS Descriptor (46) Board Information Descriptor (47) LDT Linkage Descriptor (48) Connected Transmission Descriptor (49) TS Information Descriptor (50) Extended Broadcaster Descriptor
  • FIG. 9G shows a list of descriptors used in other than program sequence information of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used for purposes other than program sequence information.
  • (1) Partial Transport Stream Descriptor (2) Network Identification Descriptor (3) Partial Transport Stream Time Descriptor (4) Download Content Descriptor (5) CA EMM TS Descriptor (CA_EMM_TS_Descriptor) (6) CA Contract Information Descriptor (7) CA Service Descriptor (8) Carousel Identifier Descriptor (9) Association Tag Descriptor (10) Deferred Association Tags Descriptor (11) Network Download Content Descriptor (12) Download Protection Descriptor (13) CA Startup Descriptor (14) Descriptor set by the operator
  • FIG. 9H shows a list of descriptors used in the INT of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used in INT. Note that the descriptors used in the above-mentioned program sequence information and descriptors used in other than the program sequence information are not used in INT.
  • (1) Target Smartcard Descriptor (2) Target IP Address Descriptor (3) Target IPv6 Address Descriptor (4) IP/MAC Platform Name Descriptor (5) IP/MAC Platform Provider Name Descriptor (6) IP/MAC Stream Location Descriptor (7) Descriptor set by the operator
  • FIG. 9I shows a list of descriptors used in the AIT of the MPEG-2 TS broadcasting system. In this embodiment, the following descriptors are used in the AIT. Note that the descriptors used in the above-mentioned program sequence information and the descriptors used other than in the program sequence information are not used in the AIT.
  • (1) Application Descriptor (2) Transport Protocol Descriptor (3) Simple Application Location Descriptor (4) Application Boundary and Permission Descriptor (5) Autostart Priority Descriptor (6) Cache Control Info Descriptor (7) Randomized Latency Descriptor (8) External Application Control Descriptor (9) Playback Application Descriptor (10) Simple Playback Application Location Descriptor (11) Application Expiration Descriptor (12) Descriptor set by the operator
  • the broadcasting system of this embodiment can also support the MMT system as a media transport system for transmitting data such as video and audio.
  • In the broadcasting system of this embodiment, the stream format transmitted in the layer where advanced terrestrial digital broadcasting services are transmitted is, in principle, the MMT format.
  • the stream format corresponding to the hierarchy in which advanced digital terrestrial broadcasting services are transmitted is, in principle, the MMT format.
  • the method of the stream obtained by demodulating the transmission wave in the fourth tuner/demodulator 130B is also the MMT method.
  • an MPEG-2 TS stream may be operated in an advanced terrestrial digital broadcasting service.
  • The MMT system is a newly developed media transport method, created because of the limitations of the MPEG-2 TS system in responding to recent changes in the environment surrounding content distribution, such as the diversification of content, of devices that use content, of transmission paths for distributing content, and of content storage environments.
  • the video and audio signals of the broadcast program are coded as MFU (Media Fragment Unit)/MPU (Media Processing Unit), placed on an MMTP (MMT Protocol) payload, converted into MMTP packets, and transmitted as IP packets.
  • data content and subtitle signals related to broadcast programs are also in MFU/MPU format, put on an MMTP payload, converted into MMTP packets, and transmitted as IP packets.
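The encapsulation chain described above (MFU/MPU onto an MMTP payload, then MMTP packets, then IP packets) can be sketched with simplified placeholder structures; these are not the exact ARIB/MMT packet syntax, and the field names, addresses, and packet ID are illustrative assumptions.

```python
from dataclasses import dataclass

# Illustrative sketch of the encapsulation chain described above.
@dataclass
class MMTPPacket:
    packet_id: int  # identifies the asset (video, audio, subtitles, ...)
    payload: bytes  # MFU/MPU fragment carried on the MMTP payload

@dataclass
class IPPacket:
    dst: str        # destination IP address
    udp_port: int   # UDP port (UDP/IP carriage assumed here)
    mmtp: MMTPPacket

def packetize(mfu_fragments, packet_id, dst="239.0.0.1", port=32000):
    """Wrap each MFU fragment into an MMTP packet, then an IP packet."""
    return [IPPacket(dst, port, MMTPPacket(packet_id, frag))
            for frag in mfu_fragments]

pkts = packetize([b"video-mfu-0", b"video-mfu-1"], packet_id=0x0101)
```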
  • The IP packets are transmitted using UDP/IP (User Datagram Protocol/Internet Protocol) or TCP/IP (Transmission Control Protocol/Internet Protocol), and a TLV multiplexing method may be used for efficient transmission of the IP packets.
  • FIG. 10A shows the MMT protocol stack in the broadcast transmission path. Further, FIG. 10B shows an MMT protocol stack in a communication line.
  • the MMT system provides a mechanism for transmitting two types of control information: MMT-SI and TLV-SI.
  • MMT-SI is control information indicating the structure of a broadcast program. It is in the format of an MMT control message, is placed on an MMTP payload, converted into an MMTP packet, and transmitted as an IP packet.
  • TLV-SI is control information regarding multiplexing of IP packets, and provides information for channel selection and correspondence information between IP addresses and services.
  • TLV-SI and MMT-SI are prepared as control information.
  • TLV-SI consists of tables and descriptors. Tables are transmitted in section format, and descriptors are placed within the tables.
  • MMT-SI consists of three layers: messages that store tables and descriptors, tables that have elements and attributes that indicate specific information, and descriptors that indicate more detailed information.
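This three-level nesting (messages carrying tables, tables carrying descriptors) can be illustrated with a toy structure; the particular message, table, and descriptor names below are examples, not a normative encoding.

```python
# Illustrative sketch of the three-level MMT-SI structure described
# above: a PA message carrying an MPT that contains one descriptor.
message = {
    "message_type": "PA",
    "tables": [
        {
            "table_type": "MPT",
            "descriptors": [{"tag": "MPU_Timestamp_Descriptor"}],
        }
    ],
}

def descriptor_tags(msg):
    """Collect the descriptor tags from every table in a message."""
    return [d["tag"] for t in msg["tables"] for d in t["descriptors"]]
```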
  • FIG. 10C shows a list of tables used in TLV-SI of the MMT broadcasting system.
  • the table shown below is used as the TLV-SI table.
  • tables having the same meaning as the tables shown in FIGS. 9B and 9C may be further used.
  • FIG. 10D shows a list of descriptors used in TLV-SI of the MMT broadcasting system.
  • the following is used as a TLV-SI descriptor.
  • descriptors having the same meaning as the descriptors shown in FIGS. 9D, 9E, 9F, 9G, 9H, and 9I may also be used.
  • (1) Service List Descriptor (2) Satellite Delivery System Descriptor (3) System Management Descriptor (4) Network Name Descriptor (5) Remote Control Key Descriptor (6) Descriptor set by the operator
  • FIG. 10E shows a list of messages used in MMT-SI of the MMT broadcasting system. In this embodiment, the following messages are used as MMT-SI messages.
  • (1) PA (Package Access) message (2) M2 section message (3) CA message (4) M2 short section message (5) Data transmission message (6) Message set by the operator
  • FIG. 10F shows a list of tables used in MMT-SI of the MMT broadcasting system.
  • the following MMT-SI table is used.
  • tables having the same meaning as the tables shown in FIGS. 9B and 9C may be further used.
  • MPT: MMT Package Table
  • PLT: Package List Table
  • LCT: Layer Control Table
  • ECM: Entitlement Control Message
  • EMM: Entitlement Management Message
  • CAT (MH)
  • DCM: Download Control Message
  • DMM: Download Management Message
  • MH-EIT: MH-Event Information Table
  • MH-AIT: MH-Application Information Table
  • FIGS. 10G, 10H, and 10I show a list of descriptors used in MMT-SI of the MMT broadcasting system. In this embodiment, the following is used as the MMT-SI descriptor. Furthermore, descriptors having the same meaning as the descriptors shown in FIGS. 9D, 9E, 9F, 9G, 9H, and 9I may also be used.
  • (1) Asset Group Descriptor (2) Event Package Descriptor (3) Background Color Descriptor (4) MPU Presentation Region Descriptor (5) MPU Timestamp Descriptor (6) Dependency Descriptor (7) Access Control Descriptor (8) Scrambler Descriptor (9) Message Authentication Method Descriptor (10) Emergency Information Descriptor
  • (11) MH-MPEG-4 Audio Descriptor (12) MH-MPEG-4 Audio Extension Descriptor (13) MH-HEVC Descriptor (14) MH-Linkage Descriptor (15) MH-Event Group Descriptor (16) MH-Service List Descriptor (17) MH-Short Event Descriptor (18) MH-Extended Event Descriptor (19) Video Component Descriptor (20) MH-Stream Identifier Descriptor
  • (41) MPU Extended Timestamp Descriptor (42) MPU Download Content Descriptor (43) MH-Network Download Content Descriptor (44) Application descriptor (MH-Application Descriptor) (45) MH-Transport Protocol Descriptor (46) MH-Simple Application Location Descriptor (47) Application Boundary and Permission Descriptor (MH-Application Boundary and Permission Descriptor) (48) MH-Autostart Priority Information Descriptor (MH-Autostart Priority Descriptor) (49) MH-Cache Control Info Descriptor (50) MH-Randomized Latency Descriptor
  • FIG. 10J shows the relationship between data transmission and typical tables in an MMT broadcasting system.
  • data can be transmitted through multiple routes, such as a TLV stream via a broadcast transmission path and an IP data flow via a communication line.
  • the TLV stream includes TLV-SI such as TLV-NIT and AMT, as well as an IP data flow, which is a data flow of IP packets.
  • the IP data flow includes a video asset including a series of video MPUs and an audio asset including a series of audio MPUs.
  • the IP data flow may include a subtitle asset including a series of subtitle MPUs, a text super asset including a series of text superimpose MPUs, a data asset including a series of data MPUs, and the like.
  • MPT MMT package table
  • the package ID and the asset ID of each asset included in the package may be written in association with each other in the MPT.
  • the assets that make up the package may be only those in the TLV stream, but as shown in FIG. 10J, they may also include assets transmitted in the IP data flow of the communication line.
  • This can be realized by including location information of each asset included in the package in the MPT, so that the broadcast receiving device 100 can grasp the reference destination of each asset.
  • the location information for each asset can specify data transmitted via various transmission routes, as follows: (1) data multiplexed in the same IP data flow as the MPT, (2) data multiplexed in an IPv4 data flow, (3) data multiplexed in an IPv6 data flow, (4) data multiplexed in a broadcast MPEG2-TS, (5) data multiplexed in MPEG2-TS format within an IP data flow, and (6) data at a specified URL.
  • the MMT broadcasting system also has the concept of an event.
  • An event is a concept indicating a so-called program, and is handled by the MH-EIT transmitted in an M2 section message.
  • a series of data included in the period of the duration from the start time stored in the MH-EIT is included in the concept of the event.
  • the MH-EIT can be used in the broadcast receiving device 100 for various types of processing on an event-by-event basis (for example, program guide generation processing, control of recording reservations and viewing reservations, and copyright management processing such as temporary storage).
  • the broadcast receiving apparatus 100 according to the embodiment of the present invention, which is compatible with terrestrial digital broadcasting (advanced terrestrial digital broadcasting, or advanced terrestrial digital broadcasting and current terrestrial digital broadcasting transmitted simultaneously on separate layers), needs to have a function of searching (scanning) all receivable channels at the reception point and creating a service list (receivable frequency table) based on the service ID.
  • MFN Multi Frequency Network
  • the broadcast receiving device 100 acquires the service list stored in the TLV-NIT, so there is no need to create a service list by scanning. Therefore, for advanced BS digital broadcasting or advanced CS digital broadcasting received by the fourth tuner/demodulator 130B, the initial scan and rescan described later are unnecessary.
  • the broadcast receiving apparatus 100 has a rescan function in preparation for the opening of a new station, installation of a new relay station, change of reception point of a television receiver, and the like.
  • the broadcast receiving apparatus 100 can notify the user to that effect.
  • FIG. 11A shows an example of an operation sequence of channel setting processing (initial scan/rescan) of the broadcast receiving apparatus 100 according to the embodiment of the present invention. Note that although the figure shows an example where MPEG-2 TS is adopted as the media transport method, the processing is basically the same when the MMT method is adopted.
  • the reception function control unit 1102 first sets the residential area (selects the area where the broadcast receiving device 100 is installed) based on the user's instruction (S101). At this time, instead of the user's instructions, the residential area may be automatically set based on the installation position information of the broadcast receiving device 100 acquired through predetermined processing.
  • the installation position information acquisition process information may be acquired from a network connected to the LAN communication unit 121, or information regarding the installation position may be acquired from an external device connected to the digital interface unit 125.
  • the initial value of the frequency range to be scanned is set in the tuner/demodulator (when the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L are not distinguished, they are referred to in this way; the same applies hereafter) (S102).
  • the tuner/demodulator executes tuning based on the instruction (S103), and if it succeeds in locking to the set frequency (S103: Yes), the process proceeds to S104. If the lock is not successful (S103: No), the process advances to S111. In the process of S104, the C/N is confirmed (S104), and if the C/N is greater than or equal to a predetermined value (S104: Yes), the process proceeds to S105 and a reception confirmation process is performed. If the C/N is not higher than the predetermined value (S104: No), the process proceeds to S111.
  • the reception function control unit 1102 first obtains the BER of the received broadcast wave (S105). Next, by acquiring and comparing the NIT, it is confirmed whether the NIT is valid data or not (S106). If the NIT acquired in the process of S106 is valid data, the reception function control unit 1102 acquires information such as the transport stream ID and original network ID from the NIT. Furthermore, distribution system information regarding the physical conditions of the broadcast transmission path corresponding to each transport stream ID/original network ID is acquired from the terrestrial distribution system descriptor. Additionally, a list of service IDs is acquired from the service list descriptor.
  • the reception function control unit 1102 checks the service list stored in the receiving device to confirm whether the transport stream ID acquired in the process of S106 has already been acquired (S107). If the transport stream ID acquired in the process of S106 has not already been acquired (S107: No), the various information acquired in the process of S106 is associated with the transport stream ID and added to the service list (S108). If the transport stream ID acquired in the process of S106 has already been acquired (S107: Yes), a comparison is executed between the BER acquired in the process of S105 and the BER at the time the transport stream ID already recorded in the service list was acquired (S109). As a result, if the BER acquired in S105 is better (S109: Yes), the service list is updated using the various information acquired in S106 (S110). If the BER acquired in S105 is not better (S109: No), the various information acquired in S106 is discarded.
  • the remote control key ID may be acquired from the TS information descriptor and the remote control key may be associated with a typical service for each transport stream. This process enables one-touch channel selection, which will be described later.
  • the reception function control unit 1102 confirms whether the current frequency setting is the final value of the frequency range to be scanned (S111). If the current frequency setting is not the final value of the frequency range to be scanned (S111: No), the frequency value set in the tuner/demodulator is increased (S112), and the processes of S103 to S110 are repeated. If the current frequency setting is the final value of the frequency range to be scanned (S111: Yes), the process advances to S113.
  • the service list created (added/updated) in the above process is presented to the user as a result of the channel setting process (S113). Further, if there is a duplication of remote control keys, the user may be notified of this fact and urged to change the remote control key settings (S114).
  • the service list created/updated through the above processing is stored in a nonvolatile memory such as the ROM 103 or the storage unit 110 of the broadcast receiving device 100.
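The scan loop of steps S102 to S112 described above can be sketched as follows. This is a simplified model, not the actual receiver implementation: the tuner-facing calls (tune_and_lock, measure_cn, measure_ber, acquire_nit) and the C/N threshold are hypothetical placeholders for the hardware operations and the "predetermined value" mentioned in the text.

```python
# Simplified sketch of the initial-scan loop (S102-S112) described above.
# All tuner-facing functions (tune_and_lock, measure_cn, measure_ber,
# acquire_nit) are hypothetical placeholders for hardware operations.

CN_THRESHOLD = 20.0  # dB; illustrative value, not from the description

def initial_scan(freqs, tuner, service_list):
    """Scan each candidate frequency and add/update service-list entries."""
    for freq in freqs:                        # S112: step through the range
        if not tuner.tune_and_lock(freq):     # S103: try to lock
            continue
        if tuner.measure_cn() < CN_THRESHOLD: # S104: C/N check
            continue
        ber = tuner.measure_ber()             # S105: bit error rate
        nit = tuner.acquire_nit()             # S106: NIT validity check
        if nit is None:
            continue
        ts_id = nit["transport_stream_id"]
        entry = {"freq": freq, "ber": ber, **nit}
        prev = service_list.get(ts_id)        # S107: already acquired?
        if prev is None:
            service_list[ts_id] = entry       # S108: add new entry
        elif ber < prev["ber"]:               # S109: lower BER is better
            service_list[ts_id] = entry       # S110: update entry
    return service_list
```

Keying the service list by transport stream ID mirrors the S107–S110 logic, where a duplicate ID is kept only if its BER is better.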
  • FIG. 11B shows an example of the data structure of NIT.
  • “transport_stream_id” corresponds to the above-mentioned transport stream ID
  • “original_network_id” corresponds to the original network ID.
  • FIG. 11C shows an example of the data structure of the terrestrial distribution system descriptor. “guard_interval”, “transmission_mode”, “frequency”, etc. in the figure correspond to the above-mentioned distribution system information.
  • FIG. 11D shows an example of the data structure of a service list descriptor.
  • “service_id” in the figure corresponds to the above-mentioned service ID.
  • FIG. 11E shows an example of the data structure of the TS information descriptor.
  • "remote_control_key_id” in the figure corresponds to the above-mentioned remote control key ID.
  • the broadcast receiving device 100 may control the frequency range to be scanned as described above so that it is changed as appropriate depending on the broadcast service to be received. For example, when the broadcast receiving device 100 is receiving broadcast waves of the current digital terrestrial broadcasting service, it is controlled to scan the frequency range of 470 MHz to 770 MHz (corresponding to physical channels 13 ch to 62 ch). That is, the initial value of the frequency range is set to 470 MHz to 476 MHz (center frequency 473 MHz), the final value of the frequency range is set to 764 MHz to 770 MHz (center frequency 767 MHz), and control is performed so that the frequency value is increased by +6 MHz in the process of S112.
  • On the other hand, when receiving broadcast waves to which the frequency conversion processing shown in FIG. 7D or the frequency conversion amplification processing shown in FIG. 8C is applied, control is performed to scan the frequency range of 470 MHz to 1010 MHz. That is, the initial value of the frequency range is set to 470 MHz to 476 MHz (center frequency 473 MHz), the final value of the frequency range is set to 1004 MHz to 1010 MHz (center frequency 1007 MHz), and the frequency value is increased by +6 MHz in the process of S112.
  • the broadcast receiving device 100 can control the selection of the frequency range to be scanned based on the system identification, frequency conversion processing identification, etc. of the TMCC information.
  • one of the channel selection/detection section 131H and the channel selection/detection section 131V may scan the frequency range of 470 MHz to 770 MHz, and the other may scan the frequency range of 770 MHz to 1010 MHz (when frequency conversion processing is applied to the polarized transmission wave).
  • the operation sequence of FIG. 11A may be advanced in parallel in both the channel selection/detection section 131H and the channel selection/detection section 131V while synchronizing the frequency-up loop of S112 in the operation sequence of FIG. 11A.
  • a configuration may be adopted in which the pair of the horizontally polarized signal and the vertically polarized signal transmitted on the same physical channel is received in parallel.
  • control information and the like inside the packet stream of the advanced terrestrial digital service transmitted as a pair of the horizontally polarized signal and the vertically polarized signal can then be decoded and acquired during the loop processing, which is preferable because scanning and creation of the service list proceed efficiently.
  • the broadcast receiving apparatus 100 has the configuration shown in FIG. 8B and has a so-called double tuner configuration in which a plurality of tuners/demodulators (channel selection/detection units) are further provided (for example, a plurality of third tuner/demodulators 130L are provided).
  • Alternatively, the configuration shown in FIG. 8D may be used; when receiving an advanced terrestrial digital broadcasting service using the hierarchical division multiplex transmission method, one of the double tuners may scan the frequency range of 470 MHz to 770 MHz and the other the frequency range of 770 MHz to 1010 MHz (when frequency conversion amplification processing is applied). By controlling in this manner, it is possible to reduce the time required for channel setting, as described above.
  • In the configuration shown in FIG. 8B, the terrestrial digital broadcasting service transmitted in either the upper layer or the lower layer is the current terrestrial digital broadcasting service. Therefore, for example, of the frequency range of 470 MHz to 770 MHz and the frequency range of 770 MHz to 1010 MHz, the first tuner/demodulator 130C may scan the frequency range in which the current digital terrestrial broadcasting service is transmitted, while the third tuner/demodulator 130L scans the other frequency range in parallel. In this case as well, it is possible to reduce the time required for channel setting, similar to the above-described parallel scanning using the double tuner of the third tuner/demodulator 130L.
  • Identification is possible by having the third tuner/demodulator 130L tune to two points in total, one in each frequency range, for example 470 MHz to 476 MHz (center frequency 473 MHz) and 770 MHz to 776 MHz (center frequency 773 MHz), acquiring the TMCC information transmitted on each frequency, and referring to parameters (for example, the system identification parameter) stored in the TMCC information.
  • both horizontally polarized signals and vertically polarized signals are transmitted.
  • a channel carrying a broadcast program transmitted using both polarizations may be written in the service list. In addition, in the case of the 2K broadcast program on the B layer shown in the same figure, if the same broadcast program is transmitted on the B layer of the horizontally polarized signal and the B layer of the vertically polarized signal, even if the same transport stream ID is detected, it is sufficient to store it in the service list as one channel.
  • the broadcast receiving device 100 has program selection functions such as one-touch tuning using the one-touch keys on the remote controller, channel up/down tuning using the channel up/down key on the remote controller, and direct tuning by directly inputting a 3-digit number. Any of these channel selection functions may be performed using the information stored in the service list generated by the above-mentioned initial scan/rescan. In addition, after a channel is selected, information on the selected channel is displayed on a banner or the like.
  • <Direct channel selection processing example> (1) When direct channel selection is selected, the system waits for input of a 3-digit number. (2-1) If the input of the 3-digit number is not completed within a predetermined time (about 5 seconds), the device returns to the normal mode and displays the channel information of the currently selected service. (2-2) When the input of the 3-digit number is completed, it is determined whether the channel exists in the service list of the receivable frequency table; if it does not exist, a message such as "This channel does not exist" is displayed. (3) If the channel exists, channel selection processing is performed, last mode is set, and channel information is displayed after channel selection.
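The steps (1) to (3) above can be modeled as a small state check. This is an illustrative sketch; the function name, return values, and service-list structure are assumptions, and only the 3-digit completeness test, the roughly 5-second timeout, and the service-list lookup follow the text.

```python
# Sketch of the direct-channel-selection steps (1)-(3) above.
# INPUT_TIMEOUT follows the "about 5 seconds" in the text; the service-list
# structure and return messages are illustrative, not from the description.

INPUT_TIMEOUT = 5.0  # seconds

def direct_select(digits, elapsed, service_list, current_channel):
    """digits: keys entered so far; elapsed: seconds since input started."""
    # (2-1) incomplete 3-digit input, or timeout: return to normal mode
    if len(digits) < 3 or elapsed > INPUT_TIMEOUT:
        return ("show_current", current_channel)
    number = "".join(digits)
    # (2-2) check the receivable-frequency-table service list
    if number not in service_list:
        return ("message", "This channel does not exist")
    # (3) select the channel, set last mode, show channel info
    return ("selected", service_list[number])
```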
  • the channel selection operation is performed based on SI, and if it is determined that broadcasting is suspended, it may also have a function to display and notify the user of this fact.
  • FIG. 12A shows an example of an external view of a remote controller used to input operation instructions to the broadcast receiving apparatus 100 according to the embodiment of the present invention.
  • the remote control 180R includes a power key 180R1 for powering on/off (standby on/off) the broadcast receiving device 100, and cursor keys (up, down, left, right) 180R2 for moving the cursor up, down, left and right. , a determination key 180R3 for determining the item at the cursor position as a selection item, and a return key 180R4.
  • the remote control 180R also includes network switching keys (advanced terrestrial digital, terrestrial digital, advanced BS, BS, CS) 180R5 for switching the broadcast network received by the broadcast receiving apparatus 100.
  • the remote control 180R has one-touch keys (1 to 12) 180R6 used for one-touch tuning, a channel up/down key 180R7 used for channel up/down tuning, and a numeric keypad (ten-key) used to input the 3-digit number for direct tuning. In the example shown in the figure, the ten-key is also used as the one-touch keys 180R6, and during direct channel selection a 3-digit number can be input by operating the one-touch keys 180R6 after pressing the direct key 180R8.
  • the remote control 180R also includes an EPG key 180R9 for displaying a program guide and a menu key 180RA for displaying a system menu.
  • the program guide and system menu can be operated in detail using the cursor key 180R2, enter key 180R3, and return key 180R4.
  • the remote control 180R includes a d key 180RB used for data broadcasting services and multimedia services, a cooperation key 180RC for displaying a list of broadcasting/communication cooperation services and their compatible applications, and color keys (blue, red, green, yellow) 180RD.
  • For data broadcasting services, multimedia services, broadcasting/communication cooperation services, etc., detailed operations are possible using the cursor key 180R2, enter key 180R3, return key 180R4, and color keys 180RD.
  • the remote control 180R also has a video key 180RE for selecting related video, an audio key 180RF for switching the audio ES and bilingual audio, and a subtitle key 180RG for switching subtitles on/off and switching the subtitle language.
  • the remote controller 180R also includes a volume key 180RH for increasing/decreasing the volume of audio output, and a mute key 180RI for switching on/off the audio output.
  • the remote control 180R of the broadcast receiving apparatus 100 includes an "advanced terrestrial digital key", "terrestrial digital key", "advanced BS key", "BS key", and "CS key" as the network switching keys 180R5.
  • The "advanced terrestrial digital key" and "terrestrial digital key" are used for the advanced terrestrial digital broadcasting service; for example, when a 4K broadcast program and a 2K broadcast program are simultaneously broadcast in different layers, pressing the "advanced terrestrial digital key" gives priority to selecting the 4K broadcast program when selecting a channel, and pressing the "terrestrial digital key" gives priority to selecting the 2K broadcast program when selecting a channel.
  • when performing channel selection by one-touch tuning, channel up/down tuning, direct tuning, etc., the broadcast receiving device 100 has a function of displaying information on the selected channel by means of a banner display or the like.
  • FIG. 12B shows an example of a banner display when selecting a channel.
  • Banner display 192A1 is an example of the banner displayed when a 2K broadcast program is selected. For example, the program name, program start time/end time, network type, remote control direct channel selection key number, service logo, and 3-digit channel number may be displayed.
  • the banner display 192A2 is an example of the banner displayed when a 4K broadcast program is selected. In this case, a mark symbolizing "advanced" is also displayed.
  • When down-conversion or downmix processing has been performed, a display indicating this may also be shown. In the example of the banner display 192A2, for example, it is indicated that down-conversion processing from UHD resolution to HD resolution and downmix processing from 22.2ch to 5.1ch have been performed.
  • As described above, it becomes possible to provide transmission technology and reception technology for a more sophisticated advanced digital broadcasting service that takes into consideration compatibility with the current digital broadcasting service. That is, it is possible to provide a technique for more suitably transmitting or receiving advanced digital broadcasting services.
  • Example 2 [Advanced audio signal]
  • the audio signal in current systems is a channel-based signal that corresponds to a speaker.
  • Channel-based signals include those of 5.1ch and those of 22.2ch (here, "ch” is an abbreviation for "channel”).
  • audio signals including object-based signals and HOA (Higher Order Ambisonics) signals are handled.
  • An object-based signal is an audio signal, such as the voice of a narrator, whose playback position can be changed by the receiver, for example by placing it on the right or left side.
  • the playback position is not fixed and may be changed dynamically.
  • the HOA method signal expresses the sound field as a sum of spherical harmonics; since there is an upper limit to the transmission capacity, the expansion is used up to a finite order. Since the channel-based signal is basically recorded at microphone positions corresponding to a standard speaker arrangement, it is suitable for audio reproduction using a group of speakers arranged at or near the standard arrangement. On the other hand, the HOA method records spatial sound field information independently of a specific speaker arrangement, so it is suitable for supporting any speaker arrangement.
  • Examples of standard speaker placement are shown in FIGS. 13A, 13B, and 13C.
  • the speaker group is divided into three groups, upper layer, middle layer, and lower layer, depending on the height of the installation position.
  • the arrangement of each group is as shown in FIGS. 13B and 13C.
  • FIG. 13B shows the arrangement of a 22.2ch speaker system
  • FIG. 13C shows the arrangement of a 7.1ch speaker system.
  • the number after the decimal point in the channel-number notation is the number of channels of the low-frequency signal
  • the corresponding speakers are LFE1, LFE2, and LFE.
  • the other signal channels are called main channels.
  • a 5.1ch speaker system is obtained by removing the upper layer speaker from the 7.1ch speaker system.
  • In the current system, if the speaker system has the same number of speakers as the number of channels of the channel-based audio signal, the signal is played back as is; if the numbers differ, the format is converted to match the number of speakers in the speaker system before playback. In particular, when the number of speakers is smaller than the number of audio signal channels, this format conversion is called downmixing. Format conversion is also performed when the speaker positions assumed when the audio signal was created differ from the actual speaker positions. The sound to be output from each actual speaker position is synthesized by weighting and adding the audio signals of the channels.
  • If each speaker is placed at the same distance from the expected standard viewing position of the viewer, there is no need to adjust the playback timing; however, if the actual speakers are not placed at equal distances from the viewing position, the playback timing may also be adjusted.
  • the format conversion is as shown in Equation 1 below:
  • p^(ch)_m(t) = Σ_n g^(ch)_mn · s^(ch)_n(t − Δt_m) (Equation 1)
  • s^(ch)_n(t) is a channel-based audio signal transmitted by broadcasting/communication
  • n is a signal number
  • the number of channel-based signals is N^(ch).
  • t is time.
  • p^(ch)_m(t) is an audio signal input to a speaker
  • m is a speaker number
  • the number of speakers is M.
  • g^(ch)_mn is a weighting coefficient for the channel-based signal.
  • Δt_m is the delay adjustment time according to the deviation from the distance R_o between the speaker farthest from the standard viewing position and the standard viewing position.
  • the weighting coefficient g^(ch)_mn when the speakers are not equidistant may be corrected to (R_m/R_o) times the weighting coefficient g^(ch)_mn when the speakers are equidistant, so that the volume can also be balanced between the speakers.
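The weighted-sum conversion of Equation 1, together with the (R_m/R_o) correction just described, can be sketched numerically as follows. The sample-based delay handling is an illustrative discretization and not part of the description.

```python
# Numerical sketch of the channel-based format conversion of Equation 1:
# p_m(t) = sum_n g_mn * s_n(t - dt_m), with the optional (R_m / R_o) gain
# correction for non-equidistant speakers. Sample-based delays are an
# illustrative discretization, not part of the description.

def format_convert(signals, gains, delays_smp, r=None, r_o=None):
    """signals: list of N channel signals (lists of samples);
    gains: M x N weighting matrix g_mn; delays_smp: per-speaker delay in
    samples; r / r_o: optional speaker distances for gain correction."""
    n_samples = len(signals[0])
    out = []
    for m, row in enumerate(gains):
        corr = (r[m] / r_o) if r and r_o else 1.0  # (R_m / R_o) correction
        ch = []
        for t in range(n_samples):
            td = t - delays_smp[m]                 # t - dt_m
            acc = 0.0
            if 0 <= td < n_samples:
                for n, g in enumerate(row):
                    acc += corr * g * signals[n][td]
            ch.append(acc)
        out.append(ch)
    return out
```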
  • For object-based signals and HOA-method signals as well, the transmitted signals are weighted and summed, similarly to the format conversion formula for channel-based signals, and are converted into signals input to the speakers, as shown in Equations 3 and 4.
  • the meanings of the symbols are the same as those for channel-based signals; the superscript (ch) indicates a symbol for a channel-based signal, (obj) indicates a symbol for an object-based signal, and (HOA) indicates that the signal corresponds to the HOA system signal.
  • the audio signal p_m(t) input to the speaker system is given by the following Equation 5: p_m(t) = p^(ch)_m(t) + p^(obj)_m(t) + p^(HOA)_m(t) (Equation 5)
  • the weighting coefficient g^(*)_mn is determined by the relationship between the speaker arrangement and the standard viewing position, but the weighting coefficient g^(obj)_mn for an object-based signal is determined by further taking into account the playback position of each individual object. Note that content common to all signals is expressed using the superscript (*).
  • FIG. 14A shows the positional relationship with the broadcast receiving apparatus 100 when headphones are used.
  • the listening position is the midpoint of the line segment connecting the left and right audio output sections of the headphones.
  • the sound field created by the audio signal is based on a reference coordinate system in which the center direction of the receiver screen is the reference direction.
  • the audio output section of the headphones changes its position within this reference coordinate system due to the rotation of the user's head (FIG. 14B). Therefore, the weighting coefficient g^(*)_mn for synthesizing the input signals to the audio output section of the headphones is calculated by taking into account the position of the audio output section of the headphones within the reference coordinate system at that time.
  • the position of the audio output section of the headphones is determined by, for example, recording the position where the user is facing the center of the receiver screen based on user input, and detecting subsequent changes in the direction of the user's face using a gyro sensor etc. installed in the headphones. It can be obtained by Note that although FIG. 14B shows the arrangement on a plane, the position in the height direction may be considered.
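A minimal way to apply the head-direction tracking described above is to offset each source azimuth in the reference coordinate system by the head yaw reported by the gyro sensor; the weighting coefficients are then computed from these effective azimuths. The planar model below is a sketch under that assumption, ignoring the height direction as FIG. 14B does.

```python
# Planar sketch of the head-rotation compensation described above: the
# effective azimuth of each sound source, as seen from the headphones,
# is the source azimuth in the reference frame minus the head yaw
# reported by the gyro sensor. Height is ignored, as in FIG. 14B.

def effective_azimuths(source_azimuths_deg, head_yaw_deg):
    """Return each source azimuth relative to the rotated head,
    normalized to the range (-180, 180] degrees."""
    out = []
    for az in source_azimuths_deg:
        rel = (az - head_yaw_deg) % 360.0
        if rel > 180.0:
            rel -= 360.0
        out.append(rel)
    return out
```

The receiver would recompute the headphone weighting coefficients from these effective azimuths each time the sensor reports a new head direction.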
  • FIG. 15A shows a configuration example of the audio decoder 10000 when the audio signal to be transmitted is only a channel base signal.
  • a core decoder 10001 decodes an audio bitstream multiplexed and transmitted through broadcasting or communication into signals for each channel.
  • the format converter 10002 performs the above format conversion and outputs an audio signal for speakers and an audio signal for headphones. Output to external equipment may be performed wirelessly.
  • FIG. 15B is a configuration example of an audio decoder 10100 that supports advanced audio signals.
  • the core decoder 10101 decodes an audio bitstream that has been multiplexed and transmitted through broadcasting or communication into individual signals.
  • each signal is a channel base signal, an object base signal, and an HOA method signal. Even in the case of advanced audio signals, output to external equipment may be performed wirelessly.
  • the channel base signal is converted by the format converter 10102 into a signal for each speaker according to Equation 1 according to the speaker arrangement.
  • the channel base signal is also converted to a signal for headphones.
  • For this conversion, speaker arrangement information stored in the receiver is used.
  • FIG. 16 shows an example of speaker arrangement information.
  • the placement information consists of, for each number that distinguishes a speaker, the speaker type (main channel or low-frequency channel), the azimuth position, the height position (elevation or depression angle), and the distance from the viewer's head position.
  • the azimuth position is defined with the front direction as 0° when viewed from the viewing position; a positive value represents an angular position to the left, and a negative value an angular position to the right.
  • the position in the height direction is defined as 0° in the horizontal direction when viewed from the position of the viewer's head; a positive value represents an elevation angle, and a negative value represents a depression angle.
  • the above-mentioned weighting coefficient g^(ch)_mn is set based on this information and the configuration of the channel-based signal. Further, the delay time adjustment Δt_m is performed based on the distance information; if there is no distance information, the delay time adjustment is not performed.
  • Although the speaker arrangement information here is expressed in polar coordinates, it may also be expressed in orthogonal coordinates.
  • This speaker placement information may be standard placement information such as a 5.1ch speaker system, or may be speaker placement information specific to the receiver. Alternatively, the location information of the speaker system customized by the receiver user may be used. At this time, the user-customized arrangement information is registered before viewing the program, so that the user can set which arrangement information is to be used. Furthermore, it may be possible to switch between the speaker system provided in the receiver and the speaker system customized by the user. Furthermore, the speaker system used for each program may be reserved. Alternatively, a speaker system may be set for each type of program, each time slot, and each viewer. It becomes possible to reproduce audio according to the program content and the viewing environment at the time, improving convenience for the user.
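The placement information of FIG. 16 can be held per speaker as a simple record, one entry per distinguishing number. The field names below, and the 5.1ch azimuth values used as an example, are illustrative assumptions (conventional 5.1ch angles), not values taken from FIG. 16.

```python
# Illustrative representation of the speaker placement information of
# FIG. 16: per speaker, the type (main/LFE), azimuth (deg, +left/-right),
# elevation (deg, +up/-down), and distance from the viewer's head (m).
# The 5.1ch angles below are conventional values, not taken from FIG. 16.

from dataclasses import dataclass

@dataclass
class SpeakerInfo:
    number: int
    kind: str         # "main" or "lfe"
    azimuth: float    # degrees; positive = left of front, negative = right
    elevation: float  # degrees; positive = elevation, negative = depression
    distance: float   # metres from the viewer's head position

LAYOUT_5_1 = [
    SpeakerInfo(0, "main", 0.0, 0.0, 2.0),     # centre
    SpeakerInfo(1, "main", 30.0, 0.0, 2.0),    # front left
    SpeakerInfo(2, "main", -30.0, 0.0, 2.0),   # front right
    SpeakerInfo(3, "main", 110.0, 0.0, 2.0),   # surround left
    SpeakerInfo(4, "main", -110.0, 0.0, 2.0),  # surround right
    SpeakerInfo(5, "lfe", 0.0, 0.0, 2.0),      # LFE
]

def max_delay_gap(layout):
    """Delay spread implied by unequal distances (speed of sound 343 m/s);
    zero means no per-speaker timing adjustment is needed."""
    dists = [s.distance for s in layout]
    return (max(dists) - min(dists)) / 343.0
```

Such records could back either the receiver's built-in placement tables or the user-customized arrangement information mentioned above.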
  • the above g1, g2, g3, g4, g5, and g6 are weighting coefficients (downmix coefficients), and default values are shown in FIG. 17A.
  • This downmix coefficient is transmitted as metadata of the audio signal, but the default value is used until it is received.
  • This conversion formula and the default values of the weighting coefficients are the same as those used in a system that handles audio signals with only channel-based signals.
  • When converting an audio signal with more channels than 5.1ch to a signal to be output to a 2ch speaker system, the signal is first downmixed to a 5.1ch signal and then downmixed to a 2ch signal.
  • An example of a conversion formula for downmixing from 5.1ch to 2ch is shown below.
  • Lt' = L + g7*C + g8*Ls (Formula 12)
  • Rt' = R + g7*C + g8*Rs (Formula 13)
  • the above g7 and g8 are weighting coefficients (downmix coefficients), and default values are shown in FIG. 17B. This downmix coefficient is transmitted as metadata of the audio signal, but the default value is used until it is received.
  • This conversion formula and the default values of the weighting coefficients are the same as those used in a system that handles audio signals with only channel-based signals.
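  • The downmix of Formulas 12 and 13 can be sketched as follows; the actual default values of g7 and g8 are those shown in FIG. 17B (and may be overridden by audio metadata), so the value 1/√2 used below is only an illustrative placeholder, not the standardized default.

```python
import math

def downmix_5_1_to_2ch(L, R, C, Ls, Rs,
                       g7=1 / math.sqrt(2), g8=1 / math.sqrt(2)):
    """Downmix one sample of a 5.1ch signal (LFE omitted, as in
    Formulas 12 and 13) to a 2ch signal.  g7 and g8 are the downmix
    coefficients; the defaults here are placeholders for illustration."""
    Lt = L + g7 * C + g8 * Ls   # Formula 12
    Rt = R + g7 * C + g8 * Rs   # Formula 13
    return Lt, Rt

Lt, Rt = downmix_5_1_to_2ch(L=0.5, R=0.5, C=0.2, Ls=0.1, Rs=0.1)
```

  • Until the downmix coefficients arrive as metadata, a receiver would call this with the default arguments; once metadata is received, the transmitted g7 and g8 are passed in instead.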
  • By sharing processing between systems it is also possible to share signal processing units, which leads to a reduction in the overall system size in the case of a shared receiver.
  • the speaker systems used include a built-in speaker built into the receiver, an external speaker connected by wire, and an external speaker connected wirelessly.
  • the speaker to be used may be selected by remote control operation (for example, selection using arrow buttons) or by a linked terminal such as a smart phone.
  • FIG. 18 shows an example of a selection menu for speaker settings.
  • FIG. 18 shows that the "external speaker 1" system is selected.
  • the connection with the external speaker system may be a combination of wired connection and wireless connection.
  • the user definition is a speaker system that combines speakers of each system. For example, it is a system that combines a built-in speaker with an external speaker for expansion. By preparing a selection menu, the optimal speaker system can be easily selected according to the viewer's preference and viewing environment at the time, improving convenience for the viewer.
  • the arrangement information shown in FIG. 16 is required.
  • This placement information may be input by the viewer himself or may be downloaded from the site of the receiver manufacturer or speaker manufacturer.
  • This arrangement information may be downloaded by the receiver upon receiving identification information such as the model number of the speaker system.
  • placement information recorded on the speaker body may be transmitted to the receiver.
  • the arrangement information may be created and corrected by measuring the actual arrangement state of the speakers through a cooperative operation between the receiver and the speakers.
  • the arrangement state may be measured using, for example, a distance measuring device such as a camera or UWB (Ultra Wideband) provided in the receiver or the speaker or both.
  • the weighting coefficients used for format conversion may be provided externally. This weighting coefficient may be inputted by the user himself, inputted into the receiver through communication from the speaker system, or acquired by the receiver from the server.
  • the speaker system may have an audio conversion function.
  • the receiver may adjust the signal output to the speaker system in response to a request from the speaker system. Adjustments in the signals to be output include, for example, adjustments to the number of channels of channel base signals, the number of object base signals, the number of HOA system signals, the range of metadata necessary for their reproduction, and the like. If there is a limit to the signals that can be output depending on the program (for example, the number of channels), the output signal may be adjusted for each program.
  • If the external speaker system also includes an audio bitstream decoder, the audio bitstream may be output to the externally connected speaker system through the bitstream output controller 10106 shown in FIG. 15B. By properly adjusting the output signal, correct operation of the audio output is ensured.
  • the object base signal is a signal for each sound source, and is separated from the bitstream signal into a signal for each sound source by the core decoder 10101.
  • the object renderer 10103 calculates an output signal for each speaker based on the accompanying playback position information.
  • the actual placement information of the speaker system is also taken into consideration.
  • Alternatively, after first converting to a standard configuration such as a 22.2ch signal, the output signal to the speaker system may be calculated. This allows some processing to be shared.
  • weighting coefficients for calculating the output signal to the speaker may be given from the outside.
  • the weighting coefficients may be set by the user, may be obtained through communication with the speaker system, or may be obtained from the server. At this time, since the weighting coefficients vary depending on the sound source position, they are given in a table format or a function format.
  • If the 22.2ch signal that takes the sound source position into account is processed within the receiver, it is also possible to provide only the conversion coefficients for converting the 22.2ch signal into the external speaker signal.
  • Object-based signals are divided into those that allow the viewer to specify the playback position of the sound source and those that do not.
  • This position designation and whether or not the designated position can be changed are transmitted as metadata for each sound source, and the receiver changes processing depending on the designation of whether or not the designated position can be changed.
  • a parameter is transmitted to the receiver that describes information as to whether or not the user is permitted to set the playback position for each object-based signal.
  • FIG. 19 is an example of metadata.
  • FIG. 19 shows a case where there are three types of sound sources: narration, vocals, and guitar.
  • the position of the narration voice can be changed, and a replacement sound source is also prepared.
  • Replacement sound sources are distinguished by sub-ID: sub-ID a is Japanese, b is English, and c is French.
  • the user instruction may be given for each program, or may be set to always change to a specific language.
  • the replacement signal may be transmitted using a general-purpose user area.
  • the default playback position of the narration audio may be set by the output level of each speaker of a standard speaker system, such as a 22.2ch system, or may be set as the direction as seen from the standard viewing position.
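  • A minimal sketch of the per-sound-source metadata along the lines of FIG. 19 (narration, vocals, guitar; position changeability; replacement sub-IDs). The field and function names are illustrative assumptions, not the actual bitstream syntax.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectSourceMeta:
    """Per-sound-source metadata in the style of FIG. 19."""
    source_id: int
    name: str
    position_changeable: bool                     # may the user move the playback position?
    replacement_sub_ids: dict = field(default_factory=dict)  # sub-ID -> language

# The three sound sources of the FIG. 19 example:
sources = [
    ObjectSourceMeta(1, "narration", True,
                     {"a": "Japanese", "b": "English", "c": "French"}),
    ObjectSourceMeta(2, "vocals", False),
    ObjectSourceMeta(3, "guitar", False),
]

def replacement_language(source, sub_id):
    """Look up a replacement sound source by sub-ID; returns None
    if the source has no such replacement."""
    return source.replacement_sub_ids.get(sub_id)
```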
  • Examples of playback position data are shown in FIGS. 20A, 20B, and 20C.
  • the playback position can be expressed in terms of speaker output levels (FIG. 20A), polar coordinates with the standard viewing position as the origin (FIG. 20B), orthogonal coordinates with the standard viewing position as the origin (FIG. 20C), and the like.
  • Although FIG. 20A shows the output levels of two speakers, the output levels of three or more speakers may be set.
  • the orthogonal coordinate display in FIG. 20C takes the direction parallel to the screen toward the right as the positive X-axis direction, the direction perpendicular to the screen from the origin toward the screen as the positive Y-axis direction, and the vertically upward direction as the positive Z-axis direction.
  • preset data 1 is the default playback position, and it is changed to other preset data according to a user instruction.
  • a configuration may be adopted in which a playback position defined by the user can be used in response to a user instruction.
  • FIG. 21 shows an example of selection of the playback position of the narration audio by the user.
  • preset positions are indicated by buttons and selected using the arrow keys on the remote control.
  • the example in FIG. 21 shows a state where the left position is selected.
  • the playback position of the narration audio may be selected in a far or near direction.
  • a narration sound corresponding to the button selection state may be played.
  • the volume of the narration may also be set independently. For example, on the narration volume setting screen, use the volume change button on the remote control to set the volume.
  • FIG. 22 shows an example of setting the narration position by the user.
  • the playback position is not selected from preset positions, but is set freely within a predetermined range.
  • the black circles in the figure represent the set positions, and the user adjusts the positions using the arrow buttons on the remote control.
  • the narration volume setting is the same as in FIG. 21.
  • the range that can be set by the user may be set for each type of sound source, and the display of the setting scale may be limited to the range that can be set by the user. Furthermore, if there is a possibility that the volume may become too loud depending on the playback position, the range of volume settings may also be limited. By being able to change the reproduction position and volume of the object-based signal as described above, it is possible to realize audio reproduction more suitable for the user.
  • FIG. 24A is an example of stream data in which the playback position is specified by speaker output levels, FIG. 24B is an example of stream data in which the playback position is specified in polar coordinates, and FIG. 24C is an example of stream data in which the playback position is specified in orthogonal coordinates.
  • the playback time on the stream data may be set arbitrarily or may be synchronized with the frame of the program image.
  • the volume may be adjusted for each sound source separately from the overall volume adjustment. In this way, by specifying the playback position and volume for each sound source, it is possible to achieve excellent sound image localization and to playback audio that matches the user's preferences.
  • the signal of the HOA system is a signal in which a sound field is expanded by spherical harmonic functions from the 0th order to a certain order.
  • the order n of the spherical harmonics is a non-negative integer, and there are 2n+1 spherical harmonics of order n. Therefore, the number of signals in the HOA method expanded by spherical harmonics up to order n is (n+1)².
  • the highest order of the spherical harmonics used for expansion and the number of HOA system signals are summarized in FIG. 25.
  • As can be seen from the figure, the number of HOA signals increases rapidly as the order increases. Since the number of channel base signals corresponding to the 22.2ch speaker system is 24, a 4th-order HOA system signal consisting of 25 signals has an amount of information comparable to this.
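  • The relationship between the highest expansion order and the number of HOA signals summarized in FIG. 25 can be expressed as follows (illustrative sketch; the function name is an assumption).

```python
def hoa_signal_count(max_order):
    """Number of HOA signals when the sound field is expanded by
    spherical harmonics from order 0 up to max_order.  There are
    2n + 1 harmonics of order n, so the total is (max_order + 1)**2."""
    return (max_order + 1) ** 2

# The count grows quadratically with the highest order used; a
# 4th-order expansion gives 25 signals, comparable to the 24
# channel base signals of a 22.2ch speaker system.
counts = {n: hoa_signal_count(n) for n in range(5)}  # {0: 1, 1: 4, 2: 9, 3: 16, 4: 25}
```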
  • the HOA method signal is first separated by the core decoder 10101 into signals expanded by spherical harmonics, and then converted into signals to be output to the speakers by the HOA dedicated decoder 10104 based on the speaker arrangement information.
  • This conversion may be performed by generating an output signal for a speaker system to be used directly, or by converting the signal into a signal for 22.2ch and then performing the conversion in accordance with the conversion of the channel base signal. This allows some processing to be shared.
  • weighting coefficients for calculating the output signal to the speaker may be given from the outside.
  • the weighting coefficients may be set by the user, may be obtained through communication with the speaker system, or may be obtained from the server.
  • the difference between this HOA system signal and the channel-based signal is the difference in the density of information in the viewing space.
  • the information density of the HOA system signal is isotropic and uniform, whereas the information density of the channel base signal is high in the front direction where many speakers are arranged.
  • there is no problem with audio playback using channel-based signals, but with a speaker system that differs significantly from the standard speaker arrangement, or with headphones whose orientation in space can change significantly, more suitable playback sound can be enjoyed by using HOA signals.
  • FIG. 26 shows an audio signal selection screen for each output device.
  • the channel base signal is expressed as "front-oriented type” and the HOA method signal is expressed as "omnidirectional type".
  • the channel base signal is selected as the audio signal output from the speaker
  • the HOA system signal is selected as the audio signal output from the headphones.
  • This setting may be performed for each program, or may be a common setting regardless of the program. Further, as a default setting, a channel base signal may be used for the speaker, and an HOA signal may be used for the headphone. Furthermore, if the signal set for the output device is not provided for the program, another signal may be used.
  • high-order signals may be transmitted via the Internet.
  • one method is to transmit signals from the 0th order to the 4th order by broadcasting, and to transmit the signals of the 5th order and above via the Internet. Even if the signal via the Internet is interrupted for some reason, it will not be a major problem because the audio will not be completely lost. Further, the signals via the Internet may be downloaded all at once to the receiver before the start of broadcasting. In this way, the influence of internet failures during viewing can be eliminated.
  • the HOA system signal is not transmitted during broadcasting, and the HOA system signal is transmitted only via the Internet. If the HOA signal is interrupted due to an Internet failure, the channel-based signal can be used instead, so that the audio will not be completely lost. In this case as well, the signals via the Internet may be downloaded all at once to the receiver before the start of broadcasting. In this way, the influence of internet failures during viewing can be eliminated.
  • the method of transmitting part of the audio signal via the Internet may be used for channel-based signals and object-based signals.
  • signals of 24 channels corresponding to 22.2ch may be transmitted by broadcasting, and additional channels may be transmitted over the Internet. This makes the information density more isotropic and can also be used as an alternative to HOA type signals.
  • object-based signal some of the signals for each of a plurality of sound sources may be transmitted over the Internet.
  • the audio output described above is based on the premise that audio is output to the system to be primarily used, whether it is an internal speaker system or an external speaker system. However, when multiple users use the system, some users may wish to hear a different sub-audio for themselves. In order to meet such a request, it is possible to add it to the main system and perform audio playback with another output device. For example, you can listen to secondary audio using your smartphone as a linked device.
  • the output device at this time may be selected as appropriate, such as a speaker, normal earphones, or open-air earphones.
  • FIG. 27 shows an example of selecting audio to be played back on the smart phone 10300, which is a linked device.
  • On represents playback, and off represents non-playback; they are switched according to the user's preference.
  • the overall audio is the audio of the channel-based signal
  • the individual audio is the audio of the object-based signal.
  • the overall sound is the sound of the audience, etc., and may be played back by a receiver or by a separate linked device, depending on the user's preference.
  • the selectable audio settings are not limited to this example, and may be selected as appropriate. By being able to freely select audio playback using playback devices near the user in this way, it becomes possible to play back audio that better suits the user's preferences.
  • FIG. 28A shows an example of parameters regarding the number of signals to be transmitted and the signal acquisition destination.
  • Each of the channel-based signals, object-based signals, and HOA system signals has parameters that describe the number of signals and the acquisition destination address (URL) in the case of Internet acquisition.
  • FIG. 29A shows an example in which audio signals to be transmitted are displayed for each program in the electronic program guide.
  • FIG. 29A shows the number of signals for each type of audio signal.
  • signals in parentheses represent signals obtained via the Internet. If information cannot be obtained from the Internet, an indication to that effect is displayed.
  • FIG. 29B shows an example in which a strikethrough line is displayed superimposed on an audio signal transmitted via the Internet when information cannot be obtained from the Internet. Alternatively, if information cannot be acquired, audio signals transmitted via the Internet may not be displayed.
  • the electronic program guide may include explanatory information regarding the sound sources that the user can select for reproduction.
  • the user can understand the degree of richness of the program audio, and this can be used as auxiliary information for program selection. Furthermore, it may be possible to reserve settings for audio playback to be used when viewing a program before the program starts. Furthermore, for frequently used settings, the setting contents may be recorded in the receiver and the recording may be called up to easily perform audio playback settings when viewing a program or making a reservation. This improves usability for the user.
  • the selection of the speaker system, the type of signal source being reproduced, and the number of signals may be displayed on the screen (FIG. 30).
  • the display may be performed all the time, or may be displayed for a predetermined time only when the power is turned on, when changing the channel, or when changing the state. Further, this display may be prohibited by the user. In this way, by displaying the current state, the user can appropriately grasp the state.
  • Some special headphones perform processing on audio signals taking into account the sound transmission characteristics of the user's head. This processing may be performed by an external device, or may be performed by a receiver by adding a processing function to the mixer and distributor 10105 shown in FIG. 15B. At this time, the programs, parameters, etc. necessary for the processing may be obtained from the headphone manufacturer's server. This makes it possible to achieve more precise audio playback.
  • First control information (digital_recording_control_data) and second control information (copy_restriction_mode) indicating copy control of the content are stored in control information such as MPEG-2 TS program sequence information (for example, the PMT) and the MMT MPT, and transmitted from the broadcast station side to the broadcast receiving apparatus 100.
  • the first control information is, for example, 2-bit data, and may be configured so that 00 indicates "copy is possible without any restrictions", 10 indicates "copy is possible for only one generation", and 11 indicates "copy prohibited".
  • when the first control information (digital_recording_control_data) is 00, indicating "copyable without any constraints", it may be combined with another 1-bit control information to identify two states: "copyable without constraints, and encryption processing is required during storage and output" and "copyable without constraints, and encryption processing is not required during storage and output".
  • the second control information (copy_restriction_mode) is, for example, 1-bit data, and may be configured so that 0 indicates "only one generation can be copied" and 1 indicates "copying a limited number of times is allowed".
  • "limited number of copies allowed" is a copy control state that permits copying a predetermined number of times; for example, if nine copies plus one move are allowed, this is the so-called "dubbing 10".
  • the second control information functions only when the first control information (digital_recording_control_data) is 10, indicating that "only one generation can be copied". In other words, when the first control information (digital_recording_control_data) is 10, the copy control of the content is either "copy with limited number" or "only one generation can be copied", depending on the value of the second control information.
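  • A minimal sketch of how a receiver might interpret the two pieces of control information described above; the bit values follow the text, while the function name and state labels are illustrative assumptions.

```python
def copy_control_state(digital_recording_control_data, copy_restriction_mode):
    """Interpret the first control information (2 bits) together with
    the second control information (1 bit) as described in the text."""
    if digital_recording_control_data == 0b00:
        return "copy without restriction"
    if digital_recording_control_data == 0b10:
        # the second control information is effective only in this case
        if copy_restriction_mode == 0:
            return "copy one generation only"
        return "copy with limited number"
    if digital_recording_control_data == 0b11:
        return "copy prohibited"
    return "undefined"
```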
  • according to the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode), the broadcast receiving apparatus 100 of the present embodiment may be configured to control storage of the content in the storage unit 110, recording to a removable recording medium, output to external equipment, copying to external equipment, moving to external equipment, and the like.
  • the storage processing target is not limited to the storage unit 110 inside the broadcast receiving device 100; it also includes external recording devices that have been subjected to protection processing such as encryption so that recording and playback are possible only by the broadcast receiving device 100.
  • Operations according to control information such as the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) will be described below.
  • For content indicated as "copyable without restrictions" by the first control information (digital_recording_control_data) included in the PMT of the MPEG-2 TS or the MPT of the MMT, the broadcast receiving apparatus 100 of this embodiment may perform storage in the storage unit 110, recording on a removable recording medium, output to an external device, copying to an external device, and moving to an external device without restriction.
  • this example applies to content that indicates "only one generation can be copied" by a combination of the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) included in the PMT of the MPEG-2 TS or the MPT of the MMT.
  • the broadcast receiving apparatus 100 can encrypt and store such content in the storage unit 110; however, when outputting the stored content to an external device for viewing, the content is encrypted and output together with copy control information indicating "re-copy prohibited".
  • However, so-called move processing to an external device (copying the content to the external device and rendering the content in the storage unit 110 of the broadcast receiving device 100 unplayable by erasure processing or the like) is possible.
  • this embodiment applies to content that indicates "copyable with limited number” by a combination of the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) included in the PMT of the MPEG-2 TS and the MPT of the MMT.
  • the broadcast receiving apparatus 100 can encrypt and accumulate such content in the storage unit 110; however, when outputting the accumulated content to an external device for viewing, the content is encrypted and output together with copy control information indicating "re-copy prohibited". A predetermined number of copies and moves to an external device may nevertheless be enabled; under the so-called "dubbing 10" rule, nine copy processes and one move process may be performed to the external device.
  • the broadcast receiving apparatus 100 of this embodiment prohibits storing in the storage unit 110 content indicated as "copy prohibited" by the first control information (digital_recording_control_data) included in the PMT of the MPEG-2 TS or the MPT of the MMT.
  • However, the broadcast receiving apparatus 100 has a "temporary storage" mode that allows data to be held in the storage unit 110 only for a predetermined time or for a time specified by control information included in the broadcast signal. In this mode, even if the first control information (digital_recording_control_data) included in the PMT of the MPEG-2 TS or the MPT of the MMT indicates "copy prohibited", the content can be temporarily held in the storage unit 110.
  • output for viewing to the aforementioned external device may be performed via the video output unit 193 in FIG. 2A, the digital I/F unit 125, the LAN communication unit 121, or the like.
  • the copying or moving process to the external device described above may be performed via the digital I/F section 125, the LAN communication section 121, etc. in FIG. 2A.
  • the broadcast receiving apparatus 100 of the present embodiment is equipped with a function to perform more suitable copyright protection processing (content protection processing) corresponding to audio content that includes the advanced audio signal.
  • FIG. 28B shows the data structure of the audio component descriptor (audio_component_descriptor) included in the PMT of MPEG-2 TS and MPT of MMT in the digital broadcasting system of this embodiment.
  • the audio component descriptor in FIG. 28B stores data describing various information regarding the audio component of the content transmitted by the digital broadcasting system of this embodiment.
  • information on the type of audio component included in the target content can be stored in the component_type (component type) data shown in FIG. 28B.
  • FIG. 28C shows a list of audio signal types that can be indicated by some bits in the audio component type data. For example, definitions similar to those used in conventional digital broadcasting are used for 00000 to 10001. These definitions include definitions of relatively popular audio signals such as single mono, stereo, 5.1ch, 7.1ch, and 22.2ch. Note that LFE (Low Frequency Effect) shown in FIG. 28C indicates a bass enhancement channel.
  • definitions of 10010 to 11010 are newly added as audio signal types that can be indicated by some bits in the audio component type data.
  • 7.1.4ch, which is 7.1ch plus four channels arranged above, is defined as data 10010.
  • 7.1.4ch+4obj, which adds four object signals to 7.1.4ch, is defined as data 10011.
  • 7.1.4ch+6obj, which adds six object signals to 7.1.4ch, is defined as data 10100.
  • 22.2ch+4obj, which adds four object signals to 22.2ch, is defined as data 10101.
  • HOA1, an HOA system signal with one signal, is defined as data 10110.
  • HOA4, an HOA system signal with four signals, is defined as data 10111.
  • HOA9, an HOA system signal with nine signals, is defined as data 11000.
  • HOA16, an HOA system signal with 16 signals, is defined as data 11001.
  • HOA25, an HOA system signal with 25 signals, is defined as data 11010.
  • the digital broadcasting system of this embodiment is capable of transmitting and receiving audio content that includes advanced audio signals.
  • audio signals ranging from a single monaural signal with audio component type data of 00001 to a 7.1.4ch signal with audio component type data of 10010 are channel-based signals.
  • the audio signal from the 7.1.4ch+4obj signal with audio component type data 10011 to the 22.2ch+4obj signal with audio component type data 10101 is the audio signal of the object-based signal.
  • the object-based signal is included in the advanced audio signal in this embodiment.
  • the HOA1 signal with audio component type data of 10110 to the HOA25 signal with audio component type data of 11010 are audio signals of the HOA system.
  • the HOA audio signal is included in the advanced audio signal in this embodiment.
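  • A minimal sketch of the identification described above, using the component type ranges given in this embodiment (00001-10010 channel-based, 10011-10101 object-based, 10110-11010 HOA); the function names and return labels are illustrative assumptions.

```python
def classify_component_type(component_type):
    """Classify the 5-bit audio component type value into the signal
    families described above.  Values outside the defined ranges are
    treated as undefined."""
    if 0b00001 <= component_type <= 0b10010:
        return "channel-based"
    if 0b10011 <= component_type <= 0b10101:
        return "object-based"      # advanced audio signal
    if 0b10110 <= component_type <= 0b11010:
        return "HOA"               # advanced audio signal
    return "undefined"

def is_advanced_audio(component_type):
    """Object-based and HOA signals are the advanced audio signals
    of this embodiment."""
    return classify_component_type(component_type) in ("object-based", "HOA")
```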
  • The audio component type data is stored in the audio component descriptor included in the PMT of MPEG-2 TS or the MPT of MMT; by transmitting this data to the broadcast receiving device 100, the broadcast receiving device 100 can identify whether or not the audio signal of the transmitted content is an advanced audio signal.
  • Next, a control example of copyright protection processing (content protection processing) for audio content in the broadcast receiving apparatus 100 will be described using FIGS. 31A to 32B.
  • the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) shown in FIGS. 31A to 32B are control information indicating copy control of the content; their definitions are as already explained.
  • the decoded audio output (analog) shown in FIGS. 31A to 32B is output processing of an audio signal in a decoded analog signal state.
  • This is a process in which, for example, the audio signal decoded by the core decoder in FIG. 15B is converted into an analog signal by a D/A converter (digital-to-analog converter), not shown, and output from the analog audio/video interface provided in the expansion interface section 124 in FIG. 2A.
  • the decoded audio output (digital) shown in FIGS. 31A to 32B is output processing of an audio signal in a decoded digital signal state. This is a process in which, for example, the audio signal decoded by the core decoder in FIG. 15B is outputted as a digital signal to an external device or the like via the audio output unit 196 in FIG. 2A. Note that the decoded audio signal may be output as a digital signal to an external device or the like via the digital I/F section 125, the LAN communication section 121, or the like.
  • stream output (IP interface) shown in FIGS. 31A to 32B is output processing of an audio signal in a stream format digital signal state. This is, for example, a process in which the core decoder in FIG. 15B does not perform decoding and outputs a digital signal in a stream format to an external device via the bitstream output controller 10106.
  • the broadcast receiving device 100 may output to the external device as an IP interface output via the LAN communication unit 121 in FIG. 2A, for example.
  • FIG. 31A is a table showing a control example of copyright protection processing when the broadcast receiving apparatus 100 outputs the audio component of the content as it is without accumulating it.
  • the definitions of the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) regarding the content have already been described.
  • the third control information (component_type) is the audio component type information explained with FIG. 28C. Based on it, the broadcast receiving apparatus 100 identifies whether the audio signal type of the content is a channel-based signal, an object-based signal, or an HOA system signal. Furthermore, when the audio component type information indicates an undefined value, the broadcast receiving apparatus 100 also identifies that fact.
  • If the third control information indicates an advanced audio signal, that is, an object-based signal or an HOA system signal, the output is controlled to be in a copy-prohibited state regardless of the combination of the first control information value and the second control information value.
  • This is because the content of an advanced audio signal is more valuable than the content of a channel-based audio signal. Therefore, in the control example of FIG. 31A, copy control of decoded audio output in the digital signal state is more restrictive for the content of object-based or HOA audio signals than for the content of channel-based audio signals. This makes it possible to implement copy control that corresponds to the value of the content according to the type of audio signal.
  • For content whose third control information indicates a channel-based audio signal, and for content whose third control information indicates undefined, the output is controlled as shown in FIG. 31A according to the first and second control information: when their combination indicates "copying possible without restriction", the output is controlled so that it can be copied without restriction, and when the first control information indicates "copy prohibited", the output is controlled to a copy-prohibited state.
  • The control for decoded audio output in the digital signal state shown in FIG. 31A is performed when the output unit for the decoded audio output supports a copy control method such as SCMS (Serial Copy Management System). If the output unit for the decoded audio output does not support copy control, output is permitted without copy restriction regardless of the combination of the first, second, and third control information values.
  • In the control example for audio signal output in a stream-format digital signal state via the IP interface shown in FIG. 31A, the copy control state settings are the same as in the control example for decoded audio output in the digital signal state, so repeated explanation is omitted. Here too, even when the combination of the first control information value and the second control information value indicates "copying possible without restriction", copy control is made more restrictive for the content of object-based or HOA audio signals than for channel-based audio signal content. This makes it possible to implement copy control that corresponds to the value of the content according to the type of audio signal.
  • For the audio signal output in a stream-format digital signal state via the IP interface shown in FIG. 31A, DTCP or DTCP2 is used as the copy control method. For content whose first control information value and second control information value indicate copy restrictions such as "only one generation can be copied", "copying with a limited number of copies", or "copy prohibited", output to external devices that do not support DTCP or DTCP2 is prohibited. Control in this respect differs from the control example for decoded audio output in the digital signal state.
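  • The decision just described for decoded audio output in the digital signal state can be sketched as a simple lookup. This is only an illustrative reading of the control example above, not the device's actual implementation; the state names, and the assumption that the remaining first/second combinations pass through as one-generation copyable, are hypothetical.

```python
def decoded_digital_output_state(first, second, third):
    """Illustrative FIG. 31A rule for decoded audio output in the
    digital signal state (output without accumulation).

    first  -- digital_recording_control_data (first control information)
    second -- copy_restriction_mode (second control information);
              carried for completeness, unused by the cases below
    third  -- audio component type (third control information)
    """
    # Advanced audio signals (object-based / HOA) are always output
    # copy-prohibited, regardless of the first/second control information.
    if third in ("object_based", "hoa"):
        return "copy_prohibited"
    # Channel-based or undefined content follows the first and second
    # control information.
    if first == "copy_without_restriction":
        return "copy_without_restriction"
    if first == "copy_prohibited":
        return "copy_prohibited"
    # Assumed: the remaining combinations ("only one generation",
    # "limited copies") pass through as one-generation copyable.
    return "one_generation_copyable"
```

  • For the stream-format output via the IP interface, the same state decision would additionally be gated on whether the destination device supports DTCP or DTCP2.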
  • the control example in FIG. 31B is a control example of copyright protection processing when the broadcast receiving apparatus 100 outputs the audio component of the content as it is without storing it, and is a modification of the control example in FIG. 31A.
  • the explanation of the points similar to the control example of FIG. 31A will be omitted, and only the points that are different from the control example of FIG. 31A will be explained.
  • In the control example of FIG. 31B, the control of decoded audio output in the analog signal state and the control of audio signal output in a stream-format digital signal state via the IP interface are the same as in the control example of FIG. 31A. On the other hand, the control of decoded audio output in the digital signal state differs from the control example of FIG. 31A.
  • Specifically, in the control example of FIG. 31B, if the combination of the first control information value and the second control information value indicates "copying possible without restriction", the decoded audio output in the digital signal state is output in a state where it can be copied without restriction, whichever value the third control information indicates. Likewise, if the combination indicates "only one generation can be copied" or "copying with a limited number of copies", only one generation is output in a copyable state in the decoded audio output in the digital signal state, whichever value the third control information indicates.
  • If the first control information indicates "copy prohibited", the decoded audio output in the digital signal state is output with copying prohibited. That is, in the control of decoded audio output in the digital signal state in the control example of FIG. 31B, there is no difference in copy restriction at output according to the type of audio signal content. This is because the decoded audio output in the digital signal state, even for the content of an advanced audio signal such as an object-based or HOA audio signal, may already have been processed by, for example, the mixer and distributor 10105 in FIG. 15B.
  • Meanwhile, since the content of an object-based or HOA audio signal is more valuable than channel-based audio signal content, it is preferable that copy control of such content be made more restrictive in the audio signal output in a stream-format digital signal state via the IP interface.
  • The control example of FIG. 31B of this embodiment described above thus also makes it possible to realize more suitable copy control that corresponds to the value of the content according to the type of audio signal.
  • FIG. 32A is a table showing a control example of copyright protection processing when audio components of content are stored in the broadcast receiving apparatus 100 and then output.
  • the storage process is performed by storing content including audio components in the storage unit 110 of the broadcast receiving device 100.
  • In the control example of FIG. 32A, the copy control state of the content in the stored state is controlled differently depending on the combination of the first control information value and the second control information value. Specifically, if the first control information indicates "copying possible without restriction", the content is stored in a state where it can be copied without restriction, regardless of the value of the second control information. At this time, the content may be stored in that state regardless of the value of the third control information.
  • If the first control information indicates "only one generation can be copied" and the second control information indicates "only one generation can be copied", the content is stored in a state where re-copying is prohibited. At this time, the content may be stored in that state regardless of the value of the third control information.
  • If the first control information indicates "only one generation can be copied" and the second control information indicates "copying with a limited number of copies", the content is stored in a state where copying with a limited number of copies is possible. At this time, the content may be stored in that state regardless of the value of the third control information.
  • If the first control information indicates "copy prohibited", the content may be stored in a temporary storage state regardless of the value of the second control information. At this time, the content may be stored in that state regardless of the value of the third control information.
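  • The storage-time decision above can be sketched the same way. The state names are hypothetical, and first/second combinations not spelled out in the control example are rejected rather than guessed.

```python
def storage_copy_state(first, second):
    """Illustrative FIG. 32A rule for the copy control state applied
    when content is accumulated in the storage unit 110. The result
    does not depend on the third control information."""
    if first == "copy_without_restriction":
        return "copy_without_restriction"   # whatever `second` says
    if first == "one_generation" and second == "one_generation":
        return "recopy_prohibited"          # re-copying prohibited
    if first == "one_generation" and second == "limited_copies":
        return "limited_copies"             # limited number of copies
    if first == "copy_prohibited":
        return "temporary_storage"          # whatever `second` says
    raise ValueError("combination not covered by this control example")
```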
  • As described above, the copy control state of the content in the stored state is controlled differently depending on the combination of the first control information value and the second control information value.
  • Regardless of the copy control state in the stored state, the decoded audio output in the analog signal state of the stored content is controlled to be output in a state where it can be copied without restriction. This is because analog signals are more susceptible to degradation through copy processing than digital signals, so copying need not be restricted as strictly as for digital signals.
  • For the decoded audio output in the digital signal state of the stored content, if the third control information indicates an advanced audio signal, that is, an object-based or HOA audio signal, the output is controlled to a copy-prohibited state regardless of the copy control state in the stored state.
  • This is because the content of an advanced audio signal is more valuable than channel-based audio signal content. Therefore, in the control example of FIG. 32A, copy control of the decoded audio output in the digital signal state for stored content is made more restrictive for the content of object-based or HOA audio signals than for channel-based audio signal content. This makes it possible to implement copy control that corresponds to the value of the content according to the type of audio signal.
  • For content whose third control information indicates a channel-based audio signal, and for content whose third control information indicates undefined, the decoded audio output in the digital signal state after storage is controlled as shown in FIG. 32A according to the copy control state in the stored state: if that state is a control state in which copying is possible without restriction, the output can be copied without restriction; if it is a control state in which re-copying is prohibited or in which copying with a limited number of copies is possible, the output is controlled so that only one generation can be copied; and if it is a temporary storage control state, the output is controlled to a copy-prohibited state.
  • This control for decoded audio output in the digital signal state is performed when the output unit for the decoded audio output supports a copy control method such as SCMS (Serial Copy Management System). If the output unit for the decoded audio output does not support copy control, output is permitted without copy restriction regardless of the copy control state in the stored state and the value of the third control information.
  • For the audio signal output in a stream-format digital signal state via the IP interface, if the copy control state in the stored state is a control state with copy restrictions, such as re-copying prohibited, copying with a limited number of copies, or temporary storage, the output is performed with re-copying prohibited regardless of the value of the third control information.
  • If the copy control state in the stored state is a control state in which copying is possible without restriction, the copy control state at output is switched according to the third control information: content whose third control information indicates a channel-based audio signal or undefined may be output in a state where it can be copied without restriction, while content whose third control information indicates an advanced audio signal, that is, an object-based or HOA audio signal, is controlled to be output with re-copying prohibited.
  • For this output, DTCP or DTCP2 is used as the copy control method. For content whose copy control state in the stored state is a control state with copy restrictions, such as re-copying prohibited, copying with a limited number of copies, or temporary storage, output to external devices that do not support DTCP or DTCP2 is prohibited.
  • the control example in FIG. 32B is a control example of copyright protection processing when the audio component of content is stored in the broadcast receiving apparatus 100 and then output, and is a modification of the control example in FIG. 32A.
  • the explanation of the points similar to the control example of FIG. 32A will be omitted, and only the points that are different from the control example of FIG. 32A will be explained.
  • In the control example of FIG. 32B, the control of decoded audio output in the analog signal state for stored content and the control of audio signal output in a stream-format digital signal state via the IP interface for stored content are the same as in the control example of FIG. 32A.
  • On the other hand, the control of decoded audio output in the digital signal state for stored content differs from the control example of FIG. 32A.
  • Specifically, in the control example of FIG. 32B, if the copy control state in the stored state is a control state in which copying is possible without restriction, the decoded audio output in the digital signal state of the stored content is output in a state where it can be copied without restriction, whichever value the third control information indicates.
  • If the copy control state in the stored state is a control state in which re-copying is prohibited or in which copying with a limited number of copies is possible, only one generation is output in a copyable state in the decoded audio output in the digital signal state of the stored content, whichever value the third control information indicates.
  • If the copy control state in the stored state is a temporary storage control state, the decoded audio output in the digital signal state of the stored content is output in a copy-prohibited state, whichever value the third control information indicates. That is, in the control of decoded audio output in the digital signal state in the control example of FIG. 32B, there is no difference in copy restriction at output according to the type of audio signal content.
  • the reason for this control is the same as the reason for controlling the decoded audio output in the digital signal state in the control example of FIG. 31B, which has already been explained, so a repeated explanation will be omitted.
  • In addition, the following control may be performed when outputting via the IP interface.
  • When the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) included in the PMT of the MPEG-2 TS or the MPT of the MMT indicate copy restrictions such as "only one generation can be copied", "copying with a limited number of copies", "copy prohibited", or "copying possible without restriction but encryption processing required when storing and outputting", output to an external device via the LAN communication unit 121 may be permitted only when the IP address of the external device that is the destination of the transmission packet from the broadcast receiving device 100 is within the same subnet as the IP address of the broadcast receiving device 100, and prohibited when the IP address of the external device is outside that subnet.
  • However, if the external device has been connected within the same subnet as the IP address of the broadcast receiving device 100 within a predetermined period and has been registered as a device that is permitted to view even from outside that subnet, content stored in the storage unit 110 of the broadcast receiving device 100 may be output to that external device as video and audio for viewing. In this case, the video and audio output for viewing is performed after encrypting the content.
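  • The same-subnet check above can be sketched with Python's standard ipaddress module. The /24 prefix, the address values, and the registration list below are illustrative assumptions; an actual receiver would take the subnet mask from its network configuration and would additionally encrypt the output.

```python
import ipaddress

def may_output_over_ip(receiver_ip, receiver_prefix, dest_ip, registered=()):
    """Permit IP-interface output only inside the receiver's subnet,
    unless the destination was registered (while inside the subnet,
    within a predetermined period) for viewing from outside it."""
    subnet = ipaddress.ip_network(f"{receiver_ip}/{receiver_prefix}",
                                  strict=False)
    if ipaddress.ip_address(dest_ip) in subnet:
        return True
    # Outside the subnet: allowed only for pre-registered devices.
    return dest_ip in registered

# Hypothetical addresses:
assert may_output_over_ip("192.168.1.10", 24, "192.168.1.20") is True
assert may_output_over_ip("192.168.1.10", 24, "10.0.0.5") is False
assert may_output_over_ip("192.168.1.10", 24, "10.0.0.5",
                          registered=("10.0.0.5",)) is True
```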
  • FIG. 33 is a diagram illustrating an example of a reverberation sound processing flow when headphones are used.
  • In this example, the broadcast receiving device 100 obtains the reverberant sound signal for each sound based on the sound field characteristics, generates a reverberation-added audio signal in which the reverberant sound is added to the original sound, mixes the reverberation-added audio signals, and transmits the result to headphones for sound reproduction.
  • the sound field characteristics are the acoustic characteristics of the sound field where audio is recorded, that is, the spatial transfer function of sound including reverberant sound.
  • The sound field characteristics are not limited to a form that directly indicates the spatial transfer function of sound; they may instead be reference data for estimating the transfer function, such as the shape of structures (for example, walls) at the recording location and the sound reflection and absorption characteristics of each structure.
  • Information representing the sound field characteristics and information representing the position of each sound source in the sound field are described in broadcast metadata included in the broadcast wave, for example, in the metadata of each object-based audio signal included in the broadcast wave. The broadcast receiving device 100 acquires the sound field characteristics from the broadcast metadata included in the received broadcast wave, reflects the sound field characteristics during audio rendering to generate a reverberation-added audio signal, and thereby enables audio playback with a high sense of presence.
  • Audio received and played back via broadcast waves otherwise has difficulty reflecting the reverberation characteristics of concert halls, theaters, and the like, and may lack a sense of realism. This control example is particularly effective in solving such problems.
  • The broadcast receiving device 100 may also obtain the sound field characteristics via a network such as the Internet, using information that can specify the name of the recording location, such as a hall or theater, obtained from the broadcast program name and its additional information. For example, information in which hall names or theater names are associated with their sound field characteristics is stored in advance in the information server 900 shown in FIG. 1. The broadcast receiving device 100 acquires the sound field characteristics associated with the specified hall name or theater name from the information server 900 via the network through the communication unit. Furthermore, the broadcast receiving device 100 may receive the reverberant sound as an object-based sound source, adjust its intensity, frequency characteristics, and the like according to the viewer's preferences and viewing environment, and synthesize it with the other sounds.
  • headphones are an example of an audio output device having multiple speakers.
  • the audio output device may be a head mounted display, a speaker system, or the like, as described later.
  • An audio recording location is a location where audio is converted into an electrical signal using a microphone, pickup, etc. and recorded, such as a concert hall or theater where a performance or play is held.
  • In FIG. 33, block 20010 corresponds to the object renderer 10103.
  • The processing when headphones are used, described above in (Example 2) [Advanced Audio Signal], corresponds to the spatial position corrections 20012 and 20022. In addition, reverberation additions 20013 and 20023 and head-related transfer corrections 20014 and 20024 are performed.
  • Note that the background sound 20021 may already contain reverberation. Therefore, in this control example, compared with the reverberant sound added to the individual sound 20011, the amount of reverberant sound added to the background sound 20021 may be reduced, or no reverberant sound may be added to it at all.
  • Reverberation additions 20013 and 20023 are performed by calculation based on the sound field characteristics extracted from broadcast metadata, such as the metadata of the object audio (individual audio for each sound source) signal, or by calculation using a spatial transfer function simulated from the structure of the recording location. More specifically, the reverberation additions 20013 and 20023 are performed, for example, by convolving an impulse response with each object audio and background sound, by filtering with frequency attenuation characteristics for each delay time, and the like. Note that if the sound field characteristics are not described in the metadata of the object audio signal, the theater name may be extracted from the additional information of the program and the sound field characteristics of that theater obtained from the information server 900.
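  • The impulse-response convolution mentioned above can be sketched in a few lines. A pure-Python direct convolution is shown for clarity; a real receiver would use fast (FFT-based) convolution on blocks of PCM samples, and the two-tap impulse response below is a made-up example, not real sound field data.

```python
def convolve(signal, impulse_response):
    """Direct convolution: each impulse-response tap adds a delayed,
    scaled copy of the dry signal (i.e. one reflection)."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

# Dry click through a toy "hall" response: the direct sound plus one
# reflection of half amplitude arriving two samples later.
dry = [1.0, 0.0, 0.0]
ir = [1.0, 0.0, 0.5]
wet = convolve(dry, ir)   # -> [1.0, 0.0, 0.5, 0.0, 0.0]
```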
  • The broadcast receiving device 100 acquires the sound source position information in the theater from the metadata of the object audio signal, and when a viewing position, that is, the virtual viewer's position in the theater, is specified, it can determine the spatial transfer function based on that position. Based on the obtained spatial transfer function, the broadcast receiving device 100 generates the reverberant sound signal so that the reverberation approaches the sound that would be heard at the viewing position for each sound source. By generating such a reverberation-added audio signal, audio with an even higher sense of presence can be reproduced. Note that if the sound field characteristics cannot be obtained directly, the spatial transfer function may be determined by simulation from information on the internal structure and materials of the theater.
  • the theater's internal structure/material information includes, for example, elements such as room size, shape, ceiling height, and interior material.
  • The head-related transfer function represents the relationship between sound arriving from each direction and the sound reaching the entrance of the external auditory canal in each ear. Head-related transfer characteristics vary from person to person depending on the shape of the auricle and other factors. If no personally registered head-related transfer function exists, the audio is converted by the head-related transfer correction 20014 using a standard head-related transfer function.
  • the sound emitted from the headphones reaches the eardrum through the ear canal, and the listener recognizes the sound.
  • the ear canal transfer function from the headphones to the eardrum may vary not only depending on individual listeners, but also depending on whether the headphones are closed-type headphones that cover the pinna of the ear or earphone-type headphones that are inserted into the ear canal.
  • For this reason, the head-related transfer correction 20014 may perform not only correction processing based on the head-related transfer function but also correction processing based on the ear canal transfer function. In this case, it is desirable, for example, to specify the transfer functions by acquiring the type information of the headphones and the identification information of the listener, and to perform the correction processing based on the specified transfer functions.
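  • Selecting the correction filters from the headphone type information and listener identification, with a fall-back to the standard head-related transfer function, can be sketched as a table lookup. Every key and value below is hypothetical (in practice the values would be impulse responses, not strings).

```python
# Hypothetical registry: (headphone model, listener id) -> personal HRTF,
# headphone model -> ear canal correction, plus a standard fallback.
STANDARD_HRTF = "standard_hrtf"
PERSONAL_HRTF = {("model_hhh", "listener_1"): "hrtf_listener_1"}
CANAL_CORRECTION = {"model_hhh": "closed_type_correction"}

def select_correction(model, listener):
    """Return (head-related transfer function, ear canal correction).
    Falls back to the standard HRTF when no personal one is registered;
    the canal correction is None for an unknown headphone model."""
    hrtf = PERSONAL_HRTF.get((model, listener), STANDARD_HRTF)
    canal = CANAL_CORRECTION.get(model)
    return hrtf, canal

assert select_correction("model_hhh", "listener_1") == (
    "hrtf_listener_1", "closed_type_correction")
assert select_correction("model_xyz", "guest") == (STANDARD_HRTF, None)
```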
  • FIG. 33 illustrates the case of one individual sound and one background sound, but when there are multiple individual sounds or background sounds, similar processing may be performed on each individually and the processed sounds then mixed. Furthermore, for individual audio such as narration, the strength or presence of the reverberation addition processing may be changed depending on the type of individual audio, for example reducing or eliminating reverberation to make it easier to listen to. In addition, although an example has been described in which spatial position correction 20012, reverberation addition 20013, and head-related transfer correction 20014 are processed sequentially, a composite transfer function combining them may be calculated and the individual sound processed with the composite transfer function.
  • Note that if the head-related transfer correction is common to each sound, it may be processed all at once after the mixing 20015, or it may be processed by the headphones 910. When the processing is performed by the headphones 910, the broadcast receiving device 100 does not need to know the type and orientation of the headphones 910, so when there are multiple viewers it becomes easy to handle each viewer individually.
  • Further, the reverberant sound may be approximated and transmitted as one object audio sound source. In that case, the intensity of the reverberant sound may be adjusted separately from the background sound according to the viewer's preference.
  • FIG. 34A is a diagram showing an example of the audio output setting menu.
  • each tag indicates a setting target. By selecting a tag, the viewer can open the setting menu for the setting target associated with that tag.
  • For example, the tag "Built-in SP" is associated with the settings of the television's built-in speakers, the tag "Optical IF" with the settings of the optical digital interface, the tag "ARC" with the settings of the HDMI Audio Return Channel, the tag "HP1" with the settings of Head Phone 1 (assuming a Bluetooth (registered trademark) connection), and the tag "HP2" with the settings of Head Phone 2 (assuming a 3.5 mm mini jack).
  • FIG. 34A shows a state in which the tag "HP1" is selected and the settings menu 20032 for Head Phone 1 is opened.
  • "model number hhh” represents the model number of the headphones.
  • the broadcast receiving apparatus 100 may obtain standard head-related transfer function information used for the head-related transfer correction 20014 in FIG. 33 from the information server 900 based on the model number of this headphone.
  • "Channel base On” indicates that the channel base audio signal is turned on.
  • The mark next to each item is a mark for setting On/Off; when the cursor is placed on this mark, the options "On" and "Off" appear, allowing the viewer to set On/Off.
  • “>Details” is a button for opening a detailed settings menu, and when this button is pressed, a detailed settings menu for the corresponding setting target is opened.
  • Object-based On indicates that the object-based audio signal is turned on.
  • Below this, a summary display shows how many object sounds there are in total and how many of them are On. For example, when "3 out of 5 objects selected" is displayed, there are five object sounds in total and three of them are turned On. When "Object Base Off" is displayed, all object sounds are turned Off.
  • An example of the detailed setting menu for object-based signals is the detailed menu shown in FIG. 27, in which On/Off can be set for each individual sound.
  • Theatre AAA is the name of the theater to which reverberation sound is added.
  • FIG. 34B is a diagram showing an example of a detailed menu for reverberation sound settings.
  • In this embodiment, the viewing position for viewing the content is assumed to be the same position as the listening point, which is the position for listening to the sound, and as the position for viewing the video; however, they may differ, for example when the display size changes. In the following description, the term listening point is used instead of viewing position.
  • In FIG. 34B, a reverberation sound detail setting field 20033 is displayed on the left side of the screen, and a seating layout diagram of the theater where the performance is held, seen from above, is displayed on the right side of the screen.
  • the reverberation sound detailed setting field 20033 is provided with a reverberation sound setting area for setting reverberation sound and a listening point setting area for setting listening point.
  • In the reverberation sound setting area, the name of the selected theater and the effect strength of the reverberant sound are displayed, and the desired values can be selected from the options using the selection marks displayed beside them. If the theater name can be extracted from program information, such as program metadata, the theater name is automatically selected based on the program information.
  • FIG. 34B shows an example in which theater AAA is set. It is also possible to select an alternative theater other than theater AAA. An alternative theater is a theater other than theater AAA, where the performance is actually held, such as theater BBB. When an alternative theater is selected, the acoustics of that alternative theater are reproduced by simulation. Since the sound field characteristics of an alternative theater such as theater BBB cannot be obtained from the broadcast program, a theater name obtainable from the information server 900 can be searched for and set.
  • The effect strength of the reverberant sound is initially set to the default or to the stored previous setting. The effect strength can be selected from, for example, 10 levels, with the default value recommended by the program or audio metadata being 5. An effect strength value of 0 means no reverberation. FIG. 34B shows an example in which 5 is set as the effect strength. Note that if the metadata of the reverberation sound information cannot be obtained, "-" is displayed, for example.
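  • One plausible reading of the 10-level effect strength is a gain applied to the reverberant (wet) component before mixing it with the dry sound, with level 0 muting the reverberation and level 5 giving the metadata-recommended amount. The linear mapping below is an assumption of this sketch, not something specified by the menu.

```python
def apply_effect_strength(dry, wet, level):
    """Mix dry and reverberant samples; `level` is 0..10, where 0 means
    no reverberation and 5 is the recommended default amount."""
    gain = level / 5.0          # assumed linear scaling around level 5
    return [d + gain * w for d, w in zip(dry, wet)]

dry = [1.0, 0.5]                # direct sound samples
wet = [0.25, 0.125]             # reverberant component samples
assert apply_effect_strength(dry, wet, 0) == [1.0, 0.5]     # reverb off
assert apply_effect_strength(dry, wet, 5) == [1.25, 0.625]  # default mix
```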
  • the settings of the selected listening point are displayed in the listening point setting area.
  • The setting of the listening point can be selected from, for example, three types: "Standard (A)", "Synchronize with video", and "(B) setting".
  • "Standard (A)" is a setting in which the default listening point indicated by the program information or audio metadata is selected.
  • "Synchronize with video" is a setting in which the sound field is also synchronized when a portion of the video is enlarged as shown in FIG. 34D, described later. Note that "Synchronize with video" is also referred to as the "image and sound image linkage" mode.
  • "(B) setting" is a setting in which the user determines the listening point by moving the cursor (B) on the seating arrangement map on the right.
  • FIG. 34B shows an example in which the default listening point 20034 (represented by mark A) is displayed at the center of the seating arrangement map. Further, FIG. 34B shows an example in which "(B) setting" is selected and the listening point 20035 (represented by mark B) in this case is displayed at the front of the seating arrangement map.
  • The listening point can be moved by, for example, operating the cursor of the listening point using direction marks displayed near the cursor.
  • FIG. 34C is a diagram illustrating an example of a banner display indicating the reverberation processing state.
  • FIG. 34C shows an example in which the entire stage of a program called Maxell Performance is displayed as a broadcast image, and shows a banner display 20041 when reverberation sound assuming an AAA theater is added.
  • FIG. 34D is a diagram illustrating an example of a banner display when a portion of a broadcast image is enlarged and displayed by a predetermined remote control operation or the like.
  • FIG. 34D shows an example in which the broadcast image of FIG. 34C is partially enlarged so that the dancers included in the broadcast image are displayed in a larger size. The enlarged portion may be moved using the up, down, left, and right buttons on the remote control. If "Synchronize with video" is not selected as the listening point setting, that is, if the "image and sound image linkage" mode is Off, the spatial position correction of the sound is not linked even when the broadcast image is partially enlarged, and the sound field remains unchanged.
  • the "image and sound image linkage" mode may be set independently for the case where a portion of the image is enlarged and the case where the entire image is displayed.
  • for example, the "image and sound image linkage" mode may be set to On while the magnification is below a predetermined magnification (for example, 2x), and set to Off when the magnification is at or above that value.
  • FIG. 34E is a diagram for explaining viewing conditions before and after partially enlarged display of a broadcast image.
  • FIG. 34E shows an example of a view from above of a performance venue for a program to be broadcast.
  • FIG. 34E shows, as an example, an area 20051 displayed on the screen before partial enlargement display, and an area 20061 displayed on the screen after partial enlargement display. Note that the performer 20052 is shown as a front view to facilitate understanding of the explanation.
  • the screen area 20061 after partial enlargement corresponds to a smaller display area in real space than the screen area 20051 before partial enlargement. This can be regarded as the viewing point approaching from point 20053 (represented by mark C) to point 20063 (represented by mark D). That is, the distance from the viewer to the position corresponding to the display area can be considered to shorten from the distance 20054 before partial enlargement to the distance 20064 after partial enlargement.
  • the viewing angle 20065 after partially enlarged display of the broadcast image is larger than the viewing angle 20055 before partially enlarged display.
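The geometric relationship above (virtual approach and widening viewing angle) can be sketched numerically. The function and its parameter names below are illustrative assumptions, not values from the specification; it simply models d' = d / magnification and the resulting angle subtended by the performer.

```python
import math

def performer_view(performer_width_m: float, distance_m: float, magnification: float):
    """Viewing geometry for partial enlargement (FIG. 34E model, illustrative).

    Partially enlarging the image by `magnification` is treated as the viewing
    point virtually approaching the stage: d' = d / magnification, so the
    viewing angle subtended by the performer grows accordingly.
    """
    virtual_distance = distance_m / magnification
    viewing_angle = 2.0 * math.degrees(
        math.atan(performer_width_m / (2.0 * virtual_distance)))
    return virtual_distance, viewing_angle
```

For example, doubling the magnification halves the virtual distance and roughly doubles the (small) viewing angle, matching the relation between distances 20054/20064 and angles 20055/20065.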
  • a state in which processing changes the direction from which the sound comes in accordance with changes in the viewing angle is referred to as the "image and sound image linkage" mode being On.
  • when the mode is On, the performer appears larger when the broadcast image is partially enlarged, and the direction from which the sound comes changes in conjunction with this. That is, the viewer gets the sense of approaching the stage, that is, the performers, recreating the experience of moving within a theater.
  • a state in which the processing that changes the direction of the sound in accordance with changes in the viewing angle is not in effect is referred to as the "image and sound image linkage" mode being Off.
  • when the mode is Off, the performer appears larger when the broadcast image is partially enlarged, but the direction from which the sound comes does not change. That is, the viewer gets the feeling of viewing the stage through binoculars, recreating the experience of staying at a fixed location in a theater.
  • switching the "image and sound image linkage" mode between On and Off may be performed directly with a remote control button, or may be performed through menu settings.
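The On/Off behaviour of the linkage mode can be sketched as a change in the apparent azimuth of a sound source. This is a minimal sketch under the virtual-approach assumption (d' = d / magnification); the function and parameter names are illustrative and not the patented implementation.

```python
import math

def source_azimuth_deg(lateral_offset_m: float, distance_m: float,
                       magnification: float, linkage_on: bool) -> float:
    """Azimuth (degrees, 0 = straight ahead) of a source as heard by the viewer.

    With the "image and sound image linkage" mode On, partial enlargement by
    `magnification` moves the virtual listening point closer, so off-centre
    sources spread outward; with the mode Off the sound field is unchanged.
    """
    d = distance_m / magnification if linkage_on else distance_m
    return math.degrees(math.atan2(lateral_offset_m, d))
```

With the mode On, a source 1 m off-centre at 10 m heard at 2x magnification appears at a wider azimuth than with the mode Off, reproducing the "moving in a theater" versus "binoculars" distinction above.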
  • furthermore, the sound field characteristics of the viewing environment may be measured, and head-related transfer correction performed to suppress the reverberant sound of the viewing environment based on the measured characteristics. This makes it possible to further enhance the sense of being in the theater.
  • <HMD viewing> Similar processing can be performed even when viewing using an HMD.
  • the HMD 920 receives video and audio data from the broadcast receiving device 100 and plays the video and sound, providing a viewing environment as if the viewer were watching the monitor section 192 of the broadcast receiving device 100 and listening to sound output from the headphones 910.
  • a standard viewing state for television broadcasting, that is, a state in which the video is displayed in front of the viewer, may be established.
  • the broadcast receiving device 100 may be provided with two modes for setting the viewing state of the viewer: a mode in which attitude information of the HMD 920 is obtained from the HMD 920 and video generated based on that attitude information is transmitted to the HMD 920, and a mode in which the viewing state is fixed to the standard viewing state.
  • the mode settings may be switched by menu settings or remote control button instructions.
  • the broadcast receiving apparatus 100 may be provided with a mode in which the display of the monitor unit 192 is synchronized with the HMD display, and may be configured to switch between synchronous and asynchronous modes in accordance with menu operations.
  • the broadcast receiving device 100 may also receive position information or acceleration information of the HMD 920, detect changes in the position of the HMD 920 to understand the viewer's movements, and enlarge or reduce the displayed image according to those movements to bring out a sense of realism. For example, if a movement of the viewer approaching the displayed image or stepping forward is detected, the displayed image is enlarged, and if a movement of the viewer moving away from the displayed image or stepping backward is detected, the displayed image is reduced.
  • when performing such enlargement/reduction control of the displayed image, the user may be able to set the enlargement/reduction ratio of the displayed image according to the distance by which the viewer approaches or moves away from the displayed image. A user interface for this setting (for example, a remote control operation button or gesture detection) may be provided.
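The head-motion-driven enlargement/reduction described above can be sketched as a simple mapping from forward/backward displacement to a zoom factor. The `sensitivity` parameter stands in for the user-settable ratio; all names and limits are illustrative assumptions.

```python
def zoom_from_head_motion(dz_m: float, sensitivity: float = 0.5,
                          z_min: float = 0.5, z_max: float = 4.0) -> float:
    """Map head displacement (metres, positive = toward the image) to a zoom
    factor: approaching enlarges, retreating reduces, clamped to sane bounds.
    """
    zoom = 1.0 + sensitivity * dz_m
    return max(z_min, min(z_max, zoom))
```

A step forward of 1 m with the default sensitivity yields a 1.5x enlargement; a step back of 0.5 m yields 0.75x.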
  • the broadcast receiving device 100 may also have a mode that changes the direction and sense of distance of the sound in accordance with the enlargement/reduction of the displayed image, that is, a mode in which the image and the sound image are linked, and a user interface for turning the mode On/Off may be provided.
  • the video and sound image conversion processing according to the position information, acceleration information, posture information, etc. of the HMD 920 may be performed on the HMD 920 side instead of on the broadcast receiving device 100 side. If the HMD 920 performs the video and audio conversion processing, then even if the broadcast receiving device 100 provides the same video and audio data to each HMD 920 of multiple viewers, each viewer can independently view images and sounds that match his or her own position and posture.
  • Example 3 [Switching sound image localization mode] This embodiment relates to handling of switching of sound image localization mode.
  • the HMD (Head Mounted Display), earphones, or headphones worn by the user are configured such that the sound image is localized to the image displayed on the screen of the broadcast receiving device 100 in response to changes in the front direction of the user's face.
  • the broadcast receiving apparatus 100 takes in information regarding the front direction of the user's face and performs audio processing so that sound is emitted from an audio output unit such as headphones.
  • a mode in which the sound image is localized to the image displayed on the screen of the broadcast receiving apparatus 100 in response to changes in the front direction of the user's face is referred to as the variable sound image localization mode.
  • the mode in which the broadcast receiving apparatus 100 executes audio output processing assuming that the front direction of the user's face is fixed is referred to as the fixed sound image localization mode.
  • FIG. 14A is a diagram showing a planar positional relationship between the broadcast receiving device 100, the HMD, an audio output unit such as headphones and earphones, and the user's head.
  • the standard viewing position in this case is the midpoint of the line segment connecting the left and right audio output sections.
  • the forward direction of the straight line that passes through the midpoint of the line segment connecting the left and right audio output units and is perpendicular to that line segment is taken as the front direction of the user's face.
  • FIG. 14B shows a case where the standard viewing position is on a line (Y-axis) from the center of the screen of the broadcast receiving device 100 orthogonal to the longitudinal direction (X-axis) of the screen.
  • the angle "θ" is the angle between the front direction of the user's face described in FIG. 14A and the center direction of the screen of the broadcast receiving device 100, and indicates how far the user's face is turned sideways from the center direction of the screen.
  • "θ" is an example of the "azimuth angle" in the present invention.
  • the position of the audio output unit such as the HMD, earphones, or headphones can be obtained by, for example, recording, based on user input, the position at which the user faces the center of the receiver screen, and then detecting subsequent movement with a gyro sensor or the like mounted on the audio output unit. Furthermore, the position of the audio output unit of the HMD, earphones, headphones, etc. can be calculated by the broadcast receiving apparatus 100 receiving position information from a system such as GPS via the left and right audio output units.
  • alternatively, the audio output unit may calculate its own position from position information provided by a system such as GPS and output the calculated position information to the broadcast receiving device 100. Further, the broadcast receiving device 100 and the audio output unit can exchange position information by each including another communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • the broadcast receiving device 100 is configured such that it can set "θ" in FIG. 14B to zero degrees and calculate and output sound information including audio information. That is, assuming that the user's front direction faces the center of the screen, the broadcast receiving device 100 calculates sound information including audio information, and outputs the calculated sound information to an audio output unit such as earphones or headphones.
  • in this way, the signal to be output from the audio output section is determined.
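The fixed/variable distinction can be sketched as a simple constant-power stereo panner that compensates the face angle θ in the variable mode. This is a minimal sketch under that assumption; the function, its parameters, and the panning law are illustrative, not the decoder processing actually defined in the specification.

```python
import math

def pan_gains(source_az_deg: float, theta_deg: float, variable_mode: bool):
    """Left/right gains for a source at `source_az_deg` relative to the screen.

    Variable mode: subtract the face angle theta so the sound image stays
    anchored to the screen. Fixed mode: theta is treated as zero.
    """
    az = source_az_deg - (theta_deg if variable_mode else 0.0)
    az = max(-90.0, min(90.0, az))        # clamp to the frontal half-plane
    pan = math.radians(az + 90.0) / 2.0   # 0..pi/2 maps left..right
    return math.cos(pan), math.sin(pan)   # constant-power (left, right)
```

For a centred source, turning the head 30 degrees in the variable mode shifts the image toward the left channel (so it stays at the screen), while in the fixed mode the gains remain equal.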
  • FIG. 35 is a flowchart showing the sound image localization mode switching operation of this embodiment.
  • in step S201, it is determined whether the angle "θ" formed by the screen center direction and the front direction of the user's face is fixed or non-fixed. If the angle "θ" is fixed (step S201: Yes), the process proceeds to step S202, and if the angle "θ" is not fixed (step S201: No), the process proceeds to step S204.
  • whether the angle "θ" formed by the screen center direction and the front direction of the user's face is fixed or non-fixed can be set and determined by various methods. Details will be described later.
  • in step S203, the broadcast receiving device 100 outputs the calculation information calculated in the audio decoders 146S, 146U, etc. to an audio output unit such as the HMD, earphones, or headphones, and the audio output unit emits it as sound to the user.
  • in step S204, the broadcast receiving device 100 receives information indicating that the angle "θ" formed by the screen center direction and the front direction of the user's face is not fixed, calculates "θ", adds the calculated value of "θ" to the attribute information of the sound information including the audio information, and calculates the calculation information to be output to the audio output unit such as earphones or headphones.
  • the calculation may be performed in the audio decoders 146S and 146U of FIGS. 2F and 2G in the broadcast receiving apparatus 100. More specifically, the calculation may be performed in the audio decoder 10100 of FIG. 15B, which is to be incorporated into the audio decoders 146S, 146U.
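The branch structure of the FIG. 35 flow (S201-S204) can be sketched as follows. The dictionary-based `sound_info` and its keys are illustrative stand-ins for the audio information handed to the audio decoders 146S/146U, not structures from the specification.

```python
def localization_output(fixed_mode: bool, measured_theta_deg: float,
                        sound_info: dict) -> dict:
    """Sketch of the S201-S204 flow: decide theta, then build the output."""
    if fixed_mode:                   # S201: Yes
        theta = 0.0                  # S202: treat the face as screen-centred
    else:                            # S201: No
        theta = measured_theta_deg   # S204: use the measured angle
    out = dict(sound_info)
    out["theta_deg"] = theta         # attribute used in the decoder calculation
    return out                       # S203: sent to HMD/earphones/headphones
```

In the fixed mode the attached angle is always zero; in the non-fixed mode the measured value is carried through to the output calculation.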
  • FIG. 36A shows an example of an external view of a remote control 180R used to input a sound image localization mode setting instruction to the broadcast receiving apparatus 100 of this embodiment.
  • the remote controller 180R shown in FIG. 36A and the remote controller 180R shown in FIG. 12B have the same key arrangement.
  • the remote control 180R includes a power key 180R1 for powering the broadcast receiving device 100 on/off (standby on/off), cursor keys (up, down, left, right) 180R2 for moving the cursor up, down, left, and right, a determination key 180R3 for confirming the item at the cursor position as the selected item, and a return key 180R4. To switch the audio mode, the menu key 180RA is pressed.
  • FIG. 36B is a diagram showing an example of a banner display for setting the sound image localization mode displayed on the monitor unit 192 of the broadcast receiving device 100 when the menu key 180RA is pressed.
  • a banner display 192B1 indicating the type of menu is displayed on the monitor unit 192.
  • the shape of the banner display 192B1, the display position on the monitor unit 192, and the items of the banner display 192B1 can be arbitrarily set by the broadcast receiving apparatus 100.
  • the items of the banner display 192B1 include at least an item for setting the sound image localization mode (as an example, there is item information of "audio setting", but the name is not limited).
  • with the banner display 192B1 displayed, when the cursor keys (up, down, left, right) 180R2 are used to move the cursor to select "Audio Settings" and the determination key 180R3 is pressed, a banner display 192B2 is displayed. Items related to audio settings are displayed on the banner display 192B2. The items of the banner display 192B2 include at least an item for setting the sound image localization mode (as an example, there is item information of "sound image localization", but the name is not limited). With the banner display 192B2 displayed, when the cursor keys (up, down, left, right) 180R2 are used to move the cursor to select "sound image localization" and the determination key 180R3 is pressed, a banner display 192B3 is displayed.
  • the items in the banner display 192B3 include at least items for setting the sound image localization mode to "fixed" or "non-fixed" (for example, there is item information for "fixed" and "non-fixed", but the names are not limited; for example, "non-fixed" may be renamed "variable", with the function of the sound image localization mode remaining the same).
  • the remote control 180R transmits information that fixes the angle "θ" formed by the center direction of the screen and the front direction of the user's face (hereinafter referred to as "fixed information") to the broadcast receiving device 100.
  • the "fixed information" can be communicated by the remote controller 180R and the broadcast receiving apparatus 100 each being equipped with another communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • in the non-fixed case, the remote control 180R performs the following operation. That is, the remote control 180R transmits information indicating that the angle "θ" formed by the screen center direction and the front direction of the user's face is non-fixed (hereinafter referred to as "non-fixed information") to the broadcast receiving device 100. The "non-fixed information" can be communicated by the remote control 180R and the broadcast receiving apparatus 100 each being equipped with another communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • the broadcast receiving device 100 that has received the "fixed information" or "non-fixed information" sets the angle "θ" formed by the center direction of the screen and the front direction of the user's face to a fixed value (for example, 0 degrees) or to the measured angle "θ", and calculates the calculation information to be output to an audio output unit such as the HMD, earphones, or headphones.
  • the calculation may be performed in the audio decoders 146S and 146U of FIGS. 2F and 2G in the broadcast receiving apparatus 100. More specifically, the calculation may be performed in the audio decoder 10100 of FIG. 15B, which is to be incorporated into the audio decoders 146S, 146U.
  • in this way, the broadcast receiving apparatus 100 can calculate and output sound information including audio information with "θ" in FIG. 14B fixed (zero degrees as an example). Therefore, when the front direction of the user's face cannot be accurately obtained, or when the user feels uncomfortable with the sound emission direction, the sound image localization direction can be fixed.
  • FIG. 37 is a flowchart showing the "θ" correction operation when the sound image localization mode of this modification is non-fixed.
  • in step S301, the broadcast receiving device 100 determines whether the reset button has been pressed. If the reset button has been pressed (step S301: Yes), the broadcast receiving device 100 proceeds to step S302. If the reset button has not been pressed (step S301: No), the broadcast receiving device 100 proceeds to step S303.
  • a reset button may be newly provided on the remote control 180R (not shown). Alternatively, as described above, it may be determined that the reset button has been pressed when the menu key 180RA is pressed, a reset area (not shown) displayed on the monitor unit 192 of the broadcast receiving device 100 is selected using the cursor keys (up, down, left, right) 180R2, and the determination key 180R3 is pressed.
  • one of the color keys (blue, red, green, yellow) 180RD may be configured to serve as the reset button.
  • the reset button may be provided on the HMD, earphones, or headphones, and information on the press of the reset button may be output to the broadcast receiving device 100.
  • transmission of the information that the reset button has been pressed is realized by each device including another communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
  • in step S302, the broadcast receiving device 100 sets "θ" by taking the direction of the user's face at the time the reset button is pressed as the new front direction of the user's face.
  • in step S303, the broadcast receiving device 100 uses the audio decoders 146S, 146U, etc. to calculate the audio information using the "θ" based on the newly set front direction of the user's face, outputs the calculated calculation information to an audio output unit such as the HMD, earphones, or headphones, and the audio output unit emits it as sound to the user.
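The reset flow (S301-S303) amounts to tracking θ relative to a resettable reference front direction. The class below is a minimal sketch; the raw `heading_deg` sensor value (e.g., from an HMD gyro) and the class interface are illustrative assumptions.

```python
class FaceDirectionTracker:
    """Track theta relative to a resettable reference front direction."""

    def __init__(self):
        self.reference_deg = 0.0

    def reset(self, heading_deg: float) -> None:
        # S302: the direction currently faced becomes the new front direction.
        self.reference_deg = heading_deg

    def theta(self, heading_deg: float) -> float:
        # S303: theta used in the audio calculation, wrapped to (-180, 180].
        return ((heading_deg - self.reference_deg + 180.0) % 360.0) - 180.0
```

After a reset at the current heading, θ is zero until the head turns again, which is exactly the correction the reset button provides when the front direction was mis-recognized.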
  • even if the broadcast receiving device 100 incorrectly recognizes the front direction of the user's face, the reset button allows the user to correct the front direction to the direction in which the user is actually facing.
  • as described above, the broadcast receiving device identifies the sound collection location of the program from the metadata and content of the program information, obtains the sound field information of that location from a server, generates reverberation sound, generates a synthesized sound by adding the reverberation sound to the original sound, and outputs the synthesized sound to the sound output device. Through this operation, the viewer can experience the same sense of presence as if he or she were present at the sound collection location. In addition, when the broadcast receiving device specifies a listening point in the sound collection location, it generates a synthesized sound corresponding to listening to the sound from the sound source at that listening point, so the sound heard at the specified listening point can be reproduced.
  • the broadcast receiving device moves the listening point at the sound collection location in accordance with the partial enlargement operation of the broadcast image and virtually changes the direction from which the sound comes, so the video and audio stay synchronized and the viewer can enjoy a more realistic experience.
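The reverberation addition summarized above can be sketched as convolving the original (dry) sound with an impulse response representing the venue's sound field and blending the result back in. The impulse-response array and the `wet_mix` blend parameter are illustrative assumptions standing in for the sound field information obtained from the server.

```python
import numpy as np

def add_reverb(dry: np.ndarray, impulse_response: np.ndarray,
               wet_mix: float = 0.4) -> np.ndarray:
    """Add venue reverberation to a dry program sound by convolution.

    The convolved (wet) signal is truncated to the dry length and mixed with
    the original to form the synthesized sound output to the sound device.
    """
    wet = np.convolve(dry, impulse_response)[: len(dry)]
    return (1.0 - wet_mix) * dry + wet_mix * wet
```

A one-sample-delay impulse response applied to a unit impulse produces the expected 60/40 blend of direct and delayed sound.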
  • Some or all of the functions of the present invention described above may be realized in hardware by, for example, designing an integrated circuit.
  • the functions may be realized by software by having a microprocessor unit or the like interpret and execute operating programs for realizing the respective functions.
  • Hardware and software may be used together.
  • the software for controlling the broadcast receiving apparatus 100 may be stored in advance in the ROM 103 and/or the storage unit 110 of the broadcast receiving apparatus 100 at the time of product shipment.
  • alternatively, the software may be acquired from a server device on the Internet 800 via the LAN communication unit 121 after product shipment. Further, software stored on a memory card, optical disc, or the like may be acquired via the expansion interface unit 124 or the like.
  • the software for controlling the mobile information terminal 700 may be stored in advance in the ROM 703 and/or the storage unit 710 of the mobile information terminal 700 at the time of product shipment.
  • alternatively, the software may be acquired from a server device on the Internet 800 via the LAN communication unit 721 or the mobile telephone network communication unit 722 after product shipment. Further, software stored on a memory card, optical disc, or the like may be acquired via the expansion interface section 724 or the like.
  • control lines and information lines shown in the figures are those considered necessary for the explanation, and do not necessarily show all control lines and information lines on the product. In reality, almost all components may be considered to be interconnected.
  • 100: Broadcast receiving device, 101: Main control unit, 102: System bus, 103: ROM, 104: RAM, 110: Storage unit, 121: LAN communication unit, 124: Expansion interface unit, 125: Digital interface unit, 130C, 130T, 130L, 130B: tuner/demodulation section, 140S, 140U: decoder section, 180: operation input section, 191: video selection section, 192: monitor section, 193: video output section, 194: audio selection section, 195: Speaker section, 196: Audio output section, 180R: Remote controller, 200, 200C, 200T, 200S, 200L, 200B: Antenna, 201T, 201L, 201B: Conversion section, 300, 300T, 300S, 300L: Radio tower, 400C: head end of cable television station, 400: broadcasting station server, 500: service provider server, 600: mobile telephone communication server, 600B: base station, 700: mobile information terminal, 800: Internet, 800R: router device, 900: Information server, 9

Abstract

Provided is technology for more suitably transmitting or receiving a high-level digital broadcast service. This broadcast reception device comprises: a broadcast reception unit that is capable of receiving signals for separate sound sources via broadcast waves and receives the broadcast waves; an audio output unit that is configured from a plurality of speakers and outputs audio in accordance with the signals for the separate sound sources transmitted from the broadcast reception device; and a control unit. The control unit determines the playback positions of audio resulting from the signals for the separate sound sources in accordance with arrangement information regarding the plurality of speakers, calculates signals corresponding to 22.2 ch audio channels on the basis of the signals for the separate sound sources, then converts the signals corresponding to 22.2 ch audio channels into signals for the audio output unit.

Description

Broadcast receiving device, content protection method, reverberation sound addition processing method, and control method for broadcast receiving device
The present invention relates to a broadcast receiving device, a content protection method, a reverberation sound addition processing method, and a control method for a broadcast receiving device.
Digital broadcasting services began in various countries in the late 1990s, replacing conventional analog broadcasting services. Digital broadcasting services have realized improved broadcast quality using error correction technology, multi-channel and HD (High Definition) broadcasting using compression coding technology, and multimedia services using BML (Broadcast Markup Language) and HTML5 (Hyper Text Markup Language version 5).
In recent years, advanced digital broadcasting systems have been under consideration in various countries with the aim of further improving frequency usage efficiency and increasing resolution and functionality.
Japanese Patent Application Publication No. 2016-144020
More than 10 years have already passed since the current digital broadcasting service started, and broadcast receiving devices capable of receiving the current digital broadcasting service have become sufficiently widespread. Therefore, when starting the advanced digital broadcasting service currently under consideration, it is necessary to consider compatibility with the current digital broadcasting service. That is, it is preferable to realize UHD (Ultra High Definition) video signals and the like while maintaining the viewing environment of the current digital broadcasting service.
The system described in Patent Document 1 is a technology for realizing UHD broadcasting in a digital broadcasting service. However, the system described in Patent Document 1 is intended to replace the current digital broadcasting and does not take into consideration maintaining the viewing environment of the current digital broadcasting service.
An object of the present invention is to provide a technology for more suitably transmitting or receiving a more highly functional advanced digital broadcasting service, taking into consideration compatibility with the current digital broadcasting service.
As means for solving the above problem, the technology described in the claims is used.
To give an example, a broadcast receiving device capable of receiving signals for individual sound sources via broadcast waves may be used, the device comprising: a broadcast receiving unit that receives the broadcast waves; an audio output unit that is composed of a plurality of speakers and outputs audio in accordance with the signals for the individual sound sources transmitted from the broadcast receiving device; and a control unit, wherein the control unit determines the playback positions of the audio based on the signals for the individual sound sources in accordance with arrangement information on the plurality of speakers, calculates signals corresponding to 22.2 ch audio channels on the basis of the signals for the individual sound sources, and then converts the signals corresponding to the 22.2 ch audio channels into signals for the audio output unit.
According to the present invention, it is possible to provide a technique for more suitably transmitting or receiving advanced digital broadcasting services.
A system configuration diagram of a broadcasting system according to an embodiment of the present invention.
A block diagram of a broadcast receiving device according to an embodiment of the present invention.
A detailed block diagram of the first tuner/demodulation unit of a broadcast receiving device according to an embodiment of the present invention.
A detailed block diagram of the second tuner/demodulation unit of a broadcast receiving device according to an embodiment of the present invention.
A detailed block diagram of the third tuner/demodulation unit of a broadcast receiving device according to an embodiment of the present invention.
A detailed block diagram of the fourth tuner/demodulation unit of a broadcast receiving device according to an embodiment of the present invention.
A detailed block diagram of the first decoder section of a broadcast receiving device according to an embodiment of the present invention.
A detailed block diagram of the second decoder section of a broadcast receiving device according to an embodiment of the present invention.
A software configuration diagram of a broadcast receiving device according to an embodiment of the present invention.
A configuration diagram of a broadcasting station server according to an embodiment of the present invention.
A configuration diagram of a service provider server according to an embodiment of the present invention.
A block diagram of a portable information terminal according to an embodiment of the present invention.
A software configuration diagram of a portable information terminal according to an embodiment of the present invention.
A diagram illustrating a segment configuration related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating layer allocation in layered transmission related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating the generation process of OFDM transmission waves related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating the basic configuration of a transmission line encoding unit related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating segment parameters of the OFDM system related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating transmission signal parameters related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating the arrangement of pilot signals in synchronous modulation segments related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating the arrangement of pilot signals in differential modulation segments related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating bit allocation of the TMCC carrier related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating bit allocation of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating transmission parameter information of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating system identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating the carrier modulation mapping method of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating frequency conversion processing identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating physical channel number identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating main signal identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating 4K signal transmission layer identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating additional layer transmission identification of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating identification of the coding rate of the inner code of TMCC information related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating bit allocation of the AC signal related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating configuration identification of the AC signal related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating seismic motion warning information of the AC signal related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating signal identification of the seismic motion warning information of the AC signal related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating detailed seismic motion warning information of the seismic motion warning information of the AC signal related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating detailed seismic motion warning information of the seismic motion warning information of the AC signal related to digital broadcasting according to an embodiment of the present invention.
A diagram illustrating additional information regarding transmission control of the modulated wave of the AC signal related to digital broadcasting according to an embodiment of the present invention.
本発明の一実施例のデジタル放送に係るAC信号の伝送パラメータ付加情報を説明する図である。FIG. 3 is a diagram illustrating additional transmission parameter information of an AC signal related to digital broadcasting according to an embodiment of the present invention. 本発明の一実施例のデジタル放送に係るAC信号の誤り訂正方式を説明する図である。FIG. 2 is a diagram illustrating an AC signal error correction method for digital broadcasting according to an embodiment of the present invention. 本発明の一実施例のデジタル放送に係るAC信号のコンスタレーション形式を説明する図である。FIG. 3 is a diagram illustrating a constellation format of an AC signal related to digital broadcasting according to an embodiment of the present invention. 本発明の一実施例に係る偏波両用伝送方式を説明する図である。FIG. 2 is a diagram illustrating a dual polarization transmission system according to an embodiment of the present invention. 本発明の一実施例に係る偏波両用伝送方式を用いた放送システムのシステム構成図である。1 is a system configuration diagram of a broadcasting system using a dual polarization transmission system according to an embodiment of the present invention. 本発明の一実施例に係る偏波両用伝送方式を用いた放送システムのシステム構成図である。1 is a system configuration diagram of a broadcasting system using a dual polarization transmission system according to an embodiment of the present invention. 本発明の一実施例に係る周波数変換処理を説明する図である。FIG. 3 is a diagram illustrating frequency conversion processing according to an embodiment of the present invention. 本発明の一実施例に係るパススルー伝送方式の構成を説明する図である。1 is a diagram illustrating the configuration of a pass-through transmission method according to an embodiment of the present invention. FIG. 本発明の一実施例に係るパススルー伝送帯域を説明する図である。FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention. 本発明の一実施例に係るパススルー伝送方式の構成を説明する図である。1 is a diagram illustrating the configuration of a pass-through transmission method according to an embodiment of the present invention. FIG. 本発明の一実施例に係るパススルー伝送帯域を説明する図である。FIG. 3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention. 本発明の一実施例に係るパススルー伝送帯域を説明する図である。FIG. 
3 is a diagram illustrating a pass-through transmission band according to an embodiment of the present invention. 本発明の一実施例に係る単偏波伝送方式を説明する図である。FIG. 2 is a diagram illustrating a single polarization transmission method according to an embodiment of the present invention. 本発明の一実施例に係る単偏波伝送方式を用いた放送システムのシステム構成図である。1 is a system configuration diagram of a broadcasting system using a single polarization transmission method according to an embodiment of the present invention. 本発明の一実施例に係る単偏波伝送方式を用いた放送システムのシステム構成図である。1 is a system configuration diagram of a broadcasting system using a single polarization transmission method according to an embodiment of the present invention. 本発明の一実施例に係る階層分割多重伝送方式を説明する図である。FIG. 2 is a diagram illustrating a layer division multiplex transmission system according to an embodiment of the present invention. 本発明の一実施例に係る階層分割多重伝送方式を用いた放送システムのシステム構成図である。1 is a system configuration diagram of a broadcasting system using a hierarchical division multiplex transmission system according to an embodiment of the present invention. 本発明の一実施例に係る周波数変換増幅処理を説明する図である。FIG. 3 is a diagram illustrating frequency conversion amplification processing according to an embodiment of the present invention. 本発明の一実施例に係る階層分割多重伝送方式を用いた放送システムのシステム構成図である。1 is a system configuration diagram of a broadcasting system using a hierarchical division multiplex transmission system according to an embodiment of the present invention. MPEG-2 TSのプロトコルスタックを説明する図である。FIG. 2 is a diagram illustrating a protocol stack of MPEG-2 TS. MPEG-2 TSで使用するテーブルの名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of tables used in MPEG-2 TS. MPEG-2 TSで使用するテーブルの名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of tables used in MPEG-2 TS. MPEG-2 TSで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS. MPEG-2 TSで使用する記述子の名称と機能を説明する図である。FIG. 
2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS. MPEG-2 TSで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS. MPEG-2 TSで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS. MPEG-2 TSで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS. MPEG-2 TSで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of descriptors used in MPEG-2 TS. MMTの放送伝送路におけるプロトコルスタックを説明する図である。FIG. 2 is a diagram illustrating a protocol stack in an MMT broadcast transmission path. MMTの通信回線におけるプロトコルスタックを説明する図である。FIG. 2 is a diagram illustrating a protocol stack in an MMT communication line. MMTのTLV-SIで使用するテーブルの名称と機能を説明する図である。FIG. 2 is a diagram explaining the names and functions of tables used in TLV-SI of MMT. MMTのTLV-SIで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT TLV-SI. MMTのMMT-SIで使用するメッセージの名称と機能を説明する図である。FIG. 2 is a diagram illustrating the names and functions of messages used in MMT-SI of MMT. MMTのMMT-SIで使用するテーブルの名称と機能を説明する図である。FIG. 2 is a diagram illustrating the names and functions of tables used in MMT-SI of MMT. MMTのMMT-SIで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT. MMTのMMT-SIで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT. MMTのMMT-SIで使用する記述子の名称と機能を説明する図である。FIG. 2 is a diagram illustrating the names and functions of descriptors used in MMT-SI of MMT. MMT方式のデータ伝送と各テーブルの関係を説明する図である。FIG. 3 is a diagram illustrating the relationship between MMT data transmission and each table. 本発明の一実施例に係る放送受信装置のチャンネル設定処理の動作シーケンス図である。FIG. 
3 is an operation sequence diagram of channel setting processing of the broadcast receiving device according to an embodiment of the present invention. ネットワーク情報テーブルのデータ構造を説明する図である。FIG. 3 is a diagram illustrating the data structure of a network information table. 地上分配システム記述子のデータ構造を説明する図である。FIG. 2 is a diagram illustrating the data structure of a ground distribution system descriptor. サービスリスト記述子のデータ構造を説明する図である。FIG. 3 is a diagram illustrating the data structure of a service list descriptor. TS情報記述子のデータ構造を説明する図である。FIG. 3 is a diagram illustrating the data structure of a TS information descriptor. 本発明の一実施例に係るリモートコントローラの外観図である。FIG. 1 is an external view of a remote controller according to an embodiment of the present invention. 本発明の一実施例に係るチャンネル選択時のバナー表示を説明する図である。FIG. 3 is a diagram illustrating a banner display when selecting a channel according to an embodiment of the present invention. スピーカ配置を説明する図である。It is a figure explaining speaker arrangement. スピーカ配置を説明する図である。It is a figure explaining speaker arrangement. スピーカ配置を説明する図である。It is a figure explaining speaker arrangement. ヘッドフォン使用時の位置関係を説明する図である。FIG. 3 is a diagram illustrating the positional relationship when headphones are used. ヘッドフォン使用時の位置関係を説明する図である。FIG. 3 is a diagram illustrating the positional relationship when headphones are used. チャンネルベース信号のみの音声信号に対する音声デコーダの構成例である。This is an example of the configuration of an audio decoder for an audio signal consisting of only channel-based signals. 高度な音声信号に対する音声デコーダの構成例である。This is an example of the configuration of an audio decoder for advanced audio signals. スピーカシステムの配置情報の例である。This is an example of placement information of a speaker system. 22.2chの信号から5.1chの信号へのダウンミックス係数のデフォルト値である。This is the default value of the downmix coefficient from the 22.2ch signal to the 5.1ch signal. 5.1chの信号から2chの信号へのダウンミックス係数のデフォルト値である。This is the default value of the downmix coefficient from a 5.1ch signal to a 2ch signal. 
音声再生に用いるスピーカシステムを選択する画面の例である。This is an example of a screen for selecting a speaker system used for audio reproduction. オブジェクトベース信号のメタデータを説明する図である。FIG. 3 is a diagram illustrating metadata of an object-based signal. オブジェクトベース信号の再生位置を指定するメタデータの例である。This is an example of metadata that specifies the playback position of an object-based signal. オブジェクトベース信号の再生位置を指定するメタデータの例である。This is an example of metadata that specifies the playback position of an object-based signal. オブジェクトベース信号の再生位置を指定するメタデータの例である。This is an example of metadata that specifies the playback position of an object-based signal. オブジェクトベース信号の再生位置を選択する画面の例である。This is an example of a screen for selecting a playback position of an object-based signal. オブジェクトベース信号の再生位置を設定する画面の例である。This is an example of a screen for setting the playback position of an object-based signal. オブジェクトベース信号の再生位置を指定するメタデータの例である。This is an example of metadata that specifies the playback position of an object-based signal. オブジェクトベース信号の再生位置を指定するストリームデータの例である。This is an example of stream data that specifies the playback position of an object-based signal. オブジェクトベース信号の再生位置を指定するストリームデータの例である。This is an example of stream data that specifies the playback position of an object-based signal. オブジェクトベース信号の再生位置を指定するストリームデータの例である。This is an example of stream data that specifies the playback position of an object-based signal. HOA方式信号の信号数を説明する図である。It is a figure explaining the number of signals of HOA system signals. 出力デバイス毎に音声信号を選択する選択画面の図である。FIG. 3 is a diagram of a selection screen for selecting an audio signal for each output device. 連携機器における音声再生について説明する図である。FIG. 2 is a diagram illustrating audio playback in a cooperating device. 伝送される音声信号数と取得先を記述するパラメータを説明する図である。FIG. 3 is a diagram illustrating parameters describing the number of audio signals to be transmitted and the acquisition destination. 音声コンポーネント記述子のデータ構造を説明する図である。FIG. 3 is a diagram illustrating the data structure of an audio component descriptor. 音声コンポーネント種別のデータを説明する図である。FIG. 
3 is a diagram illustrating audio component type data. 伝送される音声信号を電子番組表で表示する例である。This is an example in which transmitted audio signals are displayed in an electronic program guide. 伝送される音声信号を電子番組表で表示する例である。This is an example in which transmitted audio signals are displayed in an electronic program guide. 信号源と出力デバイスの表示例である。This is an example of displaying a signal source and an output device. 本実施例に係るコンテンツ保護処理の制御例の一例を説明する図である。FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment. 本実施例に係るコンテンツ保護処理の制御例の一例を説明する図である。FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment. 本実施例に係るコンテンツ保護処理の制御例の一例を説明する図である。FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment. 本実施例に係るコンテンツ保護処理の制御例の一例を説明する図である。FIG. 3 is a diagram illustrating an example of control of content protection processing according to the present embodiment. ヘッドフォン使用時の残響音処理フローの例を示す図である。FIG. 6 is a diagram illustrating an example of a reverberation sound processing flow when using headphones. 音声出力設定メニューの例を示す図である。FIG. 3 is a diagram showing an example of an audio output setting menu. 残響音設定の詳細メニューの例を示す図である。FIG. 7 is a diagram showing an example of a detailed menu for reverberation sound settings. 残響音処理状態を示すバナー表示の例を示す図である。FIG. 3 is a diagram illustrating an example of a banner display indicating a reverberation processing state. 放送画像の部分拡大表示におけるバナー表示の例を示す図である。FIG. 7 is a diagram illustrating an example of a banner display in partially enlarged display of a broadcast image. 放送画像の部分拡大表示前後の視聴条件を説明するための図である。FIG. 6 is a diagram for explaining viewing conditions before and after partially enlarged display of a broadcast image. 本実施例の音像定位モードの切替動作を示すフローチャートである。5 is a flowchart illustrating a sound image localization mode switching operation according to the present embodiment. 本実施例のリモートコントローラの外観の一例を示す図である。FIG. 
2 is a diagram showing an example of the appearance of the remote controller of the present embodiment. 本実施例の音像定位モードを設定するためのバナー表示の一例を示す図である。FIG. 7 is a diagram showing an example of a banner display for setting the sound image localization mode of the present embodiment. 本変形例の音像定位モードが非固定の場合のθの修正動作を示すフローチャートである。12 is a flowchart showing an operation for correcting θ when the sound image localization mode of the present modification is non-fixed.
Examples of embodiments of the present invention will be described below with reference to the drawings.
(Example 1)
[System configuration]
FIG. 1 is a system configuration diagram showing an example of the configuration of a broadcasting system.
The broadcasting system comprises, for example, a broadcast receiving device 100 and an antenna 200, a radio tower 300 and a broadcasting station server 400 of a broadcasting station, a service provider server 500, a mobile telephone communication server 600 and a base station 600B of a mobile telephone communication network, a portable information terminal 700, a broadband network 800 such as the Internet and a router device 800R, an information server 900, headphones 910, and an HMD (Head Mounted Display) 920. The broadcasting system may include only one of the headphones 910 and the HMD 920, or may include a speaker system (not shown) in place of the headphones 910 and the HMD 920. Various server devices and communication devices may further be connected to the Internet 800.
The broadcast receiving device 100 is a television receiver equipped with a reception function for advanced digital broadcasting services. The broadcast receiving device 100 may further include a reception function for existing digital broadcasting services. It can also support a broadcasting/communication cooperation system that links broadband-network-based functions with a digital broadcasting service (an existing or advanced digital broadcasting service), combining the digital broadcasting service with acquisition of additional content via the broadband network, arithmetic processing on a server device, presentation processing in cooperation with a portable terminal device, and the like. The broadcast receiving device 100 receives digital broadcast waves transmitted from the radio tower 300 via the antenna 200. The digital broadcast waves may be transmitted directly from the radio tower 300 to the antenna 200, or may be transmitted via a broadcasting satellite, a communication satellite, or the like (not shown). A broadcast signal retransmitted by a cable television station may also be received via a cable line or the like. Further, the broadcast receiving device 100 can be connected to the Internet 800 via the router device 800R, and can transmit and receive data through communication with each server device on the Internet 800.
The router device 800R is connected to the Internet 800 by wireless or wired communication, to the broadcast receiving device 100 by wired communication, and to the portable information terminal 700 by wireless communication. This allows each server device on the Internet 800, the broadcast receiving device 100, and the portable information terminal 700 to mutually transmit and receive data via the router device 800R. The router device 800R, the broadcast receiving device 100, and the portable information terminal 700 constitute a LAN (Local Area Network). Note that communication between the broadcast receiving device 100 and the portable information terminal 700 may be performed directly by a method such as BlueTooth (registered trademark) or NFC (Near Field Communication) without going through the router device 800R.
The radio tower 300 is broadcasting equipment of a broadcasting station, and transmits digital broadcast waves containing various control information related to the digital broadcasting service, content data of broadcast programs (video content, audio content, etc.), and the like. The broadcasting station also includes a broadcasting station server 400. The broadcasting station server 400 stores the content data of broadcast programs and metadata of each broadcast program, such as the program title, program ID, program summary, performers, and broadcast date and time. The broadcasting station server 400 provides the content data and metadata to service providers based on a contract. The content data and metadata are provided to the service providers through an API (Application Programming Interface) included in the broadcasting station server 400.
The service provider server 500 is a server device prepared by a service provider to provide services via the broadcasting/communication cooperation system. The service provider server 500 stores, manages, and distributes the content data and metadata provided by the broadcasting station server 400, as well as content data and applications (operation programs and/or various data, etc.) created for the broadcasting/communication cooperation system. It also has a function of searching for available applications and providing a list of them in response to inquiries from television receivers. Note that the storage, management, and distribution of the content data and metadata and those of the applications may be performed by different server devices. The broadcasting station and the service provider may be the same business operator or different business operators. A plurality of service provider servers 500 may be prepared for different services. The functions of the service provider server 500 may also be provided by the broadcasting station server 400.
The mobile telephone communication server 600 is connected to the Internet 800, and is connected to the portable information terminal 700 via the base station 600B. The mobile telephone communication server 600 manages telephone communication (calls) and data transmission/reception of the portable information terminal 700 via the mobile telephone communication network, and enables data transmission and reception through communication between the portable information terminal 700 and each server device on the Internet 800. Note that communication between the portable information terminal 700 and the broadcast receiving device 100 may be performed via the base station 600B, the mobile telephone communication server 600, the Internet 800, and the router device 800R.
The information server 900 is a server device that provides information such as the sound field environment of a concert hall or theater. The information server 900 provides information that supplements the sound field environment metadata when such metadata is missing from or insufficient in the broadcast content. For example, when the name of the theater where a performance takes place is indicated in the title or metadata of the broadcast content, information on the sound field environment of that theater can be obtained by searching the information stored in the information server using the theater name. The information server 900 may provide not only environmental information on the performance venue but also a head-related transfer function that takes sound reproduction through the headphones 910 into consideration. The sound field environment and head-related transfer functions may be provided including transfer functions that take hearing-impaired listeners into consideration. The information server 900 may have a function of acquiring a viewer's brain waves and arranging a sound field environment and head-related transfer function suited to that viewer.
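As a minimal illustration of the lookup described above, the receiver could resolve a venue name taken from the content metadata into stored sound-field parameters. This is only a sketch: `VENUE_DB`, `lookup_sound_field`, and the parameter names are hypothetical and are not defined by the embodiment.

```python
# Hypothetical sketch of the venue lookup performed against information server 900.
# VENUE_DB stands in for the server's database; all names here are illustrative.

VENUE_DB = {
    "Symphony Hall A": {"rt60_s": 2.1, "predelay_ms": 25.0},  # reverberation time / pre-delay
    "Small Theater B": {"rt60_s": 0.9, "predelay_ms": 12.0},
}

def lookup_sound_field(venue_name, db=VENUE_DB):
    """Return stored sound-field parameters for the venue named in the
    content metadata, or None when the server holds no entry for it."""
    return db.get(venue_name)
```

When the lookup returns None, the receiver would fall back to a default sound field instead of applying venue-specific reverberation.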
The HMD 920 receives video and audio data from the broadcast receiving device 100, displays the video to the viewer, and reproduces the audio.
[Hardware configuration of the broadcast receiving device]
FIG. 2A is a block diagram showing an example of the internal configuration of the broadcast receiving device 100.
The broadcast receiving device 100 comprises a main control unit 101, a system bus 102, a ROM 103, a RAM 104, a storage unit 110, a LAN communication unit 121, an expansion interface unit 124, a digital interface unit 125, a first tuner/demodulator 130C, a second tuner/demodulator 130T, a third tuner/demodulator 130L, a fourth tuner/demodulator 130B, a first decoder unit 140S, a second decoder unit 140U, an operation input unit 180, a video selection unit 191, a monitor unit 192, a video output unit 193, an audio selection unit 194, a speaker unit 195, and an audio output unit 196.
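The block structure above, in which the main control unit exchanges commands with each operation block over a shared system bus, can be modeled loosely as follows. The class and block names are illustrative only and do not correspond to any interface defined by the embodiment.

```python
# Illustrative model of blocks of broadcast receiving device 100 attached to
# system bus 102. Names are invented for the sketch.
from dataclasses import dataclass, field

@dataclass
class SystemBus:
    log: list = field(default_factory=list)
    def send(self, src, dst, command):
        # Records each command exchanged between blocks over the bus.
        self.log.append((src, dst, command))

@dataclass
class Block:
    name: str
    bus: SystemBus
    def command(self, dst, command):
        self.bus.send(self.name, dst, command)

bus = SystemBus()
main_control = Block("main_control_101", bus)   # main control unit 101
tuner_c = Block("tuner_demod_130C", bus)        # first tuner/demodulator 130C
main_control.command("tuner_demod_130C", "tune channel")
```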
The main control unit 101 is a microprocessor unit that controls the entire broadcast receiving device 100 according to a predetermined operation program. The system bus 102 is a communication path for transmitting and receiving various data, commands, and the like between the main control unit 101 and each operation block in the broadcast receiving device 100.
The ROM (Read Only Memory) 103 is a non-volatile memory that stores a basic operation program such as an operating system and other operation programs; for example, a rewritable ROM such as an EEPROM (Electrically Erasable Programmable ROM) or a flash ROM is used. The ROM 103 also stores operation setting values and the like necessary for the operation of the broadcast receiving device 100. The RAM (Random Access Memory) 104 serves as a work area when the basic operation program and the other operation programs are executed. The ROM 103 and the RAM 104 may be integrated with the main control unit 101. Further, the ROM 103 need not have the independent configuration shown in FIG. 2A, and may instead use a partial storage area within the storage unit 110.
The storage unit 110 stores the operation programs and operation setting values of the broadcast receiving device 100, personal information of the user of the broadcast receiving device 100, and the like. It can also store operation programs downloaded via the Internet 800 and various data created with those operation programs, as well as content such as moving images, still images, and audio acquired from broadcast waves or downloaded via the Internet 800. All or part of the functions of the ROM 103 may be replaced by a partial area of the storage unit 110. Further, the storage unit 110 needs to retain the stored information even when power is not supplied to the broadcast receiving device 100 from the outside. Therefore, devices such as semiconductor element memories, for example a flash ROM or an SSD (Solid State Drive), and magnetic disk drives, for example an HDD (Hard Disc Drive), are used.
Note that each of the operation programs stored in the ROM 103 and the storage unit 110 can be added to, updated, and functionally extended by download processing from each server device on the Internet 800 or from broadcast waves.
The LAN communication unit 121 is connected to the Internet 800 via the router device 800R, and transmits and receives data to and from each server device and other communication equipment on the Internet 800. It also acquires the content data (or a part thereof) of programs transmitted via the communication line. The connection to the router device 800R may be a wired connection or a wireless connection such as Wi-Fi (registered trademark). The LAN communication unit 121 includes an encoding circuit, a decoding circuit, and the like. The broadcast receiving device 100 may further include other communication units such as a BlueTooth (registered trademark) communication unit, an NFC communication unit, and an infrared communication unit.
The first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B each receive broadcast waves of the digital broadcasting service and perform channel selection processing by tuning to the channel of a predetermined service under the control of the main control unit 101. They further perform demodulation and waveform shaping of the modulated wave of the received signal, as well as reconstruction of the frame structure and layer structure, energy de-spreading, error correction decoding, and the like, to reproduce the packet stream. They also extract and decode the TMCC (Transmission Multiplexing Configuration Control) signal from the received signal.
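Of the receive-chain stages listed above, the energy de-spreading step can be sketched as an XOR of the received bit stream with a pseudo-random binary sequence (PRBS). ISDB-T-family transmission systems generate this sequence with the polynomial x^15 + x^14 + 1; the register initial value and framing details below are illustrative rather than the exact values mandated by any standard.

```python
# Sketch of energy de-spreading: XOR-ing the stream with the same PRBS used at
# the transmitter restores the original data. Initial state is illustrative.

def prbs_bits(n, state=0b100101010000000):
    """Generate n bits of the PRBS for polynomial x^15 + x^14 + 1 (Fibonacci LFSR)."""
    out = []
    for _ in range(n):
        bit = ((state >> 14) ^ (state >> 13)) & 1  # taps at x^15 and x^14
        out.append(bit)
        state = ((state << 1) | bit) & 0x7FFF      # keep the register at 15 bits
    return out

def despread(bits, state=0b100101010000000):
    """XOR a bit stream with the PRBS; applying it twice is the identity."""
    return [b ^ p for b, p in zip(bits, prbs_bits(len(bits), state))]

data = [1, 0, 1, 1, 0, 0, 1, 0]
assert despread(despread(data)) == data  # spreading then de-spreading restores the data
```

Because the operation is a self-inverse XOR, the same routine serves for energy spreading at the transmitter and de-spreading at the receiver, provided both sides reset the register at the same frame boundary.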
Note that the first tuner/demodulator 130C can take as input the digital broadcast waves of the current digital terrestrial broadcasting service received by the antenna 200C, the antenna for current digital terrestrial broadcast reception. It can also take as input either the horizontal (H) or the vertical (V) polarization signal of the dual-polarization digital terrestrial broadcasting described later and demodulate the segments of the layer that uses the same modulation scheme as the current digital terrestrial broadcasting service. Likewise, it can take as input a broadcast signal of the single-polarization digital terrestrial broadcasting described later, or of the layered-division-multiplexed digital terrestrial broadcasting described later, and demodulate the segments of the layer that uses the same modulation scheme as the current digital terrestrial broadcasting service.
The second tuner/demodulator 130T receives, via the converter 201T, the digital broadcast waves of the advanced digital terrestrial broadcasting service received by the antenna 200T, the antenna for dual-polarization digital terrestrial broadcast reception. The second tuner/demodulator 130T may also receive the digital broadcast waves of the advanced digital terrestrial broadcasting service picked up by a single-polarization digital terrestrial broadcast receiving antenna (not shown); in that case, the converter 201T may be omitted. Note that the antenna 200T, which receives the digital broadcast waves of dual-polarization digital terrestrial broadcasting, has an element that receives the horizontal polarization signal and an element that receives the vertical polarization signal. A single-polarization digital terrestrial broadcast receiving antenna (not shown) has only one of these two elements, and may be shared with the antenna 200C, the antenna for current digital terrestrial broadcast reception.
The third tuner/demodulator 130L receives, via the converter 201L, the digital broadcast waves of the advanced digital terrestrial broadcasting service received by the antenna 200L, the antenna for layered-division-multiplexed digital terrestrial broadcast reception.
The fourth tuner/demodulator 130B receives, via the converter 201B, the digital broadcast waves of the advanced BS (Broadcasting Satellite) digital broadcasting service and the advanced CS (Communication Satellite) digital broadcasting service received by the antenna 200B, a shared BS/CS receiving antenna.
Note that the expression "tuner/demodulator" means a component that has both a tuner function and a demodulation function.
The antenna 200C, the antenna 200T, the antenna 200L, the antenna 200B, the converter 201T, the converter 201L, and the converter 201B are not part of the broadcast receiving device 100; they belong to the facilities, such as the building, in which the broadcast receiving device 100 is installed.
The current digital terrestrial broadcasting mentioned above is the broadcast signal of a digital terrestrial broadcasting service that transmits video with a maximum resolution of 1920 horizontal x 1080 vertical pixels.
Dual-polarization digital terrestrial broadcasting (advanced digital terrestrial broadcasting adopting a dual-polarization transmission scheme) and single-polarization digital terrestrial broadcasting (advanced digital terrestrial broadcasting adopting a single-polarization transmission scheme), described in detail later, are broadcast signals of digital terrestrial broadcasting services that can transmit video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels. Dual-polarization digital terrestrial broadcasting is digital terrestrial broadcasting that uses both the horizontal (H) and the vertical (V) polarization, and in some of the divided segments of both polarizations it transmits a digital terrestrial broadcasting service capable of carrying video whose maximum resolution exceeds 1920 x 1080 pixels. Single-polarization digital terrestrial broadcasting is digital terrestrial broadcasting that uses either the horizontal (H) or the vertical (V) polarization, and in some of the divided segments it transmits a digital terrestrial broadcasting service capable of carrying video whose maximum resolution exceeds 1920 x 1080 pixels.
In the description of the embodiments of the present invention, the expression "multiple polarizations" used for dual-polarization digital terrestrial broadcasting means, unless otherwise noted, the two polarizations: horizontal (H) and vertical (V). The word "polarization" on its own likewise means a "polarization signal". In one or both of the polarizations, some of the divided segments can also carry, with the same modulation scheme, the current digital terrestrial broadcasting described above, whose video has a maximum resolution of 1920 x 1080 pixels. That is, dual-polarization digital terrestrial broadcasting can simultaneously transmit, in different segments of the two polarizations of the embodiments of the present invention, the current digital terrestrial broadcasting service, whose video has a maximum resolution of 1920 x 1080 pixels, and a digital terrestrial broadcasting service capable of carrying video whose maximum resolution exceeds 1920 x 1080 pixels. Single-polarization digital terrestrial broadcasting can likewise carry the current digital terrestrial broadcasting, with the same modulation scheme, in some of its divided segments; that is, it can simultaneously transmit, in different segments, the current digital terrestrial broadcasting service, whose video has a maximum resolution of 1920 x 1080 pixels, and a digital terrestrial broadcasting service capable of carrying video whose maximum resolution exceeds 1920 x 1080 pixels.
Layered-division-multiplexed digital terrestrial broadcasting (advanced digital terrestrial broadcasting adopting a layered-division-multiplexing transmission scheme), described in detail later, is likewise a broadcast signal of a digital terrestrial broadcasting service that can transmit video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels. Layered-division-multiplexed digital terrestrial broadcasting multiplexes multiple digital broadcast signals whose signal levels differ; "different signal levels" means that the signals are transmitted with different power. In the embodiments of the present invention, layered-division-multiplexed digital terrestrial broadcasting can hierarchically multiplex and transmit, in the frequency band of the same physical channel and as signals with different levels, the broadcast signal of the current digital terrestrial broadcasting service, whose video has a maximum resolution of 1920 x 1080 pixels, and the broadcast signal of a digital terrestrial broadcasting service capable of carrying video whose maximum resolution exceeds 1920 x 1080 pixels. That is, it can simultaneously transmit, in layers with different signal levels, the current digital terrestrial broadcasting service and a digital terrestrial broadcasting service capable of carrying video whose maximum resolution exceeds 1920 x 1080 pixels.
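The power-level relationship between the two layers can be illustrated with a short numerical sketch. This is a minimal illustration of the layered-multiplexing principle only, not the broadcast specification itself; the injection level, the BPSK symbol values, and the function name are assumptions introduced for the example.

```python
import math

def combine_ldm_layers(upper, lower, injection_level_db):
    """Combine upper- and lower-layer symbols into one layered signal.

    The lower layer is attenuated by the injection level (in dB)
    relative to the upper layer, then the two are added and the
    result is renormalized to unit average power.
    """
    g = 10.0 ** (-injection_level_db / 20.0)   # linear gain of the lower layer
    norm = math.sqrt(1.0 + g * g)              # keep average total power at 1
    return [(u + g * v) / norm for u, v in zip(upper, lower)]

# Example: two upper-layer BPSK symbols with the lower layer
# injected 6 dB below the upper layer.
combined = combine_ldm_layers([1.0, -1.0], [1.0, 1.0], 6.0)
```

Because the lower layer sits well below the upper layer in power, a receiver can treat it as noise while decoding the upper layer, which is why the two services can share one physical channel.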
The broadcast receiving device in each embodiment of the present invention need only be configured to suitably receive advanced digital broadcasting; it is not essential that it include all of the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B. For example, it is sufficient to include at least one of the second tuner/demodulator 130T and the third tuner/demodulator 130L. To realize more advanced functions, one or more of the four tuner/demodulators may be provided in addition to either the second tuner/demodulator 130T or the third tuner/demodulator 130L.
The antenna 200C, the antenna 200T, and the antenna 200L may be shared as appropriate. Likewise, two or more of the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L may be shared (or integrated) as appropriate.
The first decoder unit 140S and the second decoder unit 140U each take as input a packet stream output from the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, or the fourth tuner/demodulator 130B, or a packet stream acquired from a server device on the Internet 800 via the LAN communication unit 121. The input packet stream may be in a format such as MPEG (Moving Picture Experts Group)-2 TS (Transport Stream), MPEG-2 PS (Program Stream), TLV (Type Length Value), or MMT (MPEG Media Transport).
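As a concrete point of reference for one of these formats, an MPEG-2 TS packet is a fixed 188-byte unit that begins with the sync byte 0x47 and carries a 13-bit PID in its 4-byte header. The following sketch parses that header; the field names in the returned dictionary are our own shorthand, not identifiers taken from the specification text above.

```python
TS_PACKET_SIZE = 188
TS_SYNC_BYTE = 0x47

def parse_ts_header(packet: bytes) -> dict:
    """Parse the 4-byte header of a single MPEG-2 TS packet."""
    if len(packet) != TS_PACKET_SIZE or packet[0] != TS_SYNC_BYTE:
        raise ValueError("not a valid TS packet")
    return {
        "transport_error": bool(packet[1] & 0x80),       # set by demodulator on error
        "payload_unit_start": bool(packet[1] & 0x40),
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],    # 13-bit packet identifier
        "continuity_counter": packet[3] & 0x0F,
    }

# Example: a null packet (PID 0x1FFF) padded to 188 bytes.
null_packet = bytes([0x47, 0x1F, 0xFF, 0x10]) + bytes(184)
header = parse_ts_header(null_packet)
```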
The first decoder unit 140S and the second decoder unit 140U each perform conditional access (CA) processing; demultiplexing that separates and extracts video data, audio data, and various information data from the packet stream based on the various control information the stream contains; decoding of the video and audio data; acquisition of program information and generation of an EPG (Electronic Program Guide); playback of data broadcasting screens and multimedia data; and the like. They also superimpose the generated EPG and the played-back multimedia data on the decoded video and audio data.
The video selection unit 191 takes as input the video data output from the first decoder unit 140S and from the second decoder unit 140U and, under the control of the main control unit 101, performs selection and/or superimposition as appropriate. The video selection unit 191 also performs scaling, superimposition of OSD (On Screen Display) data, and the like, as appropriate. The monitor unit 192 is a display device such as a liquid crystal panel; it displays the video data selected and/or superimposed by the video selection unit 191 and presents it to the user of the broadcast receiving device 100. The video output unit 193 is a video output interface that outputs the video data selected and/or superimposed by the video selection unit 191 to the outside; an example is HDMI (High-Definition Multimedia Interface) (registered trademark).
The audio selection unit 194 takes as input the audio data output from the first decoder unit 140S and from the second decoder unit 140U and, under the control of the main control unit 101, performs selection and/or mixing as appropriate. The speaker unit 195 outputs the audio data selected and/or mixed by the audio selection unit 194 and presents it to the user of the broadcast receiving device 100. The audio output unit 196 is an audio output interface that outputs the audio data selected and/or mixed by the audio selection unit 194 to the outside; examples include an analog headphone jack, an optical digital interface, Bluetooth, and the ARC (Audio Return Channel) assigned to an HDMI input terminal.
The digital interface unit 125 is an interface that outputs or inputs a packet stream containing encoded digital video data and/or digital audio data. It can output, as is, the packet streams that the first decoder unit 140S and the second decoder unit 140U receive from the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, or the fourth tuner/demodulator 130B. A packet stream input from the outside via the digital interface unit 125 may be fed to the first decoder unit 140S or the second decoder unit 140U, or stored in the storage unit 110. Alternatively, the video and audio data separated and extracted by the first decoder unit 140S or the second decoder unit 140U may be output, and video and audio data input from the outside via the digital interface unit 125 may likewise be fed to the first decoder unit 140S or the second decoder unit 140U, or stored in the storage unit 110.
The expansion interface unit 124 is a group of interfaces for expanding the functions of the broadcast receiving device 100, and includes a video/audio interface, a USB (Universal Serial Bus) interface, a memory interface, and the like. The video/audio interface inputs video/audio signals from external video/audio output devices, outputs video/audio signals to external video/audio input devices, and so on; examples include pin jacks and D terminals for analog signals and HDMI for digital signals.
The USB interface connects to a PC or the like to exchange data. An HDD may be connected to it to record broadcast programs and other content data, and a keyboard or other USB devices may also be connected. The memory interface connects a memory card or other memory medium to exchange data.
The operation input unit 180 is an instruction input unit for inputting operation instructions to the broadcast receiving device 100, and consists of a remote-control receiver that receives commands transmitted from a remote controller (not shown) and operation keys made up of an array of button switches; either one alone may suffice. The operation input unit 180 can also be replaced by a touch panel laid over the monitor unit 192, or by a keyboard or the like connected to the expansion interface unit 124. The remote controller can be replaced by a portable information terminal 700 that has a remote-control command transmission function. Any of the "keys" on the remote controller described in the following embodiments may equally be called "buttons".
Note that when the broadcast receiving device 100 is a television receiver or the like, the video output unit 193 and the audio output unit 196 are not essential components. The broadcast receiving device 100 may also be an optical disc drive recorder such as a DVD (Digital Versatile Disc) recorder, a magnetic disk drive recorder such as an HDD recorder, an STB (Set Top Box), or the like, or a PC (Personal Computer), tablet terminal, or the like that has a digital broadcasting service reception function. When the broadcast receiving device 100 is a DVD recorder, an HDD recorder, an STB, or the like, the monitor unit 192 and the speaker unit 195 are not essential; by connecting an external monitor and external speakers to the video output unit 193 and the audio output unit 196, or to the digital interface unit 125, operation similar to that of a television receiver or the like becomes possible.
FIG. 2B is a block diagram showing an example of the detailed configuration of the first tuner/demodulator 130C.
The channel selection/detection unit 131C takes as input the current digital broadcast wave received by the antenna 200C and selects a channel based on a channel selection control signal. The TMCC decoding unit 132C extracts the TMCC signal from the output of the channel selection/detection unit 131C and obtains various TMCC information, which is used to control the subsequent processing stages. Details of the TMCC signal and TMCC information will be described later.
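How TMCC information steers the later stages can be sketched in miniature: the decoder turns a per-layer parameter word into the modulation scheme and code rate that the demodulator and error corrector must use. The bit layout below is a simplified stand-in invented for this sketch, not the actual TMCC format, which is defined by the broadcasting standard and described later.

```python
# Hypothetical 8-bit per-layer descriptor: 3 bits modulation, 3 bits
# code rate, 2 bits time-interleaving depth (layout invented for this sketch).
MODULATIONS = {0: "DQPSK", 1: "QPSK", 2: "16QAM", 3: "64QAM"}
CODE_RATES = {0: "1/2", 1: "2/3", 2: "3/4", 3: "5/6", 4: "7/8"}

def decode_layer_params(word: int) -> dict:
    """Decode one per-layer transmission-parameter word."""
    return {
        "modulation": MODULATIONS.get((word >> 5) & 0x7, "reserved"),
        "code_rate": CODE_RATES.get((word >> 2) & 0x7, "reserved"),
        "interleave": word & 0x3,
    }

# Example: 64QAM (code 3), rate 3/4 (code 2), interleaving depth 1.
params = decode_layer_params((3 << 5) | (2 << 2) | 1)
```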
The demodulation unit 133C takes as input a modulated wave modulated with a scheme such as QPSK (Quadrature Phase Shift Keying), DQPSK (Differential QPSK), 16QAM (Quadrature Amplitude Modulation), or 64QAM, and, based on the TMCC information and the like, performs demodulation processing including frequency deinterleaving, time deinterleaving, and carrier demapping. The demodulation unit 133C may additionally support modulation schemes other than those listed.
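The carrier-demapping step can be illustrated for QPSK, the simplest of the listed schemes: a hard decision recovers two bits per symbol from the signs of the in-phase and quadrature components. The Gray mapping convention chosen here is one common assignment, assumed for illustration only.

```python
def qpsk_hard_demap(symbols):
    """Hard-decision demapping of Gray-coded QPSK symbols.

    Each complex symbol yields two bits: one from the sign of the
    in-phase (I) component and one from the sign of the quadrature
    (Q) component.
    """
    bits = []
    for s in symbols:
        bits.append(0 if s.real >= 0 else 1)
        bits.append(0 if s.imag >= 0 else 1)
    return bits

# Example: the four ideal QPSK constellation points.
bits = qpsk_hard_demap([1 + 1j, -1 + 1j, 1 - 1j, -1 - 1j])
```

The higher-order schemes (16QAM, 64QAM) demap analogously, with more amplitude thresholds per axis.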
The stream playback unit 134C performs layer division processing, inner-code error correction such as Viterbi decoding, energy despreading, stream reconstruction, outer-code error correction such as RS (Reed-Solomon) decoding, and the like. Error correction schemes other than those listed may also be used. The packet stream reproduced and output by the stream playback unit 134C is, for example, an MPEG-2 TS; packet streams in other formats are also possible.
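Energy despreading reverses the pseudo-random scrambling applied at the transmitter to flatten the signal spectrum; XORing the stream with the same pseudo-random binary sequence (PRBS) a second time restores the original bytes. The sketch below uses the generator polynomial x^15 + x^14 + 1 commonly used for energy dispersal in digital broadcasting; the initial register value and byte framing are assumptions for the example, not taken from this document.

```python
def prbs_bytes(n, state=0b100101010000000):
    """Generate n pseudo-random bytes from a 15-bit LFSR with
    generator polynomial x^15 + x^14 + 1 (bit-serial)."""
    out = []
    for _ in range(n):
        byte = 0
        for _ in range(8):
            fb = ((state >> 14) ^ (state >> 13)) & 1  # taps for x^15 and x^14
            state = ((state << 1) | fb) & 0x7FFF
            byte = (byte << 1) | fb
        out.append(byte)
    return bytes(out)

def despread(data: bytes) -> bytes:
    """XOR data with the PRBS; applying this twice is the identity,
    so the same routine models both spreading and despreading."""
    return bytes(d ^ p for d, p in zip(data, prbs_bytes(len(data))))

payload = b"broadcast payload"
restored = despread(despread(payload))  # spread at TX, despread at RX
```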
FIG. 2C is a block diagram showing an example of the detailed configuration of the second tuner/demodulator 130T.
The channel selection/detection unit 131H takes as input the horizontal (H) polarization signal of the digital broadcast wave received by the antenna 200T and selects a channel based on a channel selection control signal; the channel selection/detection unit 131V does the same for the vertical (V) polarization signal. The channel selection operations of the two units may be controlled in conjunction with each other or independently. That is, the channel selection/detection units 131H and 131V can be regarded as a single channel selection/detection unit and controlled to select one channel of a digital broadcasting service transmitted using both horizontal and vertical polarization, or they can be regarded as two independent channel selection/detection units and controlled to select two different channels of digital broadcasting services transmitted using only horizontal polarization (or only vertical polarization), respectively.
Note that the horizontal (H) and vertical (V) polarization signals received by the second tuner/demodulator 130T of the broadcast receiving device in each embodiment of the present invention need only be polarization signals of broadcast waves whose polarization directions differ by approximately 90 degrees; the configurations described below for the horizontal (H) polarization signal, the vertical (V) polarization signal, and their reception may be reversed.
The TMCC decoding unit 132H extracts the TMCC signal from the output of the channel selection/detection unit 131H and obtains various TMCC information; the TMCC decoding unit 132V does the same from the output of the channel selection/detection unit 131V. Only one of the TMCC decoding units 132H and 132V may be provided. The acquired TMCC information is used to control the subsequent processing stages.
The demodulation units 133H and 133V each take as input a modulated wave modulated with a scheme such as BPSK (Binary Phase Shift Keying), DBPSK (Differential BPSK), QPSK, DQPSK, 8PSK (Phase Shift Keying), 16APSK (Amplitude and Phase Shift Keying), 32APSK, 16QAM, 64QAM, 256QAM, or 1024QAM, and, based on the TMCC information and the like, perform demodulation processing including frequency deinterleaving, time deinterleaving, and carrier demapping. The demodulation units 133H and 133V may additionally support modulation schemes other than those listed.
The stream playback units 134H and 134V each perform layer division processing, inner-code error correction such as Viterbi decoding or LDPC (Low Density Parity Check) decoding, energy despreading, stream reconstruction, outer-code error correction such as RS decoding or BCH decoding, and the like. Error correction schemes other than those listed may also be used. The packet stream reproduced and output by the stream playback unit 134H is, for example, an MPEG-2 TS, and that of the stream playback unit 134V is, for example, an MPEG-2 TS or a TLV stream containing an MMT packet stream; packet streams in other formats are also possible in either case.
Note that when the second tuner/demodulator 130T takes as input the digital broadcast wave of single-polarization digital terrestrial broadcasting, the channel selection/detection unit 131V, the TMCC decoding unit 132V, and the demodulation unit 133V may be omitted. When the current digital terrestrial broadcasting service and the advanced digital terrestrial broadcasting service are transmitted simultaneously in different segments, among the signals output from the demodulation unit 133H, the signals of the segments carrying the current digital terrestrial broadcasting service are fed to the stream playback unit 134H, and the signals of the segments carrying the advanced digital terrestrial broadcasting service are fed to the stream playback unit 134V.
FIG. 2D is a block diagram showing an example of the detailed configuration of the third tuner/demodulator 130L.
The channel selection/detection unit 131L receives, from the antenna 200L, digital broadcast waves that have undergone Layered Division Multiplexing (LDM), and selects a channel based on a channel selection control signal. A digital broadcast wave subjected to layered division multiplexing may be used to transmit different digital broadcasting services (or different channels of the same broadcasting service) on the upper-layer (UL) and lower-layer (LL) modulated waves. The upper-layer modulated wave is output to the demodulator 133S, and the lower-layer modulated wave to the demodulator 133L.
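The upper/lower-layer structure can be illustrated numerically: in layered division multiplexing the lower layer is superimposed at reduced power, and a receiver that has decoded the upper layer can cancel it from the received signal to recover the lower layer. A minimal sketch follows; the injection level is a hypothetical value, not one taken from this description, and real receivers cancel re-modulated upper-layer symbols rather than ideal decisions.

```python
import math

def ldm_combine(ul, ll, injection_db=5.0):
    # Transmit side: the lower layer is injected `injection_db` dB
    # below the upper layer (value is illustrative).
    g = 10 ** (-injection_db / 20.0)
    return [u + g * v for u, v in zip(ul, ll)]

def ldm_recover_lower(rx, ul_decisions, injection_db=5.0):
    # Receive side: subtract the decoded upper-layer symbols and
    # rescale to recover the lower-layer signal.
    g = 10 ** (-injection_db / 20.0)
    return [(r - u) / g for r, u in zip(rx, ul_decisions)]
```

In a noiseless sketch like this, cancellation recovers the lower layer exactly; in practice the achievable quality of each layer depends on the injection level and channel conditions.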
The TMCC decoder 132L receives the upper-layer and lower-layer modulated waves output from the channel selection/detection unit 131L, extracts the TMCC signal, and obtains various TMCC information. The signal input to the TMCC decoder 132L may be only the upper-layer modulated wave, or only the lower-layer modulated wave.
The demodulators 133S and 133L operate in the same manner as the demodulators 133H and 133V, so a detailed description is omitted. Likewise, the stream playback units 134S and 134L operate in the same manner as the stream playback units 134H and 134V, respectively, so a detailed description is omitted.
FIG. 2E is a block diagram showing an example of the detailed configuration of the fourth tuner/demodulator 130B.
The channel selection/detection unit 131B receives the digital broadcast waves of the advanced BS digital broadcasting service and the advanced CS digital broadcasting service received by the antenna 200B, and selects a channel based on a channel selection control signal. Its other operations are the same as those of the channel selection/detection units 131H and 131V, so a detailed description is omitted. The TMCC decoder 132B, the demodulator 133B, and the stream playback unit 134B likewise operate in the same manner as the TMCC decoders 132H and 132V, the demodulators 133H and 133V, and the stream playback unit 134V, respectively, so detailed descriptions are omitted.
FIG. 2F is a block diagram showing an example of the detailed configuration of the first decoder unit 140S.
Under the control of the main control unit 101, the selection unit 141S selects and outputs one of the packet streams input from the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L. The packet streams input from these tuner/demodulators are, for example, MPEG-2 TS. The CA descrambler 142S descrambles the predetermined scrambling cipher based on various conditional-access control information superimposed on the packet stream.
The demultiplexer 143S is a stream decoder that separates and extracts video data, audio data, superimposed-character data, subtitle data, program information data, and the like, based on various control information contained in the input packet stream. The extracted video data is distributed to the video decoder 145S, the extracted audio data to the audio decoder 146S, and the extracted superimposed-character data, subtitle data, program information data, and the like to the data decoder 144S. A packet stream (for example, MPEG-2 PS) acquired from a server device on the Internet 800 via the LAN communication unit 121 may also be input to the demultiplexer 143S. The demultiplexer 143S can also output the packet streams input from the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L to external devices via the digital interface unit 125, and can input packet streams acquired from external devices via the digital interface unit 125.
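The PID-based separation that a demultiplexer such as 143S performs on MPEG-2 TS input can be sketched directly from the transport-packet layout defined in ISO/IEC 13818-1 (188-byte packets, sync byte 0x47, 13-bit PID in bytes 1-2). The PID-to-stream mapping below is hypothetical, and adaptation fields are ignored for brevity.

```python
def ts_demux(stream: bytes, pid_map: dict) -> dict:
    """Split an MPEG-2 TS byte stream into named outputs by PID.

    pid_map maps a PID value to an output name, e.g. {0x100: "video"}
    (the PID assignments are illustrative, not from this description).
    """
    out = {name: bytearray() for name in pid_map.values()}
    for i in range(0, len(stream) - 187, 188):
        pkt = stream[i:i + 188]
        if pkt[0] != 0x47:
            continue  # lost sync; a real demultiplexer would resynchronize
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        name = pid_map.get(pid)
        if name is not None:
            # Take everything after the 4-byte header as payload
            # (a full parser would honor the adaptation_field_control bits).
            out[name].extend(pkt[4:])
    return out
```

A real stream decoder would additionally parse PAT/PMT tables to discover which PIDs carry which elementary streams, rather than being handed a fixed map.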
The video decoder 145S decodes the compression-encoded video information in the video data input from the demultiplexer 143S, and performs colorimetry conversion, dynamic-range conversion, and similar processing on the decoded video information. It also performs processing such as resolution conversion (up/down conversion) under the control of the main control unit 101, outputting video data at resolutions such as UHD (3840 horizontal × 2160 vertical pixels), HD (1920 × 1080 pixels), or SD (720 × 480 pixels) as appropriate; video data may also be output at other resolutions. The audio decoder 146S decodes the compression-encoded audio information. It also performs downmix processing and the like under the control of the main control unit 101, outputting audio data with channel configurations such as 22.2ch, 7.1ch, 5.1ch, or 2ch. A plurality of video decoders 145S and audio decoders 146S may be provided so that multiple video and audio data streams can be decoded simultaneously.
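The downmix step can be illustrated with the conventional ITU-R BS.775-style coefficients for a 5.1-to-stereo fold-down. The exact coefficients a given receiver uses are not specified in this description; the values below are a common choice, and the LFE channel is simply dropped, as is typical in consumer fold-down.

```python
import math

K = 1 / math.sqrt(2)  # ≈ 0.707, the conventional center/surround gain

def downmix_5_1_to_stereo(l, r, c, lfe, ls, rs):
    # One stereo sample pair per 5.1 input sample set; the LFE
    # channel is omitted (a common, not universal, convention).
    left = l + K * c + K * ls
    right = r + K * c + K * rs
    return left, right
```

A 7.1 or 22.2 downmix follows the same pattern with more surround terms; in practice the output is also scaled or limited to avoid clipping after summation.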
The data decoder 144S performs processing such as generating an EPG based on program information data, generating data-broadcast screens based on BML data, and controlling cooperative applications based on the broadcast-communication cooperation function. The data decoder 144S has a BML browser function that executes BML documents, and the data-broadcast screen generation is performed by this BML browser function. The data decoder 144S also performs processing such as decoding superimposed-character data to generate superimposed-character information and decoding subtitle data to generate subtitle information.
The superimposing units 147S, 148S, and 149S each superimpose the EPG, data-broadcast screens, and other output of the data decoder 144S onto the video data output from the video decoder 145S. The synthesis unit 151S mixes the audio data output from the audio decoder 146S with the audio data reproduced by the data decoder 144S. The selection unit 150S selects the resolution of the video data under the control of the main control unit 101. The functions of the superimposing units 147S, 148S, and 149S and the selection unit 150S may be integrated into the video selection unit 191, and the function of the synthesis unit 151S may be integrated into the audio selection unit 194.
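At its core, superimposing an EPG or data-broadcast screen over video is per-pixel alpha blending. The following sketch uses the standard blend equation; this is an illustration of the concept, not a detail taken from this description.

```python
def superimpose_pixel(video_px, osd_px, alpha):
    # video_px / osd_px are (R, G, B) tuples; alpha=1.0 shows only
    # the on-screen graphics, alpha=0.0 only the underlying video.
    return tuple(round(alpha * o + (1.0 - alpha) * v)
                 for v, o in zip(video_px, osd_px))
```

A hardware superimposing unit applies the same operation across the whole frame, typically with a per-pixel alpha plane so that transparent regions of the graphics layer pass the video through untouched.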
FIG. 2G is a block diagram showing an example of the detailed configuration of the second decoder unit 140U.
Under the control of the main control unit 101, the selection unit 141U selects and outputs one of the packet streams input from the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B. The packet streams input from these tuner/demodulators are, for example, MMT packet streams or TLV streams containing MMT packet streams; they may also be MPEG-2 TS packet streams employing HEVC (High Efficiency Video Coding) or the like as the video compression method. The CA descrambler 142U descrambles the predetermined scrambling cipher based on various conditional-access control information superimposed on the packet stream.
The demultiplexer 143U is a stream decoder that separates and extracts video data, audio data, superimposed-character data, subtitle data, program information data, and the like, based on various control information contained in the input packet stream. The extracted video data is distributed to the video decoder 145U, the extracted audio data to the audio decoder 146U, and the extracted superimposed-character data, subtitle data, program information data, and the like to the multimedia decoder 144U. A packet stream (for example, an MPEG-2 PS or MMT packet stream) acquired from a server device on the Internet 800 via the LAN communication unit 121 may also be input to the demultiplexer 143U. The demultiplexer 143U can also output the packet streams input from the second tuner/demodulator 130T, the third tuner/demodulator 130L, and the fourth tuner/demodulator 130B to external devices via the digital interface unit 125, and can input packet streams acquired from external devices via the digital interface unit 125.
The multimedia decoder 144U performs processing such as generating an EPG based on program information data, generating multimedia screens based on multimedia data, and controlling cooperative applications based on the broadcast-communication cooperation function. The multimedia decoder 144U has an HTML browser function that executes HTML documents, and the multimedia screen generation is performed by this HTML browser function.
The video decoder 145U, the audio decoder 146U, the superimposing units 147U, 148U, and 149U, the synthesis unit 151U, and the selection unit 150U have the same functions as the video decoder 145S, the audio decoder 146S, the superimposing units 147S, 148S, and 149S, the synthesis unit 151S, and the selection unit 150S, respectively. Replacing the suffix S with U in the descriptions of those components for FIG. 2F yields the descriptions of the corresponding components in FIG. 2G, so separate detailed descriptions are omitted.
[Software configuration of the broadcast receiving device]
FIG. 2H is a software configuration diagram of the broadcast receiving device 100, showing an example of the software configuration in the storage unit 110 (or the ROM 103; the same applies below) and the RAM 104. The storage unit 110 stores a basic operation program 1001, a reception function program 1002, a browser program 1003, a content management program 1004, and other operation programs 1009. The storage unit 110 also includes a content storage area 1011 that stores content data such as videos, still images, and audio; an authentication information storage area 1012 that stores authentication information used for communication and cooperation with external mobile terminal devices, server devices, and the like; and a various-information storage area 1019 that stores other kinds of information.
The basic operation program 1001 stored in the storage unit 110 is expanded into the RAM 104, and the main control unit 101 executes the expanded basic operation program, thereby forming the basic operation control unit 1101. Likewise, the reception function program 1002, browser program 1003, and content management program 1004 stored in the storage unit 110 are each expanded into the RAM 104, and the main control unit 101 executes the expanded operation programs, thereby forming the reception function control unit 1102, the browser engine 1103, and the content management unit 1104. The RAM 104 also includes a temporary storage area 1200 that temporarily holds, as needed, data created during the execution of each operation program.
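The expand-and-execute pattern described here (a program image held in storage is expanded into RAM, and executing it constitutes a named control unit) can be sketched as follows. All names and the table layout are purely illustrative, not taken from this description.

```python
# Hypothetical sketch: program images held in "storage" are deployed
# into a runtime table ("RAM") and invoked via control-unit objects.

class ControlUnit:
    def __init__(self, name, entry_point):
        self.name = name
        self.entry_point = entry_point

    def run(self):
        return self.entry_point()

STORAGE = {  # stands in for programs 1001-1004 held in storage/ROM
    "basic_operation": lambda: "controlling operation blocks",
    "reception_function": lambda: "controlling tuners and decoders",
}

RAM = {}  # deployed (expanded) programs

def expand_and_execute(name):
    unit = ControlUnit(name, STORAGE[name])
    RAM[name] = unit  # the "expansion" into RAM
    return unit.run()
```

The sketch mirrors the text's convention: once expanded and running, each program is referred to by the control unit it constitutes rather than by the stored image.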
In the following, for simplicity, the process in which the main control unit 101 controls each operation block by expanding the basic operation program 1001 stored in the storage unit 110 into the RAM 104 and executing it is described as the basic operation control unit 1101 controlling each operation block. Similar descriptions are used for the other operation programs.
The reception function control unit 1102 performs basic control of the broadcast reception function, the broadcast-communication cooperation function, and so on, of the broadcast receiving device 100. In particular, the channel selection/demodulation unit 1102a mainly controls channel selection, TMCC information acquisition, demodulation, and similar processing in the first tuner/demodulator 130C, the second tuner/demodulator 130T, the third tuner/demodulator 130L, the fourth tuner/demodulator 130B, and so on. The stream playback control unit 1102b mainly controls layer division, error correction decoding, energy despreading, stream playback, and similar processing in those tuner/demodulators. The AV decoder unit 1102c mainly controls demultiplexing (stream decoding), video data decoding, audio data decoding, and similar processing in the first decoder unit 140S, the second decoder unit 140U, and so on. The multimedia (MM) data playback unit 1102d mainly controls BML data playback, superimposed-character data decoding, subtitle data decoding, and cooperative-application control in the first decoder unit 140S, as well as HTML data playback, multimedia screen generation, and cooperative-application control in the second decoder unit 140U. The EPG generation unit 1102e mainly controls EPG generation and the display of the generated EPG in the first decoder unit 140S and the second decoder unit 140U. The presentation processing unit 1102f controls colorimetry conversion, dynamic-range conversion, resolution conversion, audio downmixing, and similar processing in the first decoder unit 140S and the second decoder unit 140U, and also controls the video selection unit 191, the audio selection unit 194, and so on.
The BML browser 1103a and the HTML browser 1103b of the browser engine 1103 interpret BML documents and HTML documents during the BML data playback and HTML data playback described above, and perform the data-broadcast screen generation and multimedia screen generation.
The content management unit 1104 performs time-schedule management and execution control for recording and viewing reservations of broadcast programs; copyright management when broadcast programs, recorded programs, and the like are output via the digital interface unit 125, the LAN communication unit 121, and so on; and expiration management of cooperative applications acquired through the broadcast-communication cooperation function.
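Validity-period management of a cooperative application might look like the following check. The field name and the ISO-8601 date format are assumptions made for illustration; the actual metadata format of the receiver is not specified in this description.

```python
from datetime import datetime, timezone

def app_is_valid(app: dict, now: datetime = None) -> bool:
    # `app["expires"]` is assumed to hold an ISO-8601 timestamp
    # (hypothetical field; real receivers carry expiry in
    # application-control metadata whose format is not given here).
    now = now or datetime.now(timezone.utc)
    return now <= datetime.fromisoformat(app["expires"])
```

A content manager would run such a check before launching a cached cooperative application, and re-fetch or discard the application once the check fails.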
Each of the above operation programs may be stored in advance in the storage unit 110 and/or the ROM 103 at the time of product shipment. They may instead be acquired after shipment from a server device on the Internet 800 via the LAN communication unit 121 or the like. The operation programs stored on a memory card, an optical disc, or the like may also be acquired via the expansion interface unit 124 or the like, and they may be newly acquired or updated via broadcast waves.
[Configuration of the broadcast station server]
FIG. 3A shows an example of the internal configuration of the broadcast station server 400. The broadcast station server 400 includes, as its components, a main control unit 401, a system bus 402, a RAM 404, a storage unit 410, a LAN communication unit 421, and a digital broadcast signal transmission unit 460.
The main control unit 401 is a microprocessor unit that controls the entire broadcast station server 400 according to predetermined operation programs. The system bus 402 is a communication path for exchanging various data, commands, and the like between the main control unit 401 and each operation block in the broadcast station server 400. The RAM 404 serves as a work area when each operation program is executed.
The storage unit 410 stores a basic operation program 4001, a content management/distribution program 4002, and a content transmission program 4003, and further includes a content data storage area 4011 and a metadata storage area 4012. The content data storage area 4011 stores the content data and the like of each broadcast program broadcast by the broadcast station. The metadata storage area 4012 stores metadata of each broadcast program, such as the program title, program ID, program summary, performers, and broadcast date and time.
The basic operation program 4001, the content management/distribution program 4002, and the content transmission program 4003 stored in the storage unit 410 are each expanded into the RAM 404, and the main control unit 401 executes the expanded programs, thereby forming the basic operation control unit 4101, the content management/distribution control unit 4102, and the content transmission control unit 4103.
In the following, for simplicity, the process in which the main control unit 401 controls each operation block by expanding the basic operation program 4001 stored in the storage unit 410 into the RAM 404 and executing it is described as the basic operation control unit 4101 controlling each operation block. Similar descriptions are used for the other operation programs.
The content management/distribution control unit 4102 manages the content data, metadata, and the like stored in the content data storage area 4011 and the metadata storage area 4012, and controls the provision of that content data, metadata, and the like to service providers based on contracts. Furthermore, when providing content data, metadata, and the like to a service provider, the content management/distribution control unit 4102 also performs authentication of the service provider server 500 and the like, as necessary.
The content transmission control unit 4103 performs time-schedule management and the like when a stream containing the broadcast-program content data stored in the content data storage area 4011 and the program titles, program IDs, program-content copy control information, and the like stored in the metadata storage area 4012 is sent out via the digital broadcast signal transmission unit 460.
The LAN communication unit 421 is connected to the Internet 800 and communicates with the service provider server 500 and other communication devices on the Internet 800. The LAN communication unit 421 includes an encoding circuit, a decoding circuit, and the like. The digital broadcast signal transmission unit 460 applies modulation and other processing to the stream composed of the content data, program information data, and the like of each broadcast program stored in the content data storage area 4011, and sends it out as a digital broadcast wave via the radio tower 300.
[Configuration of the service provider server]
FIG. 3B shows an example of the internal configuration of the service provider server 500. The service provider server 500 includes a main control unit 501, a system bus 502, a RAM 504, a storage unit 510, and a LAN communication unit 521.
The main control unit 501 is a microprocessor unit that controls the entire service provider server 500 according to predetermined operation programs. The system bus 502 is a communication path for exchanging various data, commands, and the like between the main control unit 501 and each operation block in the service provider server 500. The RAM 504 serves as a work area when each operation program is executed.
The storage unit 510 stores a basic operation program 5001, a content management/distribution program 5002, and an application management/distribution program 5003, and further includes a content data storage area 5011, a metadata storage area 5012, and an application storage area 5013. The content data storage area 5011 and the metadata storage area 5012 store content data and metadata provided by the broadcast station server 400, or content produced by the service provider and metadata related to that content, and the like. The application storage area 5013 stores the applications (operation programs and/or various data, etc.) necessary for realizing each service of the broadcast-communication cooperation system, to be distributed in response to requests from each television receiver.
The basic operation program 5001, the content management/distribution program 5002, and the application management/distribution program 5003 stored in the storage unit 510 are each expanded into the RAM 504, and the main control unit 501 executes the expanded programs, thereby forming the basic operation control unit 5101, the content management/distribution control unit 5102, and the application management/distribution control unit 5103.
In the following, for simplicity, the process in which the main control unit 501 controls each operation block by expanding the basic operation program 5001 stored in the storage unit 510 into the RAM 504 and executing it is described as the basic operation control unit 5101 controlling each operation block. Similar descriptions are used for the other operation programs.
The content management/distribution control unit 5102 acquires content data, metadata, and the like from the broadcast station server 400, manages the content data, metadata, and the like stored in the content data storage area 5011 and the metadata storage area 5012, and controls the distribution of that content data, metadata, and the like to each television receiver. The application management/distribution control unit 5103 manages each application stored in the application storage area 5013, and controls the distribution of each application in response to requests from each television receiver. Furthermore, when distributing each application to a television receiver, the application management/distribution control unit 5103 also performs authentication of the television receiver and the like, as necessary.
The LAN communication unit 521 is connected to the Internet 800 and communicates with the broadcast station server 400 and other communication devices on the Internet 800. It also communicates with the broadcast receiving device 100 and the mobile information terminal 700 via the router device 800R. The LAN communication unit 521 includes an encoding circuit, a decoding circuit, and the like.
 [携帯情報端末のハードウェア構成]
 図3Cは、携帯情報端末700の内部構成の一例を示すブロック図である。
 携帯情報端末700は、主制御部701、システムバス702、ROM703、RAM704、ストレージ部710、通信処理部720、拡張インタフェース部724、操作部730、画像処理部740、音声処理部750、センサ部760、で構成される。
[Hardware configuration of mobile information terminal]
FIG. 3C is a block diagram showing an example of the internal configuration of mobile information terminal 700.
The mobile information terminal 700 includes a main control unit 701, a system bus 702, a ROM 703, a RAM 704, a storage unit 710, a communication processing unit 720, an expansion interface unit 724, an operation unit 730, an image processing unit 740, an audio processing unit 750, and a sensor unit 760.
 主制御部701は、所定の動作プログラムに従って携帯情報端末700全体を制御するマイクロプロセッサユニットである。システムバス702は、主制御部701と携帯情報端末700内の各動作ブロックとの間で各種データやコマンド等の送受信を行うための通信路である。 The main control unit 701 is a microprocessor unit that controls the entire mobile information terminal 700 according to a predetermined operating program. The system bus 702 is a communication path for transmitting and receiving various data, commands, etc. between the main control unit 701 and each operational block within the mobile information terminal 700.
 ROM703は、オペレーティングシステムなどの基本動作プログラムやその他の動作プログラムが格納された不揮発性メモリであり、例えばEEPROMやフラッシュROMのような書き換え可能なROMが用いられる。また、ROM703には、携帯情報端末700の動作に必要な動作設定値等が記憶される。RAM704は、基本動作プログラムやその他の動作プログラム実行時のワークエリアとなる。ROM703及びRAM704は、主制御部701と一体構成であっても良い。また、ROM703は、図3Cに示したような独立構成とはせず、ストレージ部710内の一部記憶領域を使用するようにしても良い。 The ROM 703 is a nonvolatile memory in which basic operating programs such as an operating system and other operating programs are stored, and for example, a rewritable ROM such as an EEPROM or a flash ROM is used. Further, the ROM 703 stores operation setting values and the like necessary for the operation of the mobile information terminal 700. The RAM 704 serves as a work area when executing the basic operation program and other operation programs. The ROM 703 and the RAM 704 may be integrated with the main control unit 701. Further, the ROM 703 may not have an independent configuration as shown in FIG. 3C, but may use a part of the storage area within the storage unit 710.
 ストレージ部710は、携帯情報端末700の動作プログラムや動作設定値、携帯情報端末700のユーザの個人情報等を記憶する。また、インターネット800を介してダウンロードした動作プログラムや前記動作プログラムで作成した各種データ等を記憶可能である。また、インターネット800を介してダウンロードした、動画、静止画、音声等のコンテンツも記憶可能である。ストレージ部710の一部領域を以ってROM703の機能の全部又は一部を代替しても良い。また、ストレージ部710は、携帯情報端末700に外部から電源が供給されていない状態であっても記憶している情報を保持する必要がある。したがって、例えば、フラッシュROMやSSD等の半導体素子メモリ、HDD等の磁気ディスクドライブ、等のデバイスが用いられる。 The storage unit 710 stores the operating program and operation setting values of the portable information terminal 700, the personal information of the user of the portable information terminal 700, and the like. Further, it is possible to store operating programs downloaded via the Internet 800 and various data created using the operating programs. Further, content downloaded via the Internet 800, such as moving images, still images, and audio, can also be stored. All or part of the functions of the ROM 703 may be replaced by a partial area of the storage section 710. Furthermore, the storage unit 710 needs to retain stored information even when power is not supplied to the portable information terminal 700 from the outside. Therefore, for example, devices such as semiconductor element memories such as flash ROMs and SSDs, and magnetic disk drives such as HDDs are used.
 なお、ROM703やストレージ部710に記憶された前記各動作プログラムは、インターネット800上の各サーバ装置からのダウンロード処理により、追加、更新および機能拡張することが可能である。 Note that each of the operating programs stored in the ROM 703 and the storage unit 710 can be added, updated, and expanded in functionality by downloading from each server device on the Internet 800.
 通信処理部720は、LAN通信部721、移動体電話網通信部722、NFC通信部723、で構成される。LAN通信部721は、ルータ装置800Rを介してインターネット800と接続され、インターネット800上の各サーバ装置やその他の通信機器とデータの送受信を行う。ルータ装置800Rとの接続は、Wi-Fi(登録商標)等の無線接続で行われる。移動体電話網通信部722は、移動体電話通信網の基地局600Bとの無線通信により、電話通信(通話)およびデータの送受信を行う。NFC通信部723は対応するリーダ/ライタとの近接時に無線通信を行う。LAN通信部721と移動体電話網通信部722とNFC通信部723は、それぞれ符号回路や復号回路、アンテナ等を備える。また、通信処理部720が、BlueTooth(登録商標)通信部や赤外線通信部等、他の通信部を更に備えていても良い。 The communication processing unit 720 includes a LAN communication unit 721, a mobile telephone network communication unit 722, and an NFC communication unit 723. The LAN communication unit 721 is connected to the Internet 800 via the router device 800R, and sends and receives data to and from each server device and other communication devices on the Internet 800. Connection with the router device 800R is performed by wireless connection such as Wi-Fi (registered trademark). The mobile telephone network communication unit 722 performs telephone communication (call) and data transmission/reception through wireless communication with the base station 600B of the mobile telephone communication network. The NFC communication unit 723 performs wireless communication when in close proximity to a corresponding reader/writer. The LAN communication section 721, mobile telephone network communication section 722, and NFC communication section 723 each include an encoding circuit, a decoding circuit, an antenna, and the like. Further, the communication processing unit 720 may further include other communication units such as a Bluetooth (registered trademark) communication unit and an infrared communication unit.
 拡張インタフェース部724は、携帯情報端末700の機能を拡張するためのインタフェース群であり、本実施例では、映像/音声インタフェース、USBインタフェース、メモリインタフェース等で構成されるものとする。映像/音声インタフェースは、外部映像/音声出力機器からの映像信号/音声信号の入力、外部映像/音声入力機器への映像信号/音声信号の出力、等を行う。USBインタフェースは、PC等と接続してデータの送受信を行う。また、キーボードやその他のUSB機器の接続を行っても良い。メモリインタフェースはメモリカードやその他のメモリ媒体を接続してデータの送受信を行う。 The expansion interface unit 724 is a group of interfaces for expanding the functions of the mobile information terminal 700, and in this embodiment, it is assumed to be composed of a video/audio interface, a USB interface, a memory interface, etc. The video/audio interface inputs video signals/audio signals from an external video/audio output device, outputs video signals/audio signals to the external video/audio input device, and so on. The USB interface is connected to a PC or the like to send and receive data. Additionally, a keyboard or other USB devices may be connected. The memory interface connects a memory card or other memory medium to send and receive data.
 操作部730は、携帯情報端末700に対する操作指示の入力を行う指示入力部であり、本実施例では、表示部741に重ねて配置したタッチパネル730Tおよびボタンスイッチを並べた操作キー730Kで構成される。何れか一方のみであっても良い。拡張インタフェース部724に接続したキーボード等を用いて携帯情報端末700の操作を行っても良い。有線通信又は無線通信により接続された別体の端末機器を用いて携帯情報端末700の操作を行っても良い。即ち、放送受信装置100から携帯情報端末700の操作を行っても良い。また、前記タッチパネル機能は表示部741が備え持っているものであっても良い。 The operation unit 730 is an instruction input unit for inputting operation instructions to the mobile information terminal 700; in this embodiment, it is composed of a touch panel 730T overlaid on the display unit 741 and operation keys 730K in which button switches are arranged. Only one of the two may be provided. The mobile information terminal 700 may be operated using a keyboard or the like connected to the expansion interface unit 724, or using a separate terminal device connected by wired or wireless communication; that is, the mobile information terminal 700 may be operated from the broadcast receiving device 100. The touch panel function may also be built into the display unit 741.
 画像処理部740は、表示部741、画像信号処理部742、第一画像入力部743、第二画像入力部744、で構成される。表示部741は、例えば液晶パネル等の表示デバイスであり、画像信号処理部742で処理した画像データを携帯情報端末700のユーザに提供する。画像信号処理部742は図示を省略したビデオRAMを備え、前記ビデオRAMに入力された画像データに基づいて表示部741が駆動される。また、画像信号処理部742は、必要に応じてフォーマット変換、メニューやその他のOSD(On Screen Display)信号の重畳処理等を行う機能を有する。第一画像入力部743および第二画像入力部744は、CCD(Charge Coupled Device)やCMOS(Complementary Metal Oxide Semiconductor)センサ等の電子デバイスを用いてレンズから入力した光を電気信号に変換することにより、周囲や対象物の画像データを入力するカメラユニットである。 The image processing unit 740 includes a display unit 741, an image signal processing unit 742, a first image input unit 743, and a second image input unit 744. The display unit 741 is a display device such as a liquid crystal panel, and presents the image data processed by the image signal processing unit 742 to the user of the mobile information terminal 700. The image signal processing unit 742 includes a video RAM (not shown), and the display unit 741 is driven based on the image data input to the video RAM. The image signal processing unit 742 also has functions for format conversion and for superimposing menus and other OSD (On Screen Display) signals as necessary. The first image input unit 743 and the second image input unit 744 are camera units that capture image data of the surroundings and of objects by converting light entering through a lens into an electrical signal using an electronic device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
 音声処理部750は、音声出力部751、音声信号処理部752、音声入力部753、で構成される。音声出力部751はスピーカであり、音声信号処理部752で処理した音声信号を携帯情報端末700のユーザに提供する。音声入力部753はマイクであり、ユーザの声などを音声データに変換して入力する。 The audio processing section 750 includes an audio output section 751, an audio signal processing section 752, and an audio input section 753. The audio output unit 751 is a speaker, and provides the user of the mobile information terminal 700 with an audio signal processed by the audio signal processing unit 752. The voice input unit 753 is a microphone, and converts the user's voice into voice data and inputs the voice data.
 センサ部760は、携帯情報端末700の状態を検出するためのセンサ群であり、本実施例では、GPS受信部761、ジャイロセンサ762、地磁気センサ763、加速度センサ764、照度センサ765、近接センサ766、で構成される。これらのセンサ群により、携帯情報端末700の位置、傾き、方角、動き、および周囲の明るさ、周囲物の近接状況、等を検出することが可能となる。また、携帯情報端末700が、気圧センサ等、他のセンサを更に備えていても良い。 The sensor unit 760 is a group of sensors for detecting the state of the mobile information terminal 700; in this embodiment, it consists of a GPS receiving unit 761, a gyro sensor 762, a geomagnetic sensor 763, an acceleration sensor 764, an illuminance sensor 765, and a proximity sensor 766. These sensors make it possible to detect the position, tilt, direction, and movement of the mobile information terminal 700, as well as the surrounding brightness, the proximity of nearby objects, and so on. The mobile information terminal 700 may further include other sensors, such as an atmospheric pressure sensor.
 携帯情報端末700は、携帯電話やスマートホン、タブレット端末等であって良い。PDA(Personal Digital Assistants)やノート型PCであっても良い。また、デジタルスチルカメラや動画撮影可能なビデオカメラ、携帯型ゲーム機やナビゲーション装置等、またはその他の携帯用デジタル機器であっても良い。 The mobile information terminal 700 may be a mobile phone, a smart phone, a tablet terminal, or the like. It may be a PDA (Personal Digital Assistant) or a notebook PC. Further, it may be a digital still camera, a video camera capable of shooting moving images, a portable game machine, a navigation device, or other portable digital equipment.
 なお、図3Cに示した携帯情報端末700の構成例は、センサ部760等、本実施例に必須ではない構成も多数含んでいるが、これらが備えられていない構成であっても本実施例の効果を損なうことはない。また、デジタル放送受信機能や電子マネー決済機能等、図示していない構成が更に加えられていても良い。 Note that the configuration example of the mobile information terminal 700 shown in FIG. 3C includes many components that are not essential to this embodiment, such as the sensor unit 760; even a configuration lacking them does not impair the effect of this embodiment. Components not shown, such as a digital broadcast reception function and an electronic money payment function, may also be added.
 [携帯情報端末のソフトウェア構成]
 図3Dは、携帯情報端末700のソフトウェア構成図であり、ROM703、RAM704およびストレージ部710におけるソフトウェアの構成の一例を示す。ROM703には、基本動作プログラム7001およびその他の動作プログラムが記憶されている。ストレージ部710には、連携制御プログラム7002およびその他の動作プログラムが記憶されている。また、ストレージ部710は、動画や静止画や音声等のコンテンツデータを記憶するコンテンツ記憶領域7200、テレビ受信機や各サーバ装置にアクセスする際に必要な認証情報等を記憶する認証情報記憶領域7300、その他の各種情報を記憶する各種情報記憶領域を備えるものとする。
[Software configuration of mobile information terminal]
FIG. 3D is a software configuration diagram of the mobile information terminal 700, showing an example of the software configuration in the ROM 703, the RAM 704, and the storage unit 710. The ROM 703 stores a basic operation program 7001 and other operation programs. The storage unit 710 stores a cooperation control program 7002 and other operation programs. The storage unit 710 also includes a content storage area 7200 that stores content data such as videos, still images, and audio, an authentication information storage area 7300 that stores authentication information necessary for accessing the television receiver and each server device, and various other information storage areas.
 ROM703に記憶された基本動作プログラム7001はRAM704に展開され、さらに主制御部701が前記展開された基本動作プログラムを実行することにより、基本動作実行部7101を構成する。また、ストレージ部710に記憶された連携制御プログラム7002も同様にRAM704に展開され、さらに主制御部701が前記展開された連携制御プログラムを実行することにより、連携制御実行部7102を構成する。また、RAM704は、各動作プログラム実行時に作成したデータを、必要に応じて一時的に保持する一時記憶領域を備えるものとする。 The basic operation program 7001 stored in the ROM 703 is expanded to the RAM 704, and the main control unit 701 further executes the expanded basic operation program to configure the basic operation execution unit 7101. Further, the cooperation control program 7002 stored in the storage unit 710 is similarly expanded to the RAM 704, and further, the main control unit 701 configures the cooperation control execution unit 7102 by executing the expanded cooperation control program. Furthermore, the RAM 704 is provided with a temporary storage area that temporarily holds data created when each operating program is executed, as needed.
 なお、以下では、説明を簡単にするために、主制御部701がROM703に格納された基本動作プログラム7001をRAM704に展開して実行することにより各動作ブロックの制御を行う処理を、基本動作実行部7101が各動作ブロックの制御を行うものとして記述する。他の動作プログラムに関しても同様の記述を行う。 In the following, to simplify the explanation, the process in which the main control unit 701 controls each operation block by loading the basic operation program 7001 stored in the ROM 703 into the RAM 704 and executing it is described as the basic operation execution unit 7101 controlling each operation block. The same style of description is used for the other operation programs.
 連携制御実行部7102は、携帯情報端末700がテレビ受信機との連係動作を行う際の、機器認証および接続、各データの送受信、等の管理を行う。また、連携制御実行部7102は、前記テレビ受信機と連動するアプリケーションを実行するためのブラウザエンジン機能を備えるものとする。 The cooperation control execution unit 7102 manages device authentication and connection, transmission and reception of each data, etc. when the mobile information terminal 700 performs a cooperation operation with the television receiver. Further, the cooperation control execution unit 7102 is provided with a browser engine function for executing an application that works in conjunction with the television receiver.
 前記各動作プログラムは、製品出荷の時点で、予めROM703および/またはストレージ部710に記憶されていても良い。製品出荷後に、インターネット800上のサーバ装置からLAN通信部721または移動体電話網通信部722を介して取得しても良い。また、メモリカードや光ディスク等に格納された前記各動作プログラムを、拡張インタフェース部724等を介して取得しても良い。 Each of the operation programs may be stored in advance in the ROM 703 and/or the storage unit 710 at the time of product shipment. After shipment, the programs may be acquired from a server device on the Internet 800 via the LAN communication unit 721 or the mobile telephone network communication unit 722. Each of the operation programs may also be acquired from a memory card, an optical disc, or the like via the expansion interface unit 724 or the like.
 [デジタル放送の放送波]
 ここで、本発明の実施例の放送受信装置が受信するデジタル放送の放送波の一例に関して説明する。
[Broadcast waves of digital broadcasting]
Here, an example of a digital broadcast wave received by the broadcast receiving apparatus according to the embodiment of the present invention will be described.
 放送受信装置100は、ISDB-T(Integrated Services Digital Broadcasting for Terrestrial Television Broadcasting)方式と少なくとも一部の仕様を共通にする地上デジタル放送サービスを受信可能である。具体的には、第二チューナ/復調部130Tが受信可能な、偏波両用地上デジタル放送や単偏波地上デジタル放送は、一部の仕様をISDB-T方式と共通にする高度な地上デジタル放送である。また、第三チューナ/復調部130Lが受信可能な、階層分割多重地上デジタル放送は、一部の仕様をISDB-T方式と共通にする高度な地上デジタル放送である。なお、第一チューナ/復調部130Cが受信可能な現行地上デジタル放送は、ISDB-T方式の地上デジタル放送である。また、第四チューナ/復調部130Bが受信可能な高度BSデジタル放送や高度CSデジタル放送は、ISDB-T方式と異なるデジタル放送である。 The broadcast receiving device 100 can receive terrestrial digital broadcasting services that share at least some specifications with the ISDB-T (Integrated Services Digital Broadcasting for Terrestrial Television Broadcasting) system. Specifically, the dual-polarization terrestrial digital broadcasting and single-polarization terrestrial digital broadcasting receivable by the second tuner/demodulator 130T are advanced terrestrial digital broadcasting systems that share some specifications with the ISDB-T system. Likewise, the hierarchical division multiplexing terrestrial digital broadcasting receivable by the third tuner/demodulator 130L is an advanced terrestrial digital broadcasting system that shares some specifications with the ISDB-T system. The current terrestrial digital broadcasting receivable by the first tuner/demodulator 130C is ISDB-T terrestrial digital broadcasting, while the advanced BS digital broadcasting and advanced CS digital broadcasting receivable by the fourth tuner/demodulator 130B are digital broadcasting systems different from the ISDB-T system.
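The tuner-to-system mapping described above can be summarized in a small table (a sketch only; the dictionary layout and field values are my own paraphrase of the paragraph, not identifiers from the specification):

```python
# Hypothetical summary of the tuner/demodulator roles described above.
# Keys are the reference numerals used in the text.
TUNER_ROLES = {
    "130C": ("current terrestrial digital broadcasting", "ISDB-T"),
    "130T": ("dual-/single-polarization advanced terrestrial", "partially ISDB-T-compatible"),
    "130L": ("hierarchical division multiplexing terrestrial", "partially ISDB-T-compatible"),
    "130B": ("advanced BS/CS digital broadcasting", "non-ISDB-T"),
}

def tuners_sharing_isdb_t_specs():
    """Return the tuners whose system is fully or partially based on ISDB-T."""
    return [t for t, (_, system) in TUNER_ROLES.items()
            if "ISDB-T" in system and not system.startswith("non")]

print(tuners_sharing_isdb_t_specs())  # ['130C', '130T', '130L']
```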
 ここで、本実施例に係る偏波両用地上デジタル放送と単偏波地上デジタル放送および階層分割多重地上デジタル放送は、ISDB-T方式と同様に、伝送方式にマルチキャリア方式の1つであるOFDM(Orthogonal Frequency Division Multiplexing:直交周波数分割多重)を採用する。OFDMは、マルチキャリア方式であるためにシンボル長が長く、ガードインターバルと呼ばれる時間軸方向の冗長部分を付加することが有効であり、ガードインターバルの範囲内のマルチパスの影響を軽減することが可能である。このためSFN(Single Frequency Network:単一周波数ネットワーク)を実現することが可能であり、周波数の有効利用が可能となる。 Here, like the ISDB-T system, the dual-polarization terrestrial digital broadcasting, single-polarization terrestrial digital broadcasting, and hierarchical division multiplexing terrestrial digital broadcasting according to this embodiment adopt OFDM (Orthogonal Frequency Division Multiplexing), a multicarrier scheme, as the transmission method. Because OFDM is a multicarrier scheme, its symbol length is long, so it is effective to append a redundant portion in the time-axis direction called a guard interval, which mitigates the effects of multipath delays that fall within the guard interval. This makes it possible to realize an SFN (Single Frequency Network) and to use frequencies efficiently.
 本実施例に係る偏波両用地上デジタル放送と単偏波地上デジタル放送および階層分割多重地上デジタル放送は、ISDB-T方式と同様に、OFDMのキャリアをセグメントと呼ばれるグループに分割しており、図4Aに示すように、デジタル放送サービスの1つのチャンネル帯域幅は13セグメントで構成される。帯域の中央部をセグメント0の位置とし、この上下に順次セグメント番号(0~12)が割り付けられる。本実施例に係る偏波両用地上デジタル放送と単偏波地上デジタル放送および階層分割多重地上デジタル放送の伝送路符号化はOFDMセグメントを単位に行われる。このため階層伝送を定義することが可能であり、例えば、1つのテレビジョンチャンネルの帯域幅の中で、一部のOFDMセグメントを固定受信サービスに、残りを移動体受信サービスに、それぞれ割り当てることができる。階層伝送では、各階層が1つまたは複数のOFDMセグメントで構成され、階層ごとにキャリア変調方式、内符号の符号化率、時間インターリーブ長、等のパラメータを設定することができる。なお、階層数は任意に設定できて良く、例えば、最大3階層までと設定すれば良い。図4Bに、階層数を3または2とした場合のOFDMセグメントの階層割り当ての一例を示す。図4B(1)の例では、階層数が3であり、A階層、B階層、およびC階層がある。A階層は1セグメント(セグメント0)で構成され、B階層は7セグメント(セグメント1~7)で構成され、C階層は5セグメント(セグメント8~12)で構成される。図4B(2)の例では、階層数が3であり、A階層、B階層、およびC階層がある。A階層は1セグメント(セグメント0)で構成され、B階層は5セグメント(セグメント1~5)で構成され、C階層は7セグメント(セグメント6~12)で構成される。図4B(3)の例では、階層数が2であり、A階層およびB階層がある。A階層は1セグメント(セグメント0)で構成され、B階層は12セグメント(セグメント1~12)で構成される。各階層のOFDMセグメント数や伝送路符号化パラメータ等は編成情報に従って決定され、受信機の動作を補助するための制御情報であるTMCC信号によって伝送される。 In the dual-polarization terrestrial digital broadcasting, single-polarization terrestrial digital broadcasting, and hierarchical division multiplexing terrestrial digital broadcasting according to this embodiment, the OFDM carriers are divided into groups called segments, as in the ISDB-T system. As shown in FIG. 4A, one channel bandwidth of the digital broadcasting service consists of 13 segments. Segment 0 is placed at the center of the band, and segment numbers (0 to 12) are assigned sequentially above and below it. Transmission path coding in these systems is performed in units of OFDM segments. This makes it possible to define hierarchical transmission; for example, within the bandwidth of one television channel, some OFDM segments can be allocated to fixed reception services and the rest to mobile reception services. In hierarchical transmission, each layer consists of one or more OFDM segments, and parameters such as the carrier modulation scheme, inner-code coding rate, and time interleave length can be set for each layer. The number of layers may be set arbitrarily, for example up to a maximum of three. FIG. 4B shows examples of layer allocation of OFDM segments when the number of layers is 3 or 2. In the example of FIG. 4B(1), there are three layers: layer A, layer B, and layer C. Layer A consists of one segment (segment 0), layer B of seven segments (segments 1 to 7), and layer C of five segments (segments 8 to 12). In the example of FIG. 4B(2), there are likewise three layers: layer A consists of one segment (segment 0), layer B of five segments (segments 1 to 5), and layer C of seven segments (segments 6 to 12). In the example of FIG. 4B(3), there are two layers: layer A consists of one segment (segment 0), and layer B of twelve segments (segments 1 to 12). The number of OFDM segments and the transmission path coding parameters of each layer are determined according to the channel organization information and are conveyed by the TMCC signal, control information that assists receiver operation.
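The three allocations of FIG. 4B can be written down as a simple data structure (a sketch: the layer names follow the text, while the per-layer parameter fields are illustrative stand-ins for the kind of information the TMCC signal carries, not values from the specification):

```python
# Sketch of the Fig. 4B segment-to-layer allocations (13 segments, numbered 0-12).
FIG_4B_LAYOUTS = {
    "(1)": {"A": [0], "B": list(range(1, 8)),  "C": list(range(8, 13))},
    "(2)": {"A": [0], "B": list(range(1, 6)),  "C": list(range(6, 13))},
    "(3)": {"A": [0], "B": list(range(1, 13))},
}

# Per-layer transmission parameters of the kind conveyed by the TMCC signal
# (field names and values are illustrative only).
EXAMPLE_PARAMS = {
    "A": {"modulation": "QPSK",  "inner_code_rate": "2/3", "time_interleave": 4},
    "B": {"modulation": "64QAM", "inner_code_rate": "3/4", "time_interleave": 2},
}

def covers_all_segments(layout, n_segments=13):
    """A valid layout assigns every segment to exactly one layer."""
    assigned = sorted(seg for segs in layout.values() for seg in segs)
    return assigned == list(range(n_segments))

assert all(covers_all_segments(l) for l in FIG_4B_LAYOUTS.values())
```

The coverage check mirrors the constraint implicit in the figure: the 13 segments of one channel are partitioned among the layers with no overlap and no gap.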
 なお、図4Bの(1)、(2)、(3)のセグメント階層割り当ての使用例の一例としては、例えば以下の例があり得る。 Note that the segment layer allocations of (1), (2), and (3) in FIG. 4B may be used, for example, as follows.
 例えば、図4B(1)の階層割り当ては、本実施例に係る偏波両用地上デジタル放送において用いることができ、水平偏波、垂直偏波ともに同じセグメント階層割り当てを用いれば良い。具体的には、A階層として水平偏波の上記1セグメントで現行の地上デジタル放送の移動体受信サービスを伝送すれば良い。(なお、当該現行の地上デジタル放送の移動体受信サービスは同じサービスを垂直偏波の上記1セグメントで伝送しても良い。この場合、これもA階層として扱う。)また、B階層として水平偏波の上記7セグメントで、現行の地上デジタル放送である水平1920画素×垂直1080画素を最大解像度とする映像を伝送する地上デジタル放送サービスを伝送すれば良い。(なお、当該水平1920画素×垂直1080画素を最大解像度とする映像を伝送する地上デジタル放送サービスは同じサービスを垂直偏波の上記7セグメントで伝送しても良い。この場合、これもB階層として扱う。)さらに、C階層として水平偏波と垂直偏波の両者の上記5セグメント、合計10セグメントで水平1920画素×垂直1080画素を超える画素数を最大解像度とする映像を伝送可能な高度な地上デジタル放送サービスを伝送するように構成しても良い。当該伝送の詳細は後述する。当該セグメント階層割り当ての伝送波は例えば、放送受信装置100の第二チューナ/復調部130Tで受信可能である。 For example, the layer allocation of FIG. 4B(1) can be used in the dual-polarization terrestrial digital broadcasting according to this embodiment, with the same segment layer allocation for both horizontal and vertical polarization. Specifically, the current mobile reception service of terrestrial digital broadcasting may be transmitted as layer A in the above one segment of the horizontal polarization. (The same mobile reception service may also be transmitted in the corresponding one segment of the vertical polarization; in that case, it too is treated as layer A.) As layer B, the above seven segments of the horizontal polarization may carry the current terrestrial digital broadcasting service, which transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels. (The same service may also be transmitted in the corresponding seven segments of the vertical polarization; in that case, it too is treated as layer B.) Furthermore, layer C may be configured to carry, in the above five segments of both the horizontal and vertical polarizations (ten segments in total), an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels. Details of this transmission will be described later. A transmission wave with this segment layer allocation can be received, for example, by the second tuner/demodulator 130T of the broadcast receiving device 100.
 また、図4B(1)の階層割り当ては、本実施例に係る単偏波地上デジタル放送において用いることができる。具体的には、A階層として上記1セグメントで現行の地上デジタル放送の移動体受信サービスを伝送すれば良い。また、B階層として上記7セグメントで、現行の地上デジタル放送である水平1920画素×垂直1080画素を最大解像度とする映像を伝送する地上デジタル放送サービスを伝送すれば良い。さらに、C階層として上記5セグメントで水平1920画素×垂直1080画素を超える画素数を最大解像度とする映像を伝送可能な高度な地上デジタル放送サービスを伝送するように構成しても良い。なお、この場合、C階層においては、現行の地上デジタル放送よりも高効率のキャリア変調方式や誤り訂正符号方式や映像符号化方式等を用いる。当該伝送の詳細は後述する。当該セグメント階層割り当ての伝送波は例えば、放送受信装置100の第二チューナ/復調部130Tで受信可能である。 Furthermore, the hierarchy assignment in FIG. 4B(1) can be used in the single-polarized digital terrestrial broadcasting according to this embodiment. Specifically, the current mobile reception service of digital terrestrial broadcasting may be transmitted in the above-mentioned one segment as layer A. Further, as the B layer, the above seven segments may be used to transmit a terrestrial digital broadcasting service that transmits video having a maximum resolution of 1920 pixels horizontally by 1080 pixels vertically, which is the current digital terrestrial broadcasting. Further, the C layer may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels in the five segments described above. In this case, the C layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting. Details of the transmission will be described later. The transmission wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100.
 また、図示しない例として、本実施例に係る単偏波地上デジタル放送において、A階層の1セグメントで現行の地上デジタル放送の移動体受信サービスを伝送し、B階層の8セグメントで現行の地上デジタル放送である水平1920画素×垂直1080画素を最大解像度とする映像を伝送する地上デジタル放送サービスを伝送し、C階層の4セグメントで水平1920画素×垂直1080画素を超える画素数を最大解像度とする映像を伝送可能な高度な地上デジタル放送サービスを伝送するように構成しても良い。なお、この場合も、C階層においては、現行の地上デジタル放送よりも高効率のキャリア変調方式や誤り訂正符号方式や映像符号化方式等を用いる。当該伝送の詳細は後述する。当該セグメント階層割り当ての伝送波は例えば、放送受信装置100の第二チューナ/復調部130Tで受信可能である。 As an example not shown, the single-polarization terrestrial digital broadcasting according to this embodiment may be configured so that one segment of layer A carries the current mobile reception service of terrestrial digital broadcasting, eight segments of layer B carry the current terrestrial digital broadcasting service transmitting video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels, and four segments of layer C carry an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels. In this case as well, layer C uses a carrier modulation scheme, an error correction coding scheme, a video coding scheme, etc. that are more efficient than those of current terrestrial digital broadcasting. Details of this transmission will be described later. A transmission wave with this segment layer allocation can be received, for example, by the second tuner/demodulator 130T of the broadcast receiving device 100.
 例えば、図4B(2)の階層割り当ては、本実施例に係る偏波両用地上デジタル放送において図4B(1)とは別の例として用いることができ、水平偏波、垂直偏波ともに同じセグメント階層割り当てを用いれば良い。具体的には、A階層として水平偏波の上記1セグメントで現行の地上デジタル放送の移動体受信サービスを伝送すれば良い。(なお、当該現行の地上デジタル放送の移動体受信サービスは同じサービスを垂直偏波の上記1セグメントで伝送しても良い。この場合、これもA階層として扱う。)さらに、B階層として水平偏波と垂直偏波の両者の上記5セグメント、合計10セグメントで水平1920画素×垂直1080画素を超える画素数を最大解像度とする映像を伝送可能な高度な地上デジタル放送サービスを伝送するように構成しても良い。また、C階層として、水平偏波の上記7セグメントで現行の地上デジタル放送である、水平1920画素×垂直1080画素を最大解像度とする映像を伝送する地上デジタル放送サービスを伝送すれば良い。(なお、当該水平1920画素×垂直1080画素を最大解像度とする映像を伝送する地上デジタル放送サービスは同じサービスを垂直偏波の上記7セグメントで伝送しても良い。この場合、これもC階層として扱う。)当該伝送の詳細は後述する。当該セグメント階層割り当ての伝送波は例えば、本実施例の放送受信装置100の第二チューナ/復調部130Tで受信可能である。 For example, the layer allocation of FIG. 4B(2) can be used in the dual-polarization terrestrial digital broadcasting according to this embodiment as an alternative to FIG. 4B(1), with the same segment layer allocation for both horizontal and vertical polarization. Specifically, the current mobile reception service of terrestrial digital broadcasting may be transmitted as layer A in the above one segment of the horizontal polarization. (The same mobile reception service may also be transmitted in the corresponding one segment of the vertical polarization; in that case, it too is treated as layer A.) Furthermore, layer B may be configured to carry, in the above five segments of both the horizontal and vertical polarizations (ten segments in total), an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels. As layer C, the above seven segments of the horizontal polarization may carry the current terrestrial digital broadcasting service, which transmits video with a maximum resolution of 1920 horizontal pixels x 1080 vertical pixels. (The same service may also be transmitted in the corresponding seven segments of the vertical polarization; in that case, it too is treated as layer C.) Details of this transmission will be described later. A transmission wave with this segment layer allocation can be received, for example, by the second tuner/demodulator 130T of the broadcast receiving device 100 of this embodiment.
 また、図4B(2)の階層割り当ては、本実施例に係る単偏波地上デジタル放送において図4B(1)とは別の例として用いることができる。具体的には、A階層として上記1セグメントで現行の地上デジタル放送の移動体受信サービスを伝送すれば良い。さらに、B階層として上記5セグメントで水平1920画素×垂直1080画素を超える画素数を最大解像度とする映像を伝送可能な高度な地上デジタル放送サービスを伝送するように構成しても良い。なお、この場合、B階層においては、現行の地上デジタル放送よりも高効率のキャリア変調方式や誤り訂正符号方式や映像符号化方式等を用いる。また、C階層として、上記7セグメントで現行の地上デジタル放送である、水平1920画素×垂直1080画素を最大解像度とする映像を伝送する地上デジタル放送サービスを伝送すれば良い。当該伝送の詳細は後述する。当該セグメント階層割り当ての伝送波は例えば、本実施例の放送受信装置100の第二チューナ/復調部130Tで受信可能である。 Furthermore, the hierarchy assignment in FIG. 4B(2) can be used as an example different from FIG. 4B(1) in the single-polarized digital terrestrial broadcasting according to this embodiment. Specifically, the current mobile reception service of digital terrestrial broadcasting may be transmitted in the above-mentioned one segment as layer A. Further, the B layer may be configured to transmit an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal pixels x 1080 vertical pixels in the five segments. In this case, the B layer uses a carrier modulation method, an error correction code method, a video coding method, etc. that are more efficient than the current terrestrial digital broadcasting. Further, as the C layer, it is sufficient to transmit a terrestrial digital broadcasting service that transmits video with a maximum resolution of 1920 pixels horizontally by 1080 pixels vertically, which is the current digital terrestrial broadcasting in the above seven segments. Details of the transmission will be described later. The transmitted wave assigned to the segment hierarchy can be received by, for example, the second tuner/demodulator 130T of the broadcast receiving apparatus 100 of this embodiment.
 The layer allocation of FIG. 4B(3), for example, can be used both in the layer-division-multiplexed terrestrial digital broadcasting according to this embodiment and in current terrestrial digital broadcasting. Specifically, when used for layer-division-multiplexed broadcasting, the one segment in the figure may carry the current mobile reception service as layer A, and the twelve segments may carry, as layer B, either an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels, or the current terrestrial digital broadcasting service transmitting video with a maximum resolution of 1920 x 1080 pixels. A transmission wave with this segment/layer allocation can be received by, for example, the third tuner/demodulator 130L of the broadcast receiving apparatus 100 of this embodiment. When used for current terrestrial digital broadcasting, the one segment in the figure may carry the current mobile reception service as layer A, and the twelve segments may carry, as layer B, the current terrestrial digital broadcasting service transmitting video with a maximum resolution of 1920 horizontal x 1080 vertical pixels. A transmission wave with this segment/layer allocation can be received by, for example, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 of this embodiment.
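The segment-to-layer allocations above can be sketched as a small data structure, with a check that each allocation accounts for all 13 OFDM segments of one channel. This is purely illustrative; the layer names come from the text, while the dictionary layout is our own assumption:

```python
# Sketch of the 13-segment layer allocations described above.
# The segment counts come from the text; the structure itself is hypothetical.

ALLOCATIONS = {
    # FIG. 4B(2): single-polarization advanced terrestrial broadcasting
    "4B(2)": {"A": 1, "B": 5, "C": 7},
    # FIG. 4B(3): layer-division multiplexing / current terrestrial broadcasting
    "4B(3)": {"A": 1, "B": 12},
}

def validate(alloc: dict, total_segments: int = 13) -> bool:
    """One terrestrial digital broadcasting channel carries exactly 13 segments."""
    return sum(alloc.values()) == total_segments

for name, alloc in ALLOCATIONS.items():
    assert validate(alloc), f"allocation {name} does not fill the channel"
```

Any candidate allocation that does not sum to 13 segments would fail this check, which mirrors the constraint implicit in the figures.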
 FIG. 4C shows an example of a broadcasting-station-side system that generates the OFDM transmission waves serving as the digital broadcast waves of the dual-polarization, single-polarization, and layer-division-multiplexed terrestrial digital broadcasting according to this embodiment. The information source encoding unit 411 encodes video, audio, various data, and so on. The multiplexing/conditional access processing unit 415 multiplexes the video, audio, and data encoded by the information source encoding unit 411, applies processing for conditional access as appropriate, and outputs the result as a packet stream. A plurality of information source encoding units 411 and multiplexing/conditional access processing units 415 can operate in parallel, generating a plurality of packet streams. The transmission path encoding unit 416 re-multiplexes these packet streams into a single packet stream, applies transmission path encoding, and outputs an OFDM transmission wave. Although the details of the information source encoding and transmission path encoding schemes differ, the configuration of FIG. 4C is common with the ISDB-T system as a configuration for generating OFDM transmission waves. Accordingly, some of the information source encoding units 411 and multiplexing/conditional access processing units 415 may be configured for the ISDB-T terrestrial digital broadcasting service and others for the advanced terrestrial digital broadcasting service, with the transmission path encoding unit 416 multiplexing the packet streams of these different terrestrial digital broadcasting services. When a multiplexing/conditional access processing unit 415 is configured for the ISDB-T service, it may generate an MPEG-2 TS, that is, a stream of TSPs (Transport Stream Packets) as defined by MPEG-2 Systems. When configured for the advanced service, it may generate an MMT packet stream, a TLV stream containing MMT packets, or a TSP stream defined by another system. Naturally, all of the information source encoding units 411 and multiplexing/conditional access processing units 415 may be configured for the advanced service, so that every packet stream multiplexed by the transmission path encoding unit 416 is an advanced-service packet stream.
 FIG. 4D shows an example of the configuration of the transmission path encoding unit 416.
 First, FIG. 4D(1) is described. FIG. 4D(1) shows the configuration of the transmission path encoding unit 416 when generating OFDM transmission waves only for the current terrestrial digital broadcasting service. An OFDM transmission wave generated with this configuration has, for example, the segment configuration of FIG. 4B(3). The packet stream input from the multiplexing/conditional access processing unit 415 and re-multiplexed has error correction redundancy added and undergoes various interleaving processes such as byte interleaving, bit interleaving, time interleaving, and frequency interleaving. It is then processed by IFFT (Inverse Fast Fourier Transform) together with the pilot, TMCC, and AC signals, has a guard interval added, and becomes an OFDM transmission wave through quadrature modulation. The stages up to and including outer coding, power spreading, byte interleaving, inner coding, bit interleaving, and mapping are configured so that each layer, such as layer A or layer B, can be processed separately. (Current terrestrial digital broadcasting operates with two layers, but up to three layers can be transmitted, so FIG. 4D(1) shows a three-layer example.) The mapping process is the carrier modulation process. The packet stream input from the multiplexing/conditional access processing unit 415 may carry multiplexed information such as the TMCC information, the mode, and the guard interval ratio. As noted above, the packet stream input to the transmission path encoding unit 416 may be a TSP stream as defined by MPEG-2 Systems. An OFDM transmission wave generated with the configuration of FIG. 4D(1) can be received by, for example, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 of this embodiment.
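The ordering of stages just described, per-layer processing up to mapping, then common processing through quadrature modulation, can be expressed as a structural sketch. The stage names follow the text; representing them as strings in a flat list is our own illustrative choice, not a signal-processing implementation:

```python
# Structural sketch of the FIG. 4D(1) transmission path encoder.
# Stages up to mapping run separately per layer (A, B, C); the rest are common.

PER_LAYER_STAGES = [
    "outer_code",        # e.g. shortened Reed-Solomon
    "power_spreading",   # energy dispersal
    "byte_interleave",
    "inner_code",        # e.g. punctured convolutional code
    "bit_interleave",
    "mapping",           # carrier modulation (QPSK / 16QAM / 64QAM ...)
]

COMMON_STAGES = [
    "time_interleave",
    "frequency_interleave",
    "ifft",                     # together with pilot / TMCC / AC carriers
    "guard_interval",
    "quadrature_modulation",
]

def pipeline(num_layers: int) -> list:
    """Flatten the encoder into one ordered stage list for num_layers layers."""
    per_layer = [f"{layer}:{stage}"
                 for layer in "ABC"[:num_layers]
                 for stage in PER_LAYER_STAGES]
    return per_layer + COMMON_STAGES
```

For the three-layer example of FIG. 4D(1), `pipeline(3)` yields 18 per-layer stage entries followed by the 5 common stages.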
 Next, FIG. 4D(2) is described. FIG. 4D(2) shows the configuration of the transmission path encoding unit 416 when generating OFDM transmission waves for the dual-polarization terrestrial digital broadcasting according to this embodiment. An OFDM transmission wave generated with this configuration has, for example, the segment configuration of FIG. 4B(1) or (2). In FIG. 4D(2) as well, the re-multiplexed packet stream input from the multiplexing/conditional access processing unit 415 has error correction redundancy added, undergoes various interleaving processes such as byte, bit, time, and frequency interleaving, is processed by IFFT together with the pilot, TMCC, and AC signals, has a guard interval added, and becomes an OFDM transmission wave through quadrature modulation.
 In the configuration example of FIG. 4D(2), the stages up to and including outer coding, power spreading, byte interleaving, inner coding, bit interleaving, mapping, and time interleaving can be processed separately for each layer, such as layers A, B, and C. However, this configuration generates not only a horizontally polarized (H) OFDM transmission wave but also a vertically polarized (V) one, so the processing flow branches into two paths. At the branch from the horizontal polarization (H) path to the vertical polarization (V) path, whether the same data as in the H path is branched to the V path, different data is branched to the V path, or no data is branched at all can be set differently for each layer, corresponding to the segment configurations described with FIG. 4B(1) or (2).
 The outer coding, inner coding, mapping, and other processes shown in the configuration of FIG. 4D(2) can use, in addition to processing compatible with the configuration of FIG. 4D(1), more advanced processing not adopted in FIG. 4D(1). Specifically, in the per-layer portion of FIG. 4D(2), the layers carrying the current terrestrial digital broadcasting mobile reception service or the current terrestrial digital broadcasting service (maximum resolution 1920 horizontal x 1080 vertical pixels) perform outer coding, inner coding, mapping, and other processes compatible with the configuration of FIG. 4D(1). By contrast, the layers carrying the advanced terrestrial digital broadcasting service, capable of transmitting video whose maximum resolution exceeds 1920 x 1080 pixels, may be configured to use more advanced outer coding, inner coding, mapping, and other processes not adopted in the configuration of FIG. 4D(1).
 In the dual-polarization terrestrial digital broadcasting according to this embodiment, the assignment of terrestrial digital broadcasting services to layers can be switched by the TMCC information described later; it is therefore desirable that the outer coding, inner coding, mapping, and other processes applied to each layer also be switchable by the TMCC information.
 For the layers carrying the advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels, the byte, bit, and time interleaving may use processing compatible with the current terrestrial digital broadcasting service, or different, more advanced processing. Alternatively, some of the interleaving may be omitted for the layers carrying the advanced service.
 In the configuration of FIG. 4D(2), the input stream serving as the source for the layers carrying the current mobile reception service or the current terrestrial digital broadcasting service (maximum resolution 1920 x 1080 pixels) may be, among the packet streams input to the transmission path encoding unit 416, a TSP stream defined by MPEG-2 Systems, as adopted in current terrestrial digital broadcasting. The input stream serving as the source for the layers carrying the advanced service may be a stream defined by a system other than MPEG-2 Systems TSP, such as an MMT packet stream or a TLV stream containing MMT packets. However, a TSP stream defined by MPEG-2 Systems may also be adopted in the advanced terrestrial digital broadcasting service.
 In the configuration of FIG. 4D(2) described above, from the input stream through to the generation of the OFDM transmission wave, the layers carrying the current mobile reception service or the current terrestrial digital broadcasting service (maximum resolution 1920 x 1080 pixels) maintain stream formats and processing compatible with current terrestrial digital broadcasting. Consequently, even when an existing receiver for the current terrestrial digital broadcasting service receives either the horizontally polarized or the vertically polarized OFDM transmission wave generated with this configuration, it can correctly receive and demodulate the broadcast signals of those layers.
 Also, in the configuration of FIG. 4D(2), a layer that uses segments of both the horizontally and vertically polarized OFDM transmission waves can carry an advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels, and the broadcast signal of that advanced service can be received and demodulated by the broadcast receiving apparatus 100 according to the embodiment of the present invention.
 That is, the configuration of FIG. 4D(2) can generate digital broadcast waves that are suitably received and demodulated both by broadcast receiving apparatuses compatible with the advanced terrestrial digital broadcasting service and by existing receivers for the current terrestrial digital broadcasting service.
 When generating OFDM transmission waves for the single-polarization terrestrial digital broadcasting according to this embodiment, the transmission path encoding unit 416 shown in FIG. 4D(2) need only comprise one of the two paths: the path generating the horizontally polarized (H) OFDM transmission wave or the path generating the vertically polarized (V) one. In this case too, the OFDM transmission wave has, for example, the segment configuration of FIG. 4B(1) or (2); however, unlike the dual-polarization case described above, only one of the horizontally and vertically polarized OFDM transmission waves is transmitted. The other configurations and operations are the same as when generating the dual-polarization OFDM transmission waves described above.
 Next, FIG. 4D(3) is described. FIG. 4D(3) shows the configuration of the transmission path encoding unit 416 when generating OFDM transmission waves for the layer-division-multiplexed terrestrial digital broadcasting according to this embodiment. In FIG. 4D(3) as well, the re-multiplexed packet stream input from the multiplexing/conditional access processing unit 415 has error correction redundancy added, undergoes various interleaving processes such as byte, bit, time, and frequency interleaving, is processed by IFFT together with the pilot, TMCC, and AC signals, has a guard interval added, and becomes an OFDM transmission wave through quadrature modulation.
 However, in the configuration of FIG. 4D(3), the modulated wave transmitted in the upper layer and the modulated wave transmitted in the lower layer are generated separately and then multiplexed to produce the OFDM transmission wave that is the digital broadcast wave. The processing path shown at the top of FIG. 4D(3) generates the modulated wave transmitted in the upper layer, and the path shown at the bottom generates the modulated wave transmitted in the lower layer. The data carried by the upper-layer path is the current terrestrial digital broadcasting mobile reception service or the current terrestrial digital broadcasting service transmitting video with a maximum resolution of 1920 horizontal x 1080 vertical pixels, and the processes in the upper-layer path are identical to or compatible with the processes of FIG. 4D(1). The modulated wave transmitted in the upper layer of FIG. 4D(3) has, for example, the segment configuration of FIG. 4B(3), like the transmission wave of FIG. 4D(1). The upper-layer modulated wave is therefore a digital broadcast wave compatible with the current mobile reception service and the current terrestrial digital broadcasting service. By contrast, the data carried by the lower-layer path is the advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 x 1080 pixels; the lower-layer path may be configured to use, for its outer coding, inner coding, mapping, and other processes, more advanced processing not adopted in the configuration of FIG. 4D(1).
 For the modulated wave transmitted in the lower layer of FIG. 4D(3), for example, all 13 segments may be assigned as layer A to the advanced terrestrial digital broadcasting service capable of transmitting video whose maximum resolution exceeds 1920 horizontal x 1080 vertical pixels. Alternatively, the lower layer may have the segment configuration of FIG. 4B(3), carrying the current mobile reception service in the one-segment layer A and the advanced service in the twelve-segment layer B. In the latter case, as in FIG. 4D(2), the stages from outer coding to time interleaving may be configured to be switchable per layer, such as between layers A and B. As explained for FIG. 4D(2), the layer carrying the current mobile reception service must maintain processing compatible with current terrestrial digital broadcasting.
 In the configuration of FIG. 4D(3), the upper-layer and lower-layer modulated waves are multiplexed to generate the OFDM transmission wave, a terrestrial digital broadcast wave. Since the technique for separating the upper-layer modulated wave from this OFDM transmission wave is already implemented in existing receivers for the current terrestrial digital broadcasting service, the broadcast signals of the current mobile reception service and the current terrestrial digital broadcasting service (maximum resolution 1920 x 1080 pixels) contained in the upper-layer modulated wave are correctly received and demodulated by such existing receivers. Meanwhile, the broadcast signal of the advanced terrestrial digital broadcasting service contained in the lower-layer modulated wave, capable of transmitting video whose maximum resolution exceeds 1920 x 1080 pixels, can be received and demodulated by the broadcast receiving apparatus 100 according to the embodiment of the present invention.
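The text does not specify how the upper- and lower-layer modulated waves are combined in power. One common layered-division approach, assumed here purely for illustration, superposes the lower layer at a fixed injection level below the upper layer, so that a legacy receiver sees the attenuated lower layer only as additional noise. The function name and the 20 dB figure are hypothetical:

```python
def ldm_combine(upper, lower, injection_db=20.0):
    """Superpose the lower-layer samples under the upper layer.

    injection_db is how far below the upper layer the lower layer is
    injected (a hypothetical figure; the text does not specify the
    multiplexing power ratio). Both inputs are equal-length sample lists.
    """
    scale = 10.0 ** (-injection_db / 20.0)   # amplitude scaling from dB
    return [u + scale * v for u, v in zip(upper, lower)]
```

With `injection_db=20.0`, the lower layer is added at one tenth of the upper layer's amplitude, leaving the upper-layer constellation demodulable by a receiver unaware of the lower layer.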
 That is, the configuration of FIG. 4D(3) can generate digital broadcast waves that are suitably received and demodulated both by broadcast receiving apparatuses compatible with the advanced terrestrial digital broadcasting service and by existing receivers for the current terrestrial digital broadcasting service. Moreover, unlike the configuration of FIG. 4D(2), the configuration of FIG. 4D(3) does not require multiple polarizations, so it can generate OFDM transmission waves that are more easily received.
 In the OFDM transmission wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3) of this embodiment, three modes with different numbers of carriers are provided, in consideration of suitability for the spacing between SFN transmitter sites, tolerance to Doppler shift in mobile reception, and the like. Further modes with different carrier counts may also be provided. In a mode with many carriers, the effective symbol length is longer, and at the same guard interval ratio (guard interval length / effective symbol length) the guard interval itself is longer, providing tolerance to multipath with long delay differences. Conversely, in a mode with few carriers, the carrier spacing is wider, making the signal less susceptible to inter-carrier interference caused by the Doppler shift that arises in mobile reception and the like.
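The mode-versus-guard-interval trade-off above can be made concrete with a short calculation. The effective symbol lengths used here are the current ISDB-T values for modes 1 through 3 (an assumption for illustration; the text does not list them numerically):

```python
# Guard interval length = guard interval ratio x effective symbol length.
# Effective symbol lengths per mode are the current ISDB-T values (assumed).

EFFECTIVE_SYMBOL_US = {1: 252.0, 2: 504.0, 3: 1008.0}  # microseconds

def guard_interval_us(mode, ratio):
    """Guard interval length in microseconds for a given mode and ratio."""
    return ratio * EFFECTIVE_SYMBOL_US[mode]

# At the same ratio (here 1/8), mode 3 tolerates a 4x longer multipath
# delay difference than mode 1, as the text explains.
assert guard_interval_us(3, 1/8) == 4 * guard_interval_us(1, 1/8)
```

Mode 3 at ratio 1/8 gives a 126 microsecond guard interval, against 31.5 microseconds for mode 1, which is why the high-carrier-count mode suits wide SFN transmitter spacing.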
 In the OFDM transmission wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3) of this embodiment, parameters such as the carrier modulation scheme, the inner code coding rate, and the time interleave length can be set for each layer, which consists of one or more OFDM segments. FIG. 4E shows an example of the per-segment transmission parameters of the OFDM segments identified by the mode of the system according to this embodiment. The carrier modulation scheme in the figure refers to the modulation scheme of the 'data' carriers. The SP, CP, TMCC, and AC signals adopt modulation schemes different from that of the 'data' carriers. Because tolerance to noise matters more than information capacity for these signals, they are mapped to constellations with fewer states (BPSK or DBPSK, i.e. two states) than the 'data' carrier schemes (all QPSK or above, i.e. four or more states), increasing their robustness against noise.
 For the carrier counts, the value to the left of the slash applies when QPSK, 16QAM, 64QAM, or the like is set as the carrier modulation scheme, and the value to the right applies when DQPSK is set. The underlined parameters in the figure are those incompatible with the current terrestrial digital broadcasting mobile reception service. Specifically, the 'data' carrier modulation schemes 256QAM, 1024QAM, and 4096QAM are not adopted in the current terrestrial digital broadcasting service. Therefore, in the OFDM broadcast wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3), the layers that must remain compatible with the current service do not use 256QAM, 1024QAM, or 4096QAM for the 'data' carriers. For the 'data' carriers transmitted in layers corresponding to the advanced terrestrial digital broadcasting service, in addition to the modulation schemes compatible with the current service, such as QPSK (4 states), 16QAM (16 states), and 64QAM (64 states), higher-order schemes such as 256QAM (256 states), 1024QAM (1024 states), and 4096QAM (4096 states) may be applied. Modulation schemes other than these may also be adopted.
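The payload gain from the higher-order constellations named above follows directly from the state counts: each data carrier conveys log2 of the number of states in bits. A minimal sketch:

```python
import math

# Bits carried per data carrier for the constellations named in the text.
# 256QAM / 1024QAM / 4096QAM are the advanced-service additions that are
# not used on layers requiring compatibility with the current service.

CONSTELLATION_STATES = {"QPSK": 4, "16QAM": 16, "64QAM": 64,
                        "256QAM": 256, "1024QAM": 1024, "4096QAM": 4096}

def bits_per_symbol(name):
    """Bits per data carrier per symbol = log2(number of states)."""
    return int(math.log2(CONSTELLATION_STATES[name]))

assert bits_per_symbol("64QAM") == 6
assert bits_per_symbol("4096QAM") == 12   # double the payload of 64QAM
```

So moving a layer from 64QAM to 4096QAM doubles the raw bits per carrier, at the cost of a denser constellation that demands a higher carrier-to-noise ratio.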
 For the pilot symbol (SP and CP) carriers, BPSK (2 states), which is compatible with the current terrestrial digital broadcasting service, may be used as the modulation scheme. For the AC and TMCC carriers, DBPSK (2 states), likewise compatible with the current service, may be used.
 As an inner coding scheme, the LDPC code is not adopted in the current terrestrial digital broadcasting service. Therefore, in the OFDM broadcast wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3), the layers that must remain compatible with the current service do not use the LDPC code; for data transmitted in layers corresponding to the advanced service, an LDPC code may be applied as the inner code. Likewise, as an outer coding scheme, the BCH code is not adopted in the current service. It is therefore not used in the layers requiring compatibility with the current service; for data transmitted in layers corresponding to the advanced service, a BCH code may be applied as the outer code.
 FIG. 4F shows an example of the transmission-signal parameters per physical channel (6 MHz bandwidth) for the OFDM broadcast-wave generation processing of FIGS. 4D(1), 4D(2), and 4D(3) of this embodiment. In this processing, the parameters of FIG. 4F are in principle chosen to be compatible with the current digital terrestrial broadcasting service. However, when all segments of the modulated wave transmitted in the lower layer of FIG. 4D(3) are assigned to the advanced digital terrestrial broadcasting service, that modulated wave need not maintain compatibility with the current service; in that case, parameters other than those shown in FIG. 4F may be used for the modulated wave transmitted in the lower layer of FIG. 4D(3).
 Next, the carriers of the OFDM transmission wave according to this embodiment will be described. Besides the carriers that convey data such as video and audio, they include carriers conveying the pilot signals (SP, CP, AC1, AC2) that serve as demodulation references, and carriers conveying the TMCC signal, which carries information such as the carrier modulation format and the convolutional coding rate. A number of carriers corresponding to 1/9 of the carriers in each segment is used for these transmissions. A concatenated code is employed for error correction: the outer code is a shortened Reed-Solomon (204,188) code, and the inner code is a punctured convolutional code whose mother code has constraint length 7 and coding rate 1/2. Codings other than these may be used for both the outer and inner codes. The information rate varies with parameters such as the carrier modulation format, the convolutional coding rate, and the guard-interval ratio.
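As an informal illustration (not part of the patent text), the dependence of the information rate on these parameters can be sketched numerically. The figures used below — 432 carriers per mode-3 segment, of which 1/9 carry pilot/TMCC/AC signals, leaving 384 data carriers, and a 1.008 ms useful symbol duration — are the commonly published ISDB-T mode-3 values and are assumptions here, not taken from this document.

```python
# Informal sketch: estimating the information rate of one OFDM segment from
# the parameters named above.  Assumed mode-3 figures: 432 carriers per
# segment, 1/9 of them (48) for pilots/TMCC/AC, leaving 384 data carriers;
# useful symbol duration 1.008 ms.

RS_RATE = 188 / 204            # shortened Reed-Solomon (204,188) overhead

def segment_info_rate(bits_per_carrier, conv_rate, guard_ratio,
                      data_carriers=384, useful_symbol_s=1.008e-3):
    """Approximate payload bit rate (bps) of one segment."""
    symbol_s = useful_symbol_s * (1 + guard_ratio)   # guard interval added
    return data_carriers * bits_per_carrier * conv_rate * RS_RATE / symbol_s

# 64QAM (6 bits/carrier), convolutional rate 3/4, guard-interval ratio 1/8:
rate = segment_info_rate(6, 3/4, 1/8)
print(f"{rate / 1e6:.3f} Mbps per segment, "
      f"{13 * rate / 1e6:.2f} Mbps for 13 segments")
```

With these assumed figures the formula gives roughly 1.4 Mbps per segment for this parameter set, illustrating how the rate scales with each factor.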
 One frame consists of 204 symbols, and each frame contains an integer number of TSPs. Transmission parameters are switched at frame boundaries.
 The pilot signals serving as demodulation references are SP (Scattered Pilot), CP (Continual Pilot), AC (Auxiliary Channel) 1, and AC2. FIG. 4G shows an example of the in-segment arrangement of the pilot signals for synchronous modulation (QPSK, 16QAM, 64QAM, 256QAM, 1024QAM, 4096QAM, etc.). SPs are inserted into synchronous-modulation segments once every 12 carriers in the carrier-number (frequency) direction and once every 4 symbols in the OFDM-symbol-number (time) direction. Since the amplitude and phase of each SP are known, SPs can serve as references for synchronous demodulation. FIG. 4H shows an example of the in-segment arrangement of the pilot signals for differential modulation (DQPSK, etc.). The CP is a continual signal inserted at the left edge of a differential-modulation segment and is used for demodulation.
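The SP lattice described above can be sketched in a few lines. The placement rule k mod 12 == 3 × (n mod 4) (carrier k, symbol n) used below is the rule commonly given for ISDB-T scattered pilots and is an assumption here; it reproduces the stated spacing of one SP per 12 carriers per symbol and one SP per 4 symbols per carrier.

```python
# Sketch of the scattered-pilot lattice described above: one SP every 12
# carriers within a symbol, and on any given carrier once every 4 symbols.
# The rule k % 12 == 3 * (n % 4) is the placement commonly given for
# ISDB-T and is an assumption, not quoted from this document.

def sp_positions(symbol_no, carriers=432):   # 432 carriers = mode-3 segment
    """Carrier indices holding a scattered pilot in the given OFDM symbol."""
    return [k for k in range(carriers) if k % 12 == 3 * (symbol_no % 4)]

for n in range(4):                           # pattern repeats every 4 symbols
    print(n, sp_positions(n)[:4], "... (%d SPs)" % len(sp_positions(n)))
```

Each symbol carries 36 SPs in a 432-carrier segment, and the pattern shifts by three carriers per symbol so the whole lattice repeats every four symbols.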
 AC1 and AC2 carry information on CPs; in addition to serving as pilot signals, they are used to transmit information for broadcasters. AC1 and AC2 may also be used to transmit other information.
 Note that the arrangements shown in FIGS. 4G and 4H are examples for mode 3, where the carrier numbers run from 0 to 431; in mode 1 and mode 2 they run from 0 to 107 and from 0 to 215, respectively. The carriers that convey AC1, AC2, and TMCC may be predetermined for each segment. These carriers are placed randomly in the frequency direction to mitigate the effect of periodic dips in the transmission-path characteristics caused by multipath.
 [TMCC signal]
 The TMCC signal conveys information (TMCC information) relevant to the receiver's demodulation operation, such as the layer configuration and the transmission parameters of the OFDM segments. It is transmitted on the carriers designated for TMCC transmission within each segment. FIG. 5A shows an example of the bit allocation of a TMCC carrier. The TMCC carrier consists of 204 bits (B0 to B203). B0 is the demodulation reference for the TMCC symbols and has a predetermined amplitude and phase. B1 to B16 form a 16-bit synchronization word; two words, w0 and w1, are defined and are sent alternately frame by frame. B17 to B19 identify the segment format, indicating whether each segment is a differential-modulation or synchronous-modulation section. B20 to B121 carry the TMCC information. B122 to B203 are parity bits.
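The 204-bit layout just described can be expressed as a small parser. The field names below are ours, chosen for illustration; only the bit positions come from the text.

```python
# Illustrative parser for the 204-bit TMCC carrier layout described above
# (B0 reference, B1-B16 sync word, B17-B19 segment format, B20-B121 TMCC
# information, B122-B203 parity).  Field names are ours, not the spec's.

def parse_tmcc(bits):
    assert len(bits) == 204, "a TMCC carrier conveys 204 bits (B0-B203)"
    return {
        "ref":          bits[0],        # B0: demodulation reference
        "sync_word":    bits[1:17],     # B1-B16: w0/w1, alternating per frame
        "segment_type": bits[17:20],    # B17-B19: differential vs. synchronous
        "tmcc_info":    bits[20:122],   # B20-B121: 102 information bits
        "parity":       bits[122:204],  # B122-B203: 82 parity bits
    }

fields = parse_tmcc([0] * 204)
print(len(fields["tmcc_info"]), len(fields["parity"]))   # 102 82
```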
 The TMCC information of the OFDM transmission wave according to this embodiment may be configured to include, for example, information that assists the receiver's demodulation and decoding operations, such as system identification, a transmission-parameter switching index, an activation control signal (activation flag for emergency warning broadcasting), current information, next information, frequency-conversion-processing identification, physical-channel-number identification, main-signal identification, 4K-signal transmission-layer identification, and additional-layer transmission identification. The current information indicates the present layer configuration and transmission parameters; the next information indicates the layer configuration and transmission parameters after switching. Transmission parameters are switched frame by frame. FIG. 5B shows an example of the bit allocation of the TMCC information, and FIG. 5C shows an example of the structure of the transmission-parameter information contained in the current/next information. Note that the coupled-transmission phase-correction amount is control information used by systems sharing this transmission scheme, such as terrestrial digital audio broadcasting ISDB-TSB (ISDB for Terrestrial Sound Broadcasting); its details are omitted here.
 FIG. 5D shows an example of the bit allocation for system identification. Two bits are allocated to the system-identification signal: '00' for the current digital terrestrial television broadcasting system, '01' for a terrestrial digital audio broadcasting system sharing the transmission scheme, and '10' for an advanced digital terrestrial television broadcasting system according to this embodiment, such as dual-polarization, single-polarization, or layer-division-multiplexing terrestrial digital broadcasting. In the advanced system, broadcast-wave transmission by the dual-polarization, single-polarization, or layer-division-multiplexing scheme allows 2K broadcast programs (programs with 1920 x 1080 pixel video, possibly including lower resolutions) and 4K broadcast programs (programs with video exceeding 1920 x 1080 pixels, not limited to 3840 x 2160 pixels) to be transmitted simultaneously within the same service.
 The transmission-parameter switching index notifies the receiver of the switching timing by counting down when the transmission parameters are to be switched. This index normally has the value '1111'; when parameters are switched, it is decremented by 1 per frame starting 15 frames before the switch. The switch takes effect at the frame synchronization following the frame that carries '0000', after which the index returns to '1111'. The countdown is performed when any one or more of the parameters shown in FIG. 5B are switched, such as the system identification of the TMCC information, the transmission-parameter information contained in the current/next information, the frequency-conversion-processing identification, the main-signal identification, the 4K-signal transmission-layer identification, or the additional-layer transmission identification. No countdown is performed when only the activation control signal of the TMCC information changes.
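The countdown behaviour can be sketched as follows. The exact phase alignment (which value is sent how many frames before the switch) is our assumption; one consistent reading is that the index decrements from 15 frames out, reaches '0000' in the last frame before the switch, and the switch itself happens at the next frame sync.

```python
# One consistent reading of the switching-index countdown described above
# (the exact phase alignment is an assumption): the index sits at 0b1111,
# starts decrementing 15 frames before the switch, reaches 0b0000 in the
# last frame, and the switch takes effect at the following frame sync.

def switch_index(frames_until_switch=None):
    """4-bit switching-index value; None means no switch is pending."""
    if frames_until_switch is None or frames_until_switch > 15:
        return 0b1111                    # steady-state value
    if frames_until_switch < 1:
        return 0b1111                    # after the switch, 0b1111 resumes
    return frames_until_switch - 1       # 15 frames out -> 14 ... last -> 0

print([switch_index(n) for n in range(15, 0, -1)])   # counts 14, 13, ..., 0
```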
 The activation control signal (activation flag for emergency warning broadcasting) is set to '1' while receiver activation control is being performed for emergency warning broadcasting, and to '0' otherwise.
 The partial-reception flag in each of the current/next information is set to '1' when the segment at the center of the transmission band is configured for partial reception, and to '0' otherwise. When segment 0 is configured for partial reception, its layer is defined as layer A. When no next information exists, the partial-reception flag is set to '1'.
 FIG. 5E shows an example of the bit allocation for the carrier-modulation mapping scheme (data-carrier modulation scheme) in the per-layer transmission parameters of the current/next information. '000' indicates DQPSK, '001' QPSK, '010' 16QAM, '011' 64QAM, '100' 256QAM, '101' 1024QAM, and '110' 4096QAM. For an unused layer, or when no next information exists, this parameter is set to '111'.
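The 3-bit field above maps directly to a lookup table; this is a straightforward transcription of the listed values, with the textual label for the '111' case chosen by us.

```python
# Lookup table for the 3-bit carrier-modulation field listed above; 0b111
# marks an unused layer or absent next information.

CARRIER_MODULATION = {
    0b000: "DQPSK",  0b001: "QPSK",    0b010: "16QAM",   0b011: "64QAM",
    0b100: "256QAM", 0b101: "1024QAM", 0b110: "4096QAM", 0b111: None,
}

def decode_modulation(bits3):
    scheme = CARRIER_MODULATION[bits3 & 0b111]
    return scheme if scheme is not None else "unused / no next info"

print(decode_modulation(0b100))   # 256QAM
```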
 Settings such as the coding rate and the time-interleaving length may be made by setting each parameter according to the organization of each layer in the current/next information. The segment count indicates the number of segments in each layer as a 4-bit value; '1111' is set for an unused layer or when no next information exists. Note that settings such as the mode and the guard-interval ratio are detected independently on the receiver side and therefore need not be conveyed in the TMCC information.
 FIG. 5F shows an example of the bit allocation for frequency-conversion-processing identification. This parameter is set to '0' when the frequency conversion processing (for the dual-polarization transmission scheme) or frequency-conversion amplification processing (for the layer-division-multiplexing transmission scheme) described later has been performed in the conversion unit 201T or 201L of FIG. 2A, and to '1' when neither has been performed. For example, the parameter may be set to '1' when the wave is sent from the broadcasting station, and rewritten to '0' by the conversion unit 201T or 201L when that unit performs the frequency conversion or frequency-conversion amplification. With this arrangement, when the second tuner/demodulator 130T or the third tuner/demodulator 130L of the broadcast receiving apparatus 100 receives the wave and finds the frequency-conversion-processing identification bit at '0', it can determine that frequency conversion or the like was performed on the OFDM transmission wave after it left the broadcasting station.
 In the dual-polarization terrestrial digital broadcasting according to this embodiment, the frequency-conversion-processing identification bit may be set or rewritten for each of the plural polarizations. For example, if neither of the two polarizations is frequency-converted by the conversion unit 201T of FIG. 2A, the identification bits in both OFDM transmission waves may be left at '1'. If only one polarization is frequency-converted by the conversion unit 201T, the identification bit in the converted polarization's OFDM transmission wave is rewritten to '0' by the conversion unit 201T. If both polarizations are frequency-converted, the identification bits in both converted waves are rewritten to '0' by the conversion unit 201T. In this way, the broadcast receiving apparatus 100 can determine, for each of the plural polarizations, whether frequency conversion has taken place.
 Since the frequency-conversion-processing identification bit is not defined in the current digital terrestrial broadcasting, it will simply be ignored by terrestrial digital broadcast receivers already in users' hands. The bit may, however, be introduced into a new terrestrial digital broadcasting service, improved from the current one, that transmits video with a maximum resolution of 1920 x 1080 pixels. In that case, the first tuner/demodulator 130C of the broadcast receiving apparatus 100 of the embodiment of the present invention may also be configured as a first tuner/demodulator compatible with that new service.
 As a modification, the bit may be set to '0' in advance when the wave is sent from the broadcasting station, on the premise that the conversion unit 201T or 201L of FIG. 2A will perform frequency conversion or frequency-conversion amplification on the OFDM transmission wave. When the received broadcast wave is not an advanced digital terrestrial broadcasting service, this parameter may be configured to be set to '1'.
 FIG. 5G shows an example of the bit allocation for physical-channel-number identification. The physical-channel-number identification is a 6-bit code identifying the physical channel number (13 to 52) of the received broadcast wave; when the received wave is not an advanced digital terrestrial broadcasting service, this parameter is set to '111111'. This identification field is not defined in the current digital terrestrial broadcasting, so current receivers could not obtain from the TMCC or AC signal the physical channel number that the broadcasting station assigned to a broadcast wave. The broadcast receiving apparatus 100 according to the embodiment of the present invention can, by using the physical-channel-number identification bits of a received OFDM transmission wave, determine the physical channel number the broadcasting station set for that wave without demodulating any carriers other than the TMCC and AC signals. The physical channels 13 to 52 are preassigned to the 470-710 MHz frequency band at 6 MHz of bandwidth per channel. Accordingly, the ability of the broadcast receiving apparatus 100 to determine the physical channel number of an OFDM transmission wave from these bits means it can determine the frequency band in which that wave was transmitted over the air as a terrestrial digital broadcast wave.
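The channel raster stated above (channels 13-52, 6 MHz each, 470-710 MHz) can be written out directly. The nominal band edges below follow from those figures; note that actual Japanese channel-center frequencies include a small offset not modeled here.

```python
# Sketch of the channel raster stated above: physical channels 13-52 at
# 6 MHz each, preassigned to 470-710 MHz.  (Actual Japanese channel-center
# frequencies include a small offset; these are the nominal band edges.)

def channel_band_mhz(ch):
    """Nominal (lower, upper) band edges in MHz for a physical channel."""
    if not 13 <= ch <= 52:
        raise ValueError("physical channels are 13-52")
    low = 470 + (ch - 13) * 6
    return low, low + 6

print(channel_band_mhz(13), channel_band_mhz(52))   # (470, 476) (704, 710)
```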
 In the dual-polarization terrestrial digital broadcasting according to this embodiment, the broadcasting station's OFDM transmission-wave generation processing may place this physical-channel-number identification field in each wave of the polarization pair occupying the bandwidth that originally constitutes one physical channel, giving both the same physical number. Depending on the installation environment of the broadcast receiving apparatus 100, the conversion unit 201T of FIG. 2A may convert the frequency of only one of the polarizations. If the frequencies of the polarization pair thereby come to differ at reception, and the receiver has no way of knowing that the waves now at different frequencies were originally a pair, it cannot demodulate the advanced terrestrial digital broadcasting that uses both polarizations of the dual-polarization service. Even in such a case, by using the physical-channel-number identification bits, the broadcast receiving apparatus 100 can recognize transmission waves at different frequencies whose identification bits show the same value as waves that were transmitted as the polarization pair originally constituting one physical channel at the broadcasting station. This makes it possible to demodulate the advanced terrestrial digital broadcasting of the dual-polarization service using those transmission waves showing the same value.
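The re-association idea above can be sketched as a grouping step. The data structures and values here are illustrative only; the patent does not prescribe any particular receiver implementation.

```python
# Sketch of the pairing idea above: even after one polarization has been
# frequency-converted, the two waves of one original physical channel can
# be re-associated because they carry the same physical-channel-number
# identification value.  Data structures and values are illustrative only.

from collections import defaultdict

def pair_by_channel_id(received):
    """received: iterable of (tuner_frequency_mhz, channel_id) tuples."""
    groups = defaultdict(list)
    for freq, ch_id in received:
        groups[ch_id].append(freq)
    # a channel id seen at two different frequencies is a converted pair
    return {ch: freqs for ch, freqs in groups.items() if len(freqs) == 2}

waves = [(473, 27), (770, 27), (485, 29)]   # id 27 appears at two frequencies
print(pair_by_channel_id(waves))            # {27: [473, 770]}
```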
 FIG. 5H shows an example of the bit allocation for main-signal identification. In this example, the main-signal identification bit is placed at bit B117.
 When the transmitted OFDM wave belongs to dual-polarization terrestrial digital broadcasting, this parameter is set to '1' in the TMCC information of the wave transmitted with the main polarization and to '0' in the TMCC information of the wave transmitted with the sub polarization. Here, the wave transmitted with the main polarization is the one of the vertically and horizontally polarized signals whose polarization direction matches the direction used for transmitting the current digital terrestrial broadcasting service. That is, in regions where the current service is transmitted with horizontal polarization, horizontal polarization is the main polarization and vertical polarization is the sub polarization in the dual-polarization service; in regions where the current service is transmitted with vertical polarization, vertical polarization is the main polarization and horizontal polarization is the sub polarization.
 In the broadcast receiving apparatus 100 that receives the dual-polarization transmission waves of the embodiment of the present invention, the main-signal identification bit makes it possible to determine whether a received wave was transmitted with the main or the sub polarization. For example, using this identification, the apparatus can, during the initial scan described later, first scan the waves transmitted with the main polarization and, after completing that scan, scan the waves transmitted with the sub polarization.
 Details of the layers and segments of the dual-polarization terrestrial digital broadcasting according to this embodiment, and example configurations of the digital broadcasting services to be transmitted, are given later. When the current digital terrestrial broadcasting service is transmitted in a layer consisting only of segments contained in the main polarization, and an advanced terrestrial digital service is transmitted in a layer including segments contained in both the main and sub polarizations, the apparatus may first perform the initial scan of the waves transmitted with the main polarization to complete the initial scan for the current service, and then perform the initial scan of the waves transmitted with the sub polarization to carry out the initial scan for the advanced service. In this way, the initial scan for the advanced service can take place after the initial scan for the current service has completed, so the settings obtained by the current service's initial scan can be reflected in the settings made by the advanced service's initial scan, which is desirable.
 Note that the meanings of '1' and '0' for the main-signal identification bit may be defined oppositely to the above.
 Instead of the main-signal identification bit, a polarization-direction identification bit may be used as a parameter of the TMCC information. Specifically, the broadcasting station sets the polarization-direction identification bit to '1' in waves transmitted with horizontal polarization and to '0' in waves transmitted with vertical polarization. In the broadcast receiving apparatus 100 that receives the dual-polarization transmission waves of the embodiment of the present invention, this bit makes it possible to determine the polarization direction with which a received wave was transmitted. For example, using this identification, the apparatus can, during the initial scan described later, first scan the waves transmitted with horizontal polarization and, after completing that scan, scan the waves transmitted with vertical polarization. The effect of this processing follows the description of the initial scan for the main-signal identification bit above, reading 'main polarization' as 'horizontal polarization' and 'sub polarization' as 'vertical polarization', so it is not repeated here.
 Note that the meanings of '1' and '0' for the polarization-direction identification bit may be defined oppositely to the above.
 Also, instead of the main-signal identification bit, a first-signal/second-signal identification bit may be used as a parameter of the TMCC information. Specifically, one of the horizontal and vertical polarizations is defined as the first polarization and the broadcast signal transmitted with it as the first signal, for which the broadcasting station sets the first-signal/second-signal identification bit to '1'; the other polarization is defined as the second polarization and the broadcast signal transmitted with it as the second signal, for which the bit is set to '0'. In the broadcast receiving apparatus 100 that receives the dual-polarization transmission waves of the embodiment of the present invention, this bit makes it possible to determine the polarization direction with which a received wave was transmitted. This identification bit merely replaces the notions of 'main polarization' and 'sub polarization' in the definition of the main-signal identification bit with 'first polarization' and 'second polarization'; the processing and effects in the broadcast receiving apparatus 100 follow the description of that bit, reading 'main polarization' as 'first polarization' and 'sub polarization' as 'second polarization', so they are not repeated here.
Note that the meanings of "1" and "0" of the first-signal/second-signal identification bit may be defined in the opposite way to the above explanation.
Note that the main signal identification, polarization direction identification, and first-signal/second-signal identification described above are not essential when the broadcast wave is a single-polarization terrestrial digital broadcasting service according to this embodiment or is not an advanced terrestrial digital broadcasting service; in such cases, the corresponding parameter may simply be set to "1".
Next, in the transmission wave of the layer division multiplexing terrestrial digital broadcasting according to this embodiment, an upper/lower layer identification bit may be used as a parameter of the TMCC information instead of the main signal identification bit described above. Specifically, the upper/lower layer identification bit is set to "1" in the TMCC information of the modulated wave transmitted in the upper layer, and to "0" in the TMCC information of the modulated wave transmitted in the lower layer. If the broadcast wave is not an advanced terrestrial digital broadcasting service, this parameter may be set to "1".
In the layer division multiplexing terrestrial digital broadcasting according to this embodiment, of the multiple modulated waves that the broadcast station side originally generates for transmission in the upper layer and the lower layer of one physical channel in the OFDM transmission wave generation process, the lower layer may, depending on the installation environment of the broadcast receiving apparatus 100, undergo frequency conversion and signal amplification in the conversion unit 201L of FIG. 2A. When receiving a transmission wave of layer division multiplexing terrestrial digital broadcasting, the broadcast receiving apparatus 100 can identify, based on the upper/lower layer identification bit described above, whether a modulated wave was originally transmitted in the upper layer or in the lower layer. For example, this identification allows the initial scan for the advanced terrestrial digital broadcasting service transmitted in the lower layer to be performed after the initial scan for the current terrestrial digital broadcasting service transmitted in the upper layer has been completed, so that the settings obtained by the initial scan of the current terrestrial digital broadcasting service can be reflected in the settings obtained by the initial scan of the advanced terrestrial digital broadcasting service. Furthermore, in the third tuner/demodulator 130L of the broadcast receiving apparatus 100, the identification result can be used to switch between the processing of the demodulator 133S and that of the demodulator 133L.
In the description of the dual-polarization transmission system in each of the following embodiments, unless otherwise specified, an example is described in which horizontal polarization is the main polarization and vertical polarization is the secondary polarization. However, the main/secondary relationship between horizontal polarization and vertical polarization may be reversed.
FIG. 5I shows an example of the bit allocation of the 4K signal transmission layer identification.
When the broadcast wave to be transmitted is a transmission wave of the dual-polarization terrestrial digital broadcasting service according to this embodiment, the 4K signal transmission layer identification bits indicate, for each of the B layer and the C layer, whether a 4K broadcast program is transmitted using both the horizontally polarized signal and the vertically polarized signal. One bit is assigned to the setting of the B layer and one bit to the setting of the C layer. For example, when the 4K signal transmission layer identification bit for the B layer or the C layer is "0", it indicates that a 4K broadcast program is transmitted in that layer using both the horizontally polarized signal and the vertically polarized signal; when the bit for that layer is "1", it indicates that no 4K broadcast program using both polarized signals is transmitted in that layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bits to identify, for each of the B layer and the C layer, whether a 4K broadcast program is transmitted using both the horizontally polarized signal and the vertically polarized signal.
When the broadcast wave to be transmitted is a transmission wave of the single-polarization terrestrial digital broadcasting service according to this embodiment, the 4K signal transmission layer identification bits indicate, for each of the B layer and the C layer, whether a 4K broadcast program is transmitted in that layer. One bit is assigned to the setting of the B layer and one bit to the setting of the C layer. For example, when the bit for the B layer or the C layer is "0", it indicates that a 4K broadcast program is transmitted in that layer; when the bit is "1", it indicates that no 4K broadcast program is transmitted in that layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bits to identify whether a 4K broadcast program is transmitted in each of the B layer and the C layer.
When the broadcast wave to be transmitted is a broadcast wave of the layer division multiplexing terrestrial digital broadcasting service of this embodiment, the 4K signal transmission layer identification bit indicates whether a 4K broadcast program is transmitted in the lower layer. When bit B119 of this parameter is "0", a 4K broadcast program is transmitted in the lower layer; when B119 is "1", no 4K broadcast program is transmitted in the lower layer. In this way, the broadcast receiving apparatus 100 can use the 4K signal transmission layer identification bit to identify whether a 4K broadcast program is transmitted in the lower layer. Note that when the broadcast wave to be transmitted is a broadcast wave of the layer division multiplexing terrestrial digital broadcasting service of this embodiment, bit B118 of this parameter may be left undefined.
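As an illustrative sketch only, the per-service interpretation of the 4K signal transmission layer identification bits described above can be summarized as follows. The function name, the service-type labels, and the pairing of B118 with the B layer and B119 with the C layer are assumptions for illustration (the text assigns one bit per layer without fixing the order), and the "0"/"1" meanings may be reversed in an actual operation, as noted below.

```python
# Illustrative sketch of the 4K signal transmission layer identification.
# Assumptions (not from the specification): B118 <-> B layer, B119 <-> C layer,
# and the string labels for the three transmission systems.
def layers_carrying_4k(service: str, b118: int, b119: int):
    """Return the set of layers in which a 4K broadcast program is sent.

    Bit value 0 = 4K program transmitted, 1 = not transmitted
    (the definitions may be reversed, per the text).
    """
    if service in ("dual-polarization", "single-polarization"):
        layers = set()
        if b118 == 0:
            layers.add("B")
        if b119 == 0:
            layers.add("C")
        return layers
    if service == "layer-division-multiplexing":
        # For LDM, B118 is undefined; only B119 (lower layer) is evaluated.
        return {"lower"} if b119 == 0 else set()
    raise ValueError("not an advanced terrestrial digital broadcasting service")
```

For a dual-polarization wave with B118 = 0 and B119 = 1, for example, only the B layer would carry a 4K program under this reading.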
Note that when this parameter is "0", an NUC (Non-Uniform Constellation) modulation scheme can be adopted as the carrier modulation mapping scheme in addition to the basic modulation schemes shown in FIG. 5E. In this case, the current/next information of the additional transmission parameter information for the B layer/C layer can be transmitted using AC1 or the like.
If the broadcast wave to be transmitted is not an advanced terrestrial digital broadcasting service, each of these parameters may be set to "1".
Note that the definitions of "0" and "1" of the 4K signal transmission layer identification bits described above may be reversed.
FIG. 5J shows an example of the bit allocation of the additional layer transmission identification. When the broadcast wave to be transmitted is the dual-polarization terrestrial digital broadcasting service of this embodiment, the additional layer transmission identification bits indicate, for each of the B layer and the C layer of the transmission wave transmitted with the secondary polarization, whether that layer is used as a virtual D layer or a virtual E layer.
For example, in the illustrated example, the bit placed at B120 is the D layer transmission identification bit. When this parameter is "0", the B layer transmitted with the secondary polarization is used as a virtual D layer. Stated precisely, this means that, among the segments transmitted with the secondary polarization, the group of segments having the same segment numbers as the segments belonging to the B layer transmitted with the main polarization is treated as a D layer, a layer distinct from the B layer transmitted with the main polarization. When this parameter is "1", the B layer transmitted with the secondary polarization is not used as a virtual D layer but is used as the B layer.
Likewise, the bit placed at B121 is the E layer transmission identification bit. When this parameter is "0", the C layer transmitted with the secondary polarization is used as a virtual E layer. Stated precisely, this means that, among the segments transmitted with the secondary polarization, the group of segments having the same segment numbers as the segments belonging to the C layer transmitted with the main polarization is treated as an E layer, a layer distinct from the C layer transmitted with the main polarization. When this parameter is "1", the C layer transmitted with the secondary polarization is not used as a virtual E layer but is used as the C layer.
In this way, the broadcast receiving apparatus 100 can use the additional layer transmission identification bits (the D layer transmission identification bit and/or the E layer transmission identification bit) to identify the presence or absence of a D layer and an E layer transmitted with the secondary polarization. That is, in the terrestrial digital broadcasting according to this embodiment, the additional layer transmission identification parameters shown in FIG. 5J make it possible to operate new layers (the D layer and the E layer in the example of FIG. 5J) beyond the three layers (A, B, and C) to which the current terrestrial digital broadcasting is limited.
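The layer set in use on each polarization under the B120/B121 bits above can be sketched as follows. This is a minimal illustration, not the receiver's actual implementation; the function name and list representation are assumptions, and it adopts the reading in the text that the secondary polarization also carries an A layer alongside the D/E (or B/C) layers.

```python
# Sketch of the additional layer transmission identification (FIG. 5J):
# B120 = 0 -> treat the secondary-polarization B layer as a virtual D layer,
# B121 = 0 -> treat the secondary-polarization C layer as a virtual E layer.
# Names and the list form are illustrative assumptions.
def active_layers(secondary_pol: bool, b120: int, b121: int):
    """Layers in use on one polarization, given the identification bits."""
    if not secondary_pol:
        # The main polarization always carries the A, B and C layers.
        return ["A", "B", "C"]
    layers = ["A"]
    layers.append("D" if b120 == 0 else "B")   # virtual D layer in use?
    layers.append("E" if b121 == 0 else "C")   # virtual E layer in use?
    return layers
```

With both bits at "0", the two polarizations together would thus operate five distinct layers (A, B, C, D, E).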
Note that when this parameter is "0", parameters such as the carrier modulation mapping scheme, the coding rate, and the time interleaving length shown in FIG. 5C can be made to differ between the virtual D layer/virtual E layer and the B layer/C layer. In this case, if the current/next information of parameters such as the carrier modulation mapping scheme, the convolutional coding rate, and the time interleaving length for the virtual D layer/virtual E layer is transmitted using AC information (for example, AC1), the broadcast receiving apparatus 100 can grasp those parameters for the virtual D layer/virtual E layer.
As a modification, when an additional layer transmission identification bit (the D layer transmission identification bit and/or the E layer transmission identification bit) is "0", the transmission parameters for the B layer and/or the C layer in the current/next information of the TMCC information transmitted with the secondary polarization may be reinterpreted as the transmission parameters for the virtual D layer and/or the virtual E layer. In this case, when the virtual D layer and/or the virtual E layer is used, the main polarization uses the A, B, and C layers, and the transmission parameters of these layers are carried in the current/next information of the TMCC information transmitted with the main polarization; the secondary polarization uses the A, D, and E layers, and the transmission parameters of these layers are carried in the current/next information of the TMCC information transmitted with the secondary polarization. Even in this case, the broadcast receiving apparatus 100 can grasp parameters such as the carrier modulation mapping scheme, the convolutional coding rate, and the time interleaving length for the virtual D layer/virtual E layer.
If the broadcast wave to be transmitted is not an advanced terrestrial digital broadcasting service, or is an advanced terrestrial digital broadcasting service that uses the single-polarization transmission system or the layer division multiplexing transmission system, each of these parameters may be set to "1".
Note that the additional layer transmission identification parameter may be stored in both the TMCC information of the main polarization and the TMCC information of the secondary polarization; however, all of the processing described above can be realized as long as it is stored at least in the TMCC information of the secondary polarization.
The definitions of "0" and "1" of the additional layer transmission identification bits described above may also be reversed.
Note that when the 4K signal transmission layer identification parameter described above indicates that a 4K broadcast program is transmitted in the B layer, the broadcast receiving apparatus 100 may ignore the D layer transmission identification bit even if that bit indicates that the B layer is to be used as a virtual D layer. Similarly, when the 4K signal transmission layer identification parameter indicates that a 4K broadcast program is transmitted in the C layer, the broadcast receiving apparatus 100 may be configured to ignore the E layer transmission identification bit even if that bit indicates that the C layer is to be used as a virtual E layer. Clarifying the priority of the bits used in the determination processing in this way prevents conflicts in the determination processing of the broadcast receiving apparatus 100.
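The priority rule above, under which the 4K signal transmission layer identification overrides the D/E layer transmission identification bit for the same layer, can be sketched as a single decision function. The names are illustrative assumptions; only the precedence logic comes from the text.

```python
# Sketch of the bit-priority rule: when the 4K signal transmission layer
# identification says a layer carries a 4K program, the receiver may ignore
# the corresponding D/E layer transmission identification bit.
# Function and argument names are illustrative assumptions.
def use_virtual_layer(carries_4k: bool, layer_id_bit: int) -> bool:
    """True if the B (or C) layer should be treated as a virtual D (or E) layer."""
    if carries_4k:
        # The 4K identification has priority; the D/E bit is ignored,
        # preventing a conflict in the determination processing.
        return False
    return layer_id_bit == 0
```

This makes the conflict case explicit: a layer flagged as carrying a 4K program is never simultaneously treated as a virtual layer.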
In the broadcast wave to be transmitted, the frequency conversion processing identification bit, the physical channel number identification bit, the main signal identification bit, the 4K signal transmission identification bits, the additional layer transmission identification bits, and the like described above are, as a rule, all set to "1" when the system identification parameter described above is not "10". Even if, due to some problem, one of these bits is exceptionally not "1" although the system identification parameter is not "10", the broadcast receiving apparatus 100 may be configured to ignore the bit that is not "1" and determine that all of these bits are "1".
FIG. 5K shows an example of the bit allocation of the "coding rate" bits shown in FIG. 5C, that is, the coding rate identification for error correction.
In the current 2K terrestrial digital broadcasting system, an identification bit that conveys a coding rate dedicated to convolutional codes is transmitted. In the digital broadcasting according to this embodiment, however, the 4K advanced terrestrial digital broadcasting service can be broadcast mixed with the 2K terrestrial digital broadcasting service, and, as already explained, the 4K advanced terrestrial digital broadcasting service can use an LDPC code as the inner code.
Therefore, unlike in the current 2K terrestrial digital broadcasting system, the coding rate identification bits for error correction according to this embodiment shown in FIG. 5K are not dedicated to convolutional codes but are configured to support LDPC codes as well.
Here, bits arranged in a common field are used as the coding rate identification bits regardless of whether the inner code of the target terrestrial digital broadcasting service is a convolutional code or an LDPC code, which saves bits. Furthermore, by interpreting the same identification bits independently depending on whether the inner code of the target service is a convolutional code or an LDPC code, the digital broadcasting system can adopt a set of coding rate options suitable for each coding scheme.
Specifically, in the example of FIG. 5K, when the identification bits are "000", the coding rate is 1/2 if the inner code is a convolutional code and 2/3 if the inner code is an LDPC code. When the identification bits are "001", the coding rate is 2/3 for a convolutional code and 3/4 for an LDPC code. When the identification bits are "010", the coding rate is 3/4 for a convolutional code and 5/6 for an LDPC code. When the identification bits are "011", the coding rate is 5/6 for a convolutional code and 2/16 for an LDPC code. When the identification bits are "100", the coding rate is 7/8 for a convolutional code and 6/16 for an LDPC code. When the identification bits are "101", the value is undefined for a convolutional code and indicates a coding rate of 10/16 for an LDPC code. When the identification bits are "110", the value is undefined for a convolutional code and indicates a coding rate of 14/16 for an LDPC code. For an unused layer, or when no next information exists, this parameter is set to "111". Note that the coding rate 2/3 above may be used in place of a coding rate of 81/120, the coding rate 3/4 in place of 89/120, and the coding rate 5/6 in place of 101/120. Coding rates such as 8/16 and 12/16 may also be assigned.
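The dual interpretation of the 3-bit field above can be written as a small lookup table, one entry per identification value with a column per inner code. This is an illustrative sketch of the mapping described in the text for FIG. 5K; the dictionary layout, key names, and function name are assumptions, and the fraction strings simply repeat the rates listed above.

```python
# Sketch of the FIG. 5K coding rate identification: the same 3-bit field is
# read against the inner code of the service (convolutional for the current
# service, LDPC for the advanced service). Layout is an illustrative assumption.
CODING_RATE = {
    "000": {"convolutional": "1/2",  "ldpc": "2/3"},
    "001": {"convolutional": "2/3",  "ldpc": "3/4"},
    "010": {"convolutional": "3/4",  "ldpc": "5/6"},
    "011": {"convolutional": "5/6",  "ldpc": "2/16"},
    "100": {"convolutional": "7/8",  "ldpc": "6/16"},
    "101": {"convolutional": None,   "ldpc": "10/16"},  # undefined for conv.
    "110": {"convolutional": None,   "ldpc": "14/16"},  # undefined for conv.
}

def coding_rate(bits: str, inner_code: str):
    """Resolve the 3-bit identification for the given inner code."""
    if bits == "111":
        # Unused layer, or no next information exists.
        return "unused layer / no next information"
    return CODING_RATE[bits][inner_code]
```

The table makes the bit-saving visible: seven useful code points serve both schemes, instead of separate fields per inner code.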
Whether the inner code of the target terrestrial digital broadcasting service is a convolutional code or an LDPC code may be identified using the result of identifying whether that service is a current terrestrial digital broadcasting service or an advanced terrestrial digital broadcasting service. This identification may be performed using the identification bits described with reference to FIG. 5D or FIG. 5I. Here, if the target service is a current terrestrial digital broadcasting service, the inner code is identified as a convolutional code; if the target service is an advanced terrestrial digital broadcasting service, the inner code is identified as an LDPC code.
As another example, whether the inner code of the target terrestrial digital broadcasting service is a convolutional code or an LDPC code may be identified based on the error correction scheme identification bits described later with reference to FIG. 6I.
The coding rate identification bits for error correction shown in FIG. 5K and described above are preferable because they support multiple inner code schemes while preventing an increase in the number of identification bits.
Note that, in the dual-polarization advanced terrestrial digital broadcasting service, the TMCC information of the transmission wave transmitted with horizontal polarization and that of the transmission wave transmitted with vertical polarization may be identical or may differ. Similarly, in the layer division multiplexing advanced terrestrial digital broadcasting service, the TMCC information of the transmission wave transmitted in the upper layer and that of the transmission wave transmitted in the lower layer may be identical or may differ. Furthermore, the frequency conversion processing identification parameter, the main signal identification parameter, the additional layer transmission identification, and the like described above may be described only in the TMCC information of the transmission wave transmitted with the secondary polarization or of the transmission wave transmitted in the lower layer.
In the above description, examples were given in which the frequency conversion processing identification parameter, the main signal identification parameter, the polarization direction identification parameter, the first-signal/second-signal identification parameter, the upper/lower layer identification parameter, the 4K signal transmission layer identification parameters, and the additional layer transmission identification parameters are included in and transmitted by the TMCC signal (TMCC carriers). However, these parameters may instead be included in and transmitted by the AC signal (AC carriers). That is, these parameters need only be transmitted by signals on carriers (TMCC carriers, AC carriers, and the like) that are modulated with a mapping scheme having fewer states than the modulation scheme of the data carriers.
[AC signal]
The AC signal is an additional information signal relating to broadcasting, carrying either additional information on the transmission control of the modulated wave or seismic motion warning information. The seismic motion warning information is transmitted using the AC carriers of segment 0, whereas the additional information on transmission control of the modulated wave can be transmitted using any AC carrier. FIG. 6A shows an example of the bit allocation of the AC signal. The AC signal consists of 204 bits (B0 to B203). B0 is the demodulation reference signal for the AC symbol and has a predetermined amplitude and phase reference. B1 to B3 are signals for identifying the configuration of the AC signal. B4 to B203 are used to transmit either the additional information on transmission control of the modulated wave or the seismic motion warning information.
FIG. 6B shows an example of the bit allocation of the AC signal configuration identification. When seismic motion warning information is transmitted using B4 to B203 of the AC signal, this parameter is set to "001" or "110". In that case, the configuration identification parameter ("001" or "110") uses the same code as the first three bits (B1 to B3) of the synchronization signal of the TMCC signal, and the two values are sent out alternately frame by frame at the same timing as the TMCC signal. When this parameter has any other value, it indicates that additional information on transmission control of the modulated wave is transmitted using B4 to B203 of the AC signal. In this case, the AC signal configuration identification parameter sends out "000" and "111", or "010" and "101", or "011" and "100", alternately frame by frame.
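The configuration identification scheme above can be sketched as two small helpers: one classifying the payload carried in B4 to B203, and one producing the value sent in the next frame. Note that each of the pairs listed ("001"/"110", "000"/"111", "010"/"101", "011"/"100") is a bitwise complement, so the frame-by-frame alternation reduces to complementing the code; the function names and the return labels are illustrative assumptions.

```python
# Sketch of the AC signal configuration identification (FIG. 6B).
# "001"/"110" mark seismic motion warning information; the other three
# complementary pairs mark transmission-control additional information.
# Names and labels are illustrative assumptions.
SEISMIC_CODES = {"001", "110"}

def ac_payload_kind(config_bits: str) -> str:
    """Classify what B4-B203 of this AC frame carry."""
    if config_bits in SEISMIC_CODES:
        return "seismic motion warning information"
    return "transmission control additional information"

def next_frame_code(config_bits: str) -> str:
    """Each pair alternates frame by frame; the partner is the bitwise complement."""
    return "".join("1" if b == "0" else "0" for b in config_bits)
```

Complementing per frame keeps the classification stable while giving the receiver a per-frame toggle to lock onto.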
 AC信号のB4~B203は、変調波の伝送制御に関する付加情報の伝送または地震動警報情報の伝送に用いられる。 B4 to B203 of the AC signal are used to transmit additional information related to modulated wave transmission control or seismic motion warning information.
Additional information on transmission control of the modulated wave may be transmitted with various bit configurations. For example, the frequency conversion processing identification, the physical channel number identification, the main signal identification, the 4K signal transmission layer identification, the additional layer transmission identification, and the like mentioned in the description of the TMCC signal may, instead of or in addition to the TMCC signal, be assigned bits within the additional information on transmission control of the modulated wave of the AC signal and transmitted there. In this way, the broadcast receiving apparatus 100 can use these parameters to perform the various identification processes already described for the TMCC signal. In addition, the current/next information of the additional transmission parameter information may be assigned here for the transmission layer of a 4K broadcast program when one of the 4K signal transmission layer identification parameters is "0", or for the virtual D layer/virtual E layer when one of the additional layer transmission identification parameters is "0". In this way, the broadcast receiving apparatus 100 can use these parameters to acquire the transmission parameters of each layer and control the demodulation processing of each layer.
Transmission of seismic motion warning information may be performed using the bit allocation shown in FIG. 6C. The seismic motion warning information consists of a synchronization signal, a start/end flag, an update flag, a signal identification, seismic motion warning detailed information, a CRC, parity bits, and so on. The synchronization signal is a 13-bit code identical to the 13 bits (B4 to B16) of the synchronization signal of the TMCC signal excluding its first three bits. When the configuration identification of the AC signal indicates that seismic motion warning information is transmitted, the 16-bit code combining the configuration identification and the synchronization signal forms a 16-bit synchronization word identical to the TMCC synchronization signal. The start/end flag is a 2-bit code that flags the start/end timing of the seismic motion warning information; it is changed from "11" to "00" when transmission of the seismic motion warning information starts, and from "00" to "11" when transmission ends. The update flag is a 2-bit code; each time the contents of the series of seismic motion warning detailed information transmitted while the start/end flag is "00" change, the update flag is incremented by "1" from an initial value of "00", wrapping from "11" back to "00". When the start/end flag is "11", the update flag is also "11".
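As a minimal illustrative sketch of the flag behaviour described above (the class and method names are ours, not from the specification):

```python
# Sketch of the start/end flag and update flag transitions described above.
# Field values are the 2-bit codes from the description; names are illustrative.

class SeismicWarningFlags:
    def __init__(self):
        self.start_end = 0b11   # "11": no seismic motion warning being sent
        self.update = 0b11      # update flag is also "11" while idle

    def start_transmission(self):
        # Start of warning transmission: start/end flag "11" -> "00"
        self.start_end = 0b00
        self.update = 0b00      # initial value while a warning is active

    def content_changed(self):
        # Each change of the detailed information increments the update
        # flag by 1, wrapping "11" -> "00".
        assert self.start_end == 0b00
        self.update = (self.update + 1) & 0b11

    def end_transmission(self):
        # End of warning transmission: start/end flag "00" -> "11"
        self.start_end = 0b11
        self.update = 0b11
```

A receiver can use the update flag to detect that a new version of the detailed information has arrived without comparing the full 88-bit payload.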
FIG. 6D shows an example of the bit allocation of the signal identification. The signal identification is a 3-bit code used to identify the type of the seismic motion warning detailed information. When this parameter is "000", it means "seismic motion warning detailed information (applicable area exists)". When it is "001", it means "seismic motion warning detailed information (no applicable area)". When it is "010", it means "test signal of seismic motion warning detailed information (applicable area exists)". When it is "011", it means "test signal of seismic motion warning detailed information (no applicable area)". When it is "111", it means "no seismic motion warning detailed information". Note that when the start/end flag is "00", the signal identification is "000", "001", "010", or "011"; when the start/end flag is "11", the signal identification is "111".
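A sketch of a decoder for this 3-bit field, including the consistency rule between the start/end flag and the signal identification (labels abbreviate the wording above; the function name is ours):

```python
# Illustrative decoder for the 3-bit signal identification (FIG. 6D).

SIGNAL_ID = {
    0b000: "detailed info (applicable area exists)",
    0b001: "detailed info (no applicable area)",
    0b010: "test signal (applicable area exists)",
    0b011: "test signal (no applicable area)",
    0b111: "no detailed info",
}

def decode_signal_id(bits: int, start_end_flag: int) -> str:
    # Per the description: start/end flag "00" implies a real or test
    # warning ("000"-"011"); start/end flag "11" implies "111".
    if start_end_flag == 0b00 and bits not in (0b000, 0b001, 0b010, 0b011):
        raise ValueError("inconsistent signal identification for start/end '00'")
    if start_end_flag == 0b11 and bits != 0b111:
        raise ValueError("inconsistent signal identification for start/end '11'")
    return SIGNAL_ID.get(bits, "reserved")
```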
The seismic motion warning detailed information is an 88-bit code. When the signal identification is "000", "001", "010", or "011", the seismic motion warning detailed information carries information such as the current time at which the seismic motion warning information is sent, the area targeted by the seismic motion warning, and the latitude/longitude/seismic intensity of the epicenter of the earthquake subject to the warning. FIG. 6E shows an example of the bit allocation of the seismic motion warning detailed information when the signal identification is "000", "001", "010", or "011". When the signal identification is "111", the bits of the seismic motion warning detailed information can instead be used to transmit, for example, a code identifying the broadcaster. FIG. 6F shows an example of the bit allocation of the seismic motion warning detailed information when the signal identification is "111".
The CRC is a code generated for B21 to B111 of the seismic motion warning information using a predetermined generator polynomial. The parity bits are a code generated for B17 to B121 of the seismic motion warning information using the shortened code (187, 105) of the difference-set cyclic code (273, 191).
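The generator polynomial is left "predetermined" above and is not specified here, so the following is only a generic bit-serial CRC sketch; the polynomial and width below are placeholders, not values from this specification:

```python
# Generic MSB-first bit-serial CRC over a bit range such as B21-B111.
# POLY and CRC_WIDTH are hypothetical placeholders, NOT from the spec.

POLY = 0x8005        # placeholder 16-bit polynomial
CRC_WIDTH = 16       # placeholder width

def crc_over_bits(bits, poly=POLY, width=CRC_WIDTH):
    """Compute a CRC over an MSB-first iterable of 0/1 bits."""
    reg = 0
    mask = (1 << width) - 1
    top = 1 << (width - 1)
    for b in bits:
        reg ^= (b & 1) << (width - 1)   # fold the input bit into the top bit
        if reg & top:
            reg = ((reg << 1) ^ poly) & mask
        else:
            reg = (reg << 1) & mask
    return reg
```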
The broadcast receiving device 100 can use the parameters related to the seismic motion warning described with reference to FIGS. 6C, 6D, 6E, and 6F to perform various controls for dealing with an emergency. For example, it can control the presentation of information related to the seismic motion warning, switch low-priority display contents to a display related to the seismic motion warning, or terminate an application display and switch to a display related to the seismic motion warning or to the broadcast program video.
FIG. 6G shows an example of the bit allocation of the additional information regarding modulated-wave transmission control. This additional information consists of a synchronization signal, current information, next information, parity bits, and so on. The synchronization signal is a 13-bit code identical to the 13 bits (B4 to B16) of the synchronization signal of the TMCC signal excluding its first three bits; it does not, however, have to be identical to those 13 bits. When the configuration identification of the AC signal indicates that additional information regarding modulated-wave transmission control is transmitted, the 16-bit code combining the configuration identification and the synchronization signal forms a 16-bit synchronization word conforming to the TMCC synchronization signal; it may also be a 16-bit synchronization word different from the TMCC synchronization signal. The current information indicates the current values of the transmission parameter additional information used when a 4K broadcast program is transmitted in the B layer or the C layer, and of the transmission parameters regarding the virtual D layer or the virtual E layer. The next information indicates the values of those same parameters after switching.
In the example of FIG. 6G, B18 to B30 of the current information are the current values of the B-layer transmission parameter additional information, that is, of the transmission parameter additional information used when a 4K broadcast program is transmitted in the B layer. Likewise, B31 to B43 of the current information are the current values of the C-layer transmission parameter additional information. B70 to B82 of the next information are the values of the B-layer transmission parameter additional information after the transmission parameters are switched, and B83 to B95 of the next information are the values of the C-layer transmission parameter additional information after the transmission parameters are switched. Here, the transmission parameter additional information is a set of modulation-related transmission parameters that extends the specification in addition to the transmission parameters of the TMCC information shown in FIG. 5C. The specific contents of the transmission parameter additional information will be described later.
In the example of FIG. 6G, B44 to B56 of the current information are the current values of the transmission parameters for the virtual D layer when the virtual D layer is operated, and B57 to B69 of the current information are the current values of the transmission parameters for the virtual E layer when the virtual E layer is operated. B96 to B108 of the next information are the values of the transmission parameters for the virtual D layer after switching, and B109 to B121 of the next information are the values of the transmission parameters for the virtual E layer after switching. The parameters stored in the transmission parameters for the virtual D layer and for the virtual E layer may be the same as those shown in FIG. 5C.
The virtual D layer and the virtual E layer are layers that do not exist in the current terrestrial digital broadcasting. Because the TMCC information of FIG. 5B must remain compatible with the current terrestrial digital broadcasting, it is not easy to increase its number of bits. Therefore, in the embodiment of the present invention, the transmission parameters for the virtual D layer and the virtual E layer are stored not in the TMCC information but in the AC information, as shown in FIG. 6G.
This makes it possible to transmit modulation-related information about the new virtual D layer and virtual E layer to the receiving device while keeping the TMCC information compatible with the current terrestrial digital broadcasting. As a result, when the B layer/C layer of the transmission wave carried by the secondary polarization in the dual-polarization terrestrial digital broadcasting service according to this embodiment is used as the virtual D layer/virtual E layer, the transmission parameters of the virtual D layer/virtual E layer of the wave carried by the secondary polarization can be set differently from the transmission parameters of the B layer/C layer of the wave carried by the main polarization.
Note that when the virtual D layer or the virtual E layer is not used, the broadcast receiving device 100 may safely ignore the transmission parameter information for the unused layer. For example, when the additional layer transmission identification parameter of the TMCC information of FIG. 5J indicates "1" for the virtual D layer or the virtual E layer (indicating that the virtual D layer/virtual E layer is not used), the broadcast receiving device 100 may be configured to ignore whatever values are contained in the transmission parameters shown in FIG. 6G for that unused layer.
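The gating rule above can be sketched as follows (function and argument names are ours; per the description, the TMCC additional layer transmission identification value "1" means the virtual layer is not used):

```python
# Illustrative sketch: decode the AC-carried virtual-layer transmission
# parameters only when the TMCC additional layer transmission
# identification indicates the layer is in use.

def virtual_layer_params(tmcc_additional_layer_id: int, ac_param_bits: int):
    """Return the raw parameter bits for one virtual layer, or None."""
    if tmcc_additional_layer_id == 1:
        # Layer not used: the AC bits for this layer are ignored,
        # whatever values they contain.
        return None
    # In a real receiver the 13-bit field would be decoded here into
    # the individual parameters of FIG. 5C; we return it raw.
    return ac_param_bits
```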
Next, the transmission parameter additional information described with reference to FIG. 6G will be explained in detail.
FIG. 6H shows a specific example of the transmission parameter additional information. The transmission parameter additional information can include an error correction scheme parameter, a constellation format parameter, and the like.
The error correction scheme indicates which coding schemes are used as the inner-code and outer-code error correction schemes when a 4K broadcast program (advanced terrestrial digital broadcasting service) is transmitted in the B layer or the C layer. FIG. 6I shows an example of the bit allocation of the error correction scheme. When this parameter is "000", a convolutional code is used as the inner code and a shortened RS code as the outer code when a 4K broadcast program is transmitted in the B layer or the C layer. When this parameter is "001", an LDPC code is used as the inner code and a BCH code as the outer code. Further combinations may also be defined and made selectable.
Furthermore, when a 4K broadcast program is transmitted in the B layer or the C layer, not only a uniform constellation but also a non-uniform constellation (NUC) can be adopted as the carrier modulation mapping scheme. FIG. 6J shows an example of the bit allocation of the constellation format. When this parameter is "000", the carrier modulation mapping scheme selected by the transmission parameters of the TMCC information is applied with a uniform constellation. When this parameter is any of "001" to "111", the carrier modulation mapping scheme selected by the transmission parameters of the TMCC information is applied with a non-uniform constellation. Note that when a non-uniform constellation is applied, its optimal values differ depending on the type of error correction scheme, its coding rate, and so on. Therefore, when the constellation format parameter is any of "001" to "111", the broadcast receiving device 100 of this embodiment may determine the non-uniform constellation used in the demodulation processing based on the carrier modulation mapping scheme parameter, the error correction scheme parameter, and the coding rate parameter, for example by referring to a predetermined table stored in advance in the broadcast receiving device 100.
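The table lookup described above can be sketched as follows; the table keys and the constellation identifiers are hypothetical placeholders, since the actual NUC point sets would be stored in the receiver in advance:

```python
# Illustrative sketch of NUC selection. Entries are placeholders only;
# the real table would map each (mapping scheme, FEC scheme, coding
# rate) triple to a pre-stored non-uniform constellation.

NUC_TABLE = {
    ("64QAM", "LDPC+BCH", "3/4"): "NUC64_example_A",    # placeholder
    ("256QAM", "LDPC+BCH", "2/3"): "NUC256_example_B",  # placeholder
}

def select_constellation(constellation_format: int, mapping: str,
                         fec: str, rate: str) -> str:
    if constellation_format == 0b000:
        # "000": apply the TMCC-selected mapping with a uniform constellation.
        return "uniform:" + mapping
    # "001"-"111": apply it with a non-uniform constellation, chosen by
    # mapping scheme, error correction scheme, and coding rate.
    return NUC_TABLE[(mapping, fec, rate)]
```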
[Transmission scheme 1 of the advanced terrestrial digital broadcasting service]
In order to realize 4K (3840 horizontal pixels x 2160 vertical pixels) broadcasting while maintaining the viewing environment of the current terrestrial digital broadcasting service, a dual-polarization transmission scheme will be described as an example of a transmission scheme of the advanced terrestrial digital broadcasting service according to an embodiment of the present invention. The dual-polarization transmission scheme according to the embodiment of the present invention shares some specifications with the current terrestrial digital broadcasting scheme. For example, the 13 segments within the approximately 6 MHz band corresponding to one physical channel are divided so that 7 segments are allocated to the transmission of 2K (1920 horizontal pixels x 1080 vertical pixels) broadcast programs, 5 segments to the transmission of 4K broadcast programs, and 1 segment to mobile reception (so-called One-Seg broadcasting). Furthermore, the 5 segments for 4K broadcasting use not only horizontally polarized signals but also vertically polarized signals, securing a transmission capacity equivalent to a total of 10 segments by means of MIMO (Multiple-Input Multiple-Output) technology. 2K broadcast programs maintain image quality through optimization of the latest MPEG-2 Video compression technology and remain receivable by current television receivers, while 4K broadcast programs secure image quality through optimization of HEVC compression technology, which is more efficient than MPEG-2 Video, higher-order modulation, and the like. Note that the number of segments allocated to each broadcast may differ from the above.
FIG. 7A shows an example of the dual-polarization transmission scheme in the advanced terrestrial digital broadcasting service according to an embodiment of the present invention. The frequency band of 470 MHz to 710 MHz is used to transmit the broadcast waves of the terrestrial digital broadcasting service. This band contains 40 physical channels, numbered 13ch to 52ch, each with a bandwidth of 6 MHz. In the dual-polarization transmission scheme according to the embodiment of the present invention, both a horizontally polarized signal and a vertically polarized signal are used within one physical channel.
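The band arithmetic implied above (40 channels of 6 MHz filling 470 to 710 MHz) can be sketched as follows; this computes band edges only, and any center-frequency pilot offsets used in practice are deliberately omitted:

```python
# Sketch of the physical channel layout described above: channels 13ch
# to 52ch, each 6 MHz wide, tiling 470-710 MHz edge to edge.

def channel_band_mhz(ch: int):
    """Return the (lower, upper) band edges in MHz of a physical channel."""
    if not 13 <= ch <= 52:
        raise ValueError("physical channel must be in the range 13-52")
    low = 470 + 6 * (ch - 13)
    return (low, low + 6)
```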
FIG. 7A shows two segment allocation examples, (1) and (2), for the 13 segments. In example (1) of FIG. 7A, a 2K broadcast program is transmitted using segments 1 to 7 (B layer) of the horizontally polarized signal, and a 4K broadcast program is transmitted using a total of 10 segments: segments 8 to 12 (C layer) of the horizontally polarized signal and segments 8 to 12 (C layer) of the vertically polarized signal. Segments 1 to 7 (B layer) of the vertically polarized signal may be used to transmit the same broadcast program as the 2K broadcast program transmitted in segments 1 to 7 (B layer) of the horizontally polarized signal, or a different broadcast program, or they may be used for other data transmission or left unused. Identification information on how segments 1 to 7 (B layer) of the vertically polarized signal are used can be transmitted to the receiving device by the 4K signal transmission layer identification parameter, the additional layer transmission identification parameter, and the like of the TMCC signal already described; with these parameters, the broadcast receiving device 100 can identify how to handle segments 1 to 7 (B layer) of the vertically polarized signal. The 2K broadcast program transmitted using the B layer of the horizontally polarized signal and the 4K broadcast program transmitted using the C layers of both the horizontally and vertically polarized signals may be a simulcast transmitting the same program content at different resolutions, or may be broadcast programs with different contents. Segment 0 of both the horizontally and vertically polarized signals carries the same One-Seg broadcast program.
Example (2) of FIG. 7A is a modification of example (1). In example (2), a 4K broadcast program is transmitted using a total of 10 segments: segments 1 to 5 (B layer) of the horizontally polarized signal and segments 1 to 5 (B layer) of the vertically polarized signal, and a 2K broadcast program is transmitted using segments 6 to 12 (C layer) of the horizontally polarized signal. Also in example (2), segments 6 to 12 (C layer) of the vertically polarized signal may be used to transmit the same broadcast program as the 2K broadcast program transmitted in segments 6 to 12 (C layer) of the horizontally polarized signal, or a different broadcast program, or they may be used for other data transmission or left unused. The identification information here is the same as in example (1), so its explanation is not repeated.
Note that examples (1) and (2) of FIG. 7A both describe the case in which horizontal polarization is the main polarization; depending on the operation, however, horizontal polarization and vertical polarization may be reversed.
FIG. 7B shows an example of the configuration of a broadcasting system for the advanced terrestrial digital broadcasting service using the dual-polarization transmission scheme according to an embodiment of the present invention, covering both the transmitting-side system and the receiving-side system. The configuration is basically the same as that of the broadcasting system shown in FIG. 1, except that the radio tower 300T, which is equipment of the broadcasting station, is a dual-polarization transmitting antenna capable of simultaneously transmitting a horizontally polarized signal and a vertically polarized signal. In the example of FIG. 7B, only the tuning/detection unit 131H and the tuning/detection unit 131V of the second tuner/demodulation unit 130T of the broadcast receiving device 100 are shown, and the other operating units are omitted.
The horizontally polarized signal transmitted from the radio tower 300T is received by the horizontal-polarization receiving element of the antenna 200T, which is a dual-polarization receiving antenna, and is input from the connector unit 100F1 to the tuning/detection unit 131H via the coaxial cable 202T1. Meanwhile, the vertically polarized signal transmitted from the radio tower 300T is received by the vertical-polarization receiving element of the antenna 200T and is input from the connector unit 100F2 to the tuning/detection unit 131V via the coaxial cable 202T2. An F-type connector is generally used for the connector unit that connects the antenna (coaxial cable) and the television receiver.
Here, the user may mistakenly connect the coaxial cable 202T1 to the connector unit 100F2 and the coaxial cable 202T2 to the connector unit 100F1. In this case, problems may occur, such as the tuning/detection unit 131H and the tuning/detection unit 131V being unable to identify whether the input broadcast signal is a horizontally polarized signal or a vertically polarized signal. To prevent such problems, one of the connector units connecting the antenna (coaxial cable) and the television receiver, for example the connector unit of the coaxial cable 202T2 carrying the vertically polarized signal and the connector unit 100F2, may be given a shape different from the F-type connector used for the coaxial cable 202T1 carrying the horizontally polarized signal and the connector unit 100F1. Alternatively, the tuning/detection unit 131H and the tuning/detection unit 131V may each be controlled so as to identify whether the input broadcast signal is a horizontally polarized signal or a vertically polarized signal by referring to the main signal identification of the TMCC information of the respective input signal, and to operate accordingly. Furthermore, instead of the two coaxial cables 202T1 and 202T2, the antenna 200T and the broadcast receiving device 100 may be connected by a single multicore coaxial cable.
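The TMCC-based identification alternative above can be sketched as follows; the assumption that the value 0 of the main signal identification denotes the main polarization is ours for this example, the actual bit values being defined in the TMCC description:

```python
# Illustrative sketch: each tuning/detection unit reads the TMCC main
# signal identification of the signal at its input connector and the
# receiver assigns the main/secondary polarization paths accordingly,
# so swapped cables are handled transparently.
# Assumption for this sketch: 0 = main polarization signal.

def assign_inputs(main_signal_id_f1: int, main_signal_id_f2: int):
    """Map connectors 100F1/100F2 to the main- and sub-polarization paths."""
    if main_signal_id_f1 == 0:
        return {"main": "100F1", "sub": "100F2"}
    return {"main": "100F2", "sub": "100F1"}
```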
FIG. 7C shows an example of a configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the dual-polarization transmission scheme according to an embodiment of the present invention that differs from the one described above. The configuration shown in FIG. 7B, in which the broadcast receiving device 100 has two broadcast signal input connector units and two coaxial cables connect the antenna 200T and the broadcast receiving device 100, is not always preferable in terms of equipment cost, handling during cable wiring, and the like. In the configuration shown in FIG. 7C, therefore, the horizontally polarized signal received by the horizontal-polarization receiving element of the antenna 200T and the vertically polarized signal received by the vertical-polarization receiving element of the antenna 200T are input to a conversion unit (converter) 201T, and the conversion unit 201T and the broadcast receiving device 100 are connected by a single coaxial cable 202T3. The broadcast signal input from the connector unit 100F3 is demultiplexed and input to the tuning/detection unit 131H and the tuning/detection unit 131V. The connector unit 100F3 may have a function of supplying operating power to the conversion unit 201T.
The conversion unit 201T may belong to the equipment of the environment in which the broadcast receiving device 100 is installed (for example, an apartment building), or may be configured as a device integrated with the antenna 200T and installed in a house or the like. The conversion unit 201T performs frequency conversion processing on either the horizontally polarized signal received by the horizontal-polarization receiving element of the antenna 200T or the vertically polarized signal received by the vertical-polarization receiving element of the antenna 200T. Through this processing, the horizontally polarized signal and the vertically polarized signal transmitted from the radio tower 300T to the antenna 200T using horizontal and vertical polarization in the same frequency band are separated into mutually different frequency bands and can be sent to the broadcast receiving device 100 simultaneously over the single coaxial cable 202T3. If necessary, frequency conversion processing may be performed on both the horizontally polarized signal and the vertically polarized signal, but in this case too, the frequency bands of the two after conversion must differ from each other. Moreover, the broadcast receiving device 100 then only needs one broadcast signal input connector unit 100F3.
FIG. 7D shows an example of the frequency conversion processing. In this example, the vertically polarized signal undergoes frequency conversion. Specifically, of the horizontally and vertically polarized signals transmitted in the 470 MHz to 710 MHz frequency band (the band corresponding to UHF channels 13 to 52), the vertically polarized signal is converted from the 470 MHz to 710 MHz band to the 770 MHz to 1010 MHz band. As a result, signals transmitted using horizontal and vertical polarization in the same frequency band can be sent simultaneously to the broadcast receiving device 100 over the single coaxial cable 202T3 without interfering with each other. Alternatively, the frequency conversion may be applied to the horizontally polarized signal.
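The conversion of FIG. 7D amounts to a fixed 300 MHz upward shift of the vertically polarized signal (470–710 MHz to 770–1010 MHz). The sketch below makes the arithmetic explicit; the function names are ours, and the channel raster is simplified to a plain 6 MHz grid starting at the 470 MHz band edge (the real broadcast raster includes a small frequency offset, which is omitted here).

```python
# Sketch of the FIG. 7D frequency plan (illustrative only).
V_SHIFT_MHZ = 300  # shift applied to the vertically polarized signal

def channel_band_mhz(ch):
    """Lower/upper band edge (MHz) of UHF physical channel ch (13-52), 6 MHz wide."""
    lo = 470 + (ch - 13) * 6
    return lo, lo + 6

def converted_v_band_mhz(ch):
    """Band occupied by the vertically polarized signal after the converter 201T."""
    lo, hi = channel_band_mhz(ch)
    return lo + V_SHIFT_MHZ, hi + V_SHIFT_MHZ

# e.g. for UHF ch13 the H signal stays at 470-476 MHz while V moves to 770-776 MHz
```

With this plan the converted band tops out at 1010 MHz for channel 52, matching the 770–1010 MHz range stated above.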
It is also preferable that the frequency conversion be applied to the signal transmitted on the secondary polarization, as determined by referring to the main signal identification in the TMCC information. As explained with reference to FIG. 5H, the signal transmitted on the main polarization is more likely than the signal transmitted on the secondary polarization to carry the current digital terrestrial broadcasting service. Therefore, in order to better maintain compatibility with the current digital terrestrial broadcasting service, it is preferable to frequency-convert the signal transmitted on the secondary polarization while leaving the signal transmitted on the main polarization unconverted.
When the signal transmitted on the secondary polarization is frequency-converted, it is desirable that its frequency band after conversion be higher than the frequency band of the signal transmitted on the main polarization. In that case, if the initial scan of the broadcast receiving device 100 starts on the low-frequency side and proceeds toward the high-frequency side, the signal transmitted on the main polarization is scanned before the signal transmitted on the secondary polarization. This makes it easier to perform processing such as reflecting the settings obtained by the initial scan for the current digital terrestrial broadcasting service in the settings obtained by the initial scan for the advanced terrestrial digital broadcasting service.
The frequency conversion may be applied to all physical channels used by the advanced terrestrial digital broadcasting service, or only to the physical channels that use signal transmission by the dual-polarization transmission method.
Note that the frequency band after the frequency conversion is preferably between 710 MHz and 1032 MHz. That is, when the terrestrial digital broadcasting service and the BS/CS digital broadcasting service are to be received simultaneously, it is conceivable to mix the broadcast signal of the terrestrial digital broadcasting service received by the antenna 200T with the broadcast signal of the BS/CS digital broadcasting service received by the antenna 200B and to send them to the broadcast receiving device 100 over a single coaxial cable. In this case, since the BS/CS-IF signal occupies a frequency band of approximately 1032 MHz to 2150 MHz, keeping the band after conversion between 710 MHz and 1032 MHz makes it possible to avoid interference between the broadcast signals of the terrestrial digital broadcasting service and those of the BS/CS digital broadcasting service, in addition to avoiding interference between the horizontally and vertically polarized signals. Furthermore, when the reception of broadcast signals retransmitted by cable television (Community Antenna TV or Cable TV: CATV) stations is taken into account, the frequency band at or below 770 MHz (corresponding to UHF channel 62 and below) is used by cable television stations for television broadcast distribution, so it is even more preferable to place the band after conversion between 770 MHz and 1032 MHz, above the band corresponding to UHF channel 62.
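The two constraints just described (stay below the BS/CS-IF band, and preferably above the band used by CATV distribution) can be captured in a small validity check. This is an illustrative sketch using the limits stated in the text; the function and its interface are ours, not part of any specification.

```python
# Band-plan constraints for the converted (secondary-polarization) signal.
BSCS_IF_LO_MHZ = 1032  # lower edge of the BS/CS-IF band (~1032-2150 MHz)
CATV_TOP_MHZ = 770     # top of the band used for CATV distribution (UHF ch62 and below)

def converted_band_ok(lo_mhz, hi_mhz, avoid_catv=True):
    """True if a converted band [lo_mhz, hi_mhz] satisfies the preferred limits:
    above 710 MHz (or 770 MHz when CATV retransmission is considered) and
    below the BS/CS-IF band."""
    floor = CATV_TOP_MHZ if avoid_catv else 710
    return floor <= lo_mhz < hi_mhz <= BSCS_IF_LO_MHZ
```

The FIG. 7D plan (770–1010 MHz) passes both the basic and the CATV-aware check, which is consistent with the preference expressed above.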
Furthermore, the bandwidth of the region between the frequency band before conversion and the frequency band after conversion (part a in the figure) is preferably set to an integer multiple of the bandwidth of one physical channel (6 MHz). This has the advantage of simplifying frequency-setting control in the broadcast receiving device 100, for example when the broadcast signals in the band before conversion and those in the band after conversion are frequency-scanned in a single pass.
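For the FIG. 7D plan this condition holds: the guard region a between the original band (up to 710 MHz) and the converted band (from 770 MHz) is 60 MHz, exactly ten 6 MHz channels, so one uniform 6 MHz raster covers both bands. The arithmetic below is purely illustrative.

```python
# Why an integer-multiple gap simplifies scanning (illustrative arithmetic).
CH_BW_MHZ = 6
gap_a = 770 - 710                    # region "a" between the two bands
assert gap_a % CH_BW_MHZ == 0        # integer multiple of one physical channel

# A single 6 MHz scan grid then lands on every channel in both bands.
scan_freqs = list(range(470, 1010, CH_BW_MHZ))
```

Because 770 − 470 is also a multiple of 6 MHz, the first converted channel falls on the same grid as the original channels, and the receiver needs no second raster.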
As described above, the dual-polarization transmission method according to the embodiment of the present invention uses both horizontally and vertically polarized signals to transmit a 4K broadcast program. To reproduce a 4K broadcast program correctly, therefore, the receiving side must correctly identify which horizontally polarized broadcast signal and which vertically polarized broadcast signal belong to the same physical channel. Even when, as a result of the frequency conversion, the horizontally and vertically polarized broadcast signals of the same physical channel are input to the receiving device as signals in different frequency bands, the broadcast receiving device 100 of this embodiment can correctly identify each pair by referring as appropriate to the parameters of the TMCC information shown in FIGS. 5F to 5J (for example, the main signal identification and the physical channel number identification). The broadcast receiving device 100 of this embodiment can thereby suitably receive, demodulate, and reproduce 4K broadcast programs.
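The pairing step can be sketched as follows. The record fields `main_signal` and `phys_ch` mirror the "main signal identification" and "physical channel number identification" TMCC parameters named above, but the record layout and function are hypothetical, shown only to illustrate how pairs could be recovered after frequency conversion.

```python
# Sketch: pair H/V broadcast signals of the same physical channel from
# TMCC-derived metadata (record layout is ours, not from the standard).
def pair_polarizations(detected):
    """detected: list of {'freq': MHz, 'main_signal': bool, 'phys_ch': int}.
    Returns {phys_ch: {'main': rec, 'sub': rec}} for channels where both
    the main- and secondary-polarization signals were found."""
    pairs = {}
    for rec in detected:
        slot = pairs.setdefault(rec['phys_ch'], {})
        slot['main' if rec['main_signal'] else 'sub'] = rec
    return {ch: p for ch, p in pairs.items() if 'main' in p and 'sub' in p}
```

A channel for which only one polarization was detected is dropped, since a 4K program needs both halves to be demodulated.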
Although the examples of FIGS. 7B, 7C, and 7D all assume that horizontal polarization is the main polarization, horizontal and vertical polarization may be interchanged depending on the operation.
The broadcast waves of digital terrestrial broadcasting transmitted by the dual-polarization transmission method described above can, as noted earlier, be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving device 100, and can also be received by the first tuner/demodulator 130C. When the first tuner/demodulator 130C receives such a broadcast wave, the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer of the current digital terrestrial broadcasting service are reproduced.
 <Pass-through transmission method for the advanced terrestrial digital broadcasting service>
 The broadcast receiving device 100 can receive signals transmitted by a pass-through transmission method. In the pass-through transmission method, a broadcast signal received by a cable television station or the like is sent out to the CATV distribution system in its original signal format, either at the same frequency or after frequency conversion.
The pass-through method comes in two forms: (1) a method in which the transmission signal band of each digital terrestrial broadcast signal output from the terrestrial receiving antenna is extracted and level-adjusted, and the signal is then transmitted into the CATV facility at the same frequency as the original transmission signal; and (2) a method in which the transmission signal band of each digital terrestrial broadcast signal output from the terrestrial receiving antenna is extracted and level-adjusted, and the signal is then transmitted into the CATV facility at a frequency in the VHF, MID, SHB, or UHF band set by the CATV facility manager. The equipment constituting the receiving amplifier for the signal processing of the first method, or the receiving amplifier and frequency converter for the signal processing of the second method, is an OFDM signal processor (OFDM-SP).
FIG. 7E shows an example of the system configuration when the first pass-through method is applied to the advanced terrestrial digital broadcasting service using the dual-polarization transmission method. FIG. 7E shows the head-end equipment 400C of a cable television station and the broadcast receiving device 100. FIG. 7F shows an example of the frequency conversion processing in that case. In FIG. 7F, the notation (H·V) denotes a state in which the broadcast signal transmitted with horizontal polarization and the broadcast signal transmitted with vertical polarization both occupy the same frequency band, (H) denotes a broadcast signal transmitted with horizontal polarization, and (V) denotes a broadcast signal transmitted with vertical polarization. The same notation is used in FIGS. 7H and 7I below.
When the pass-through transmission of the first method is applied to the advanced terrestrial digital broadcasting service of the dual-polarization transmission method according to the embodiment of the present invention, the broadcast signal transmitted with horizontal polarization undergoes signal band extraction and level adjustment at the head-end equipment 400C of the cable television station and is sent out at the same frequency as the original transmission signal. The broadcast signal transmitted with vertical polarization, on the other hand, undergoes signal band extraction and level adjustment at the head-end equipment 400C and is sent out after frequency conversion similar to that described for FIG. 7D (conversion of the vertically polarized broadcast signal to a frequency band above the 470 MHz to 770 MHz band corresponding to UHF channels 13 to 62). As a result, the frequency bands of the horizontally and vertically polarized broadcast signals no longer overlap, so the signals can be carried over a single coaxial cable (or optical fiber cable). The transmitted signal can be received by the broadcast receiving device 100 of this embodiment. The processing by which the broadcast receiving device 100 of this embodiment receives and demodulates the horizontally and vertically polarized broadcast signals contained in that signal is the same as described for FIG. 7D and is therefore not explained again.
FIG. 7G shows an example of the system configuration when the second pass-through method is applied to the advanced terrestrial digital broadcasting service using the dual-polarization transmission method. FIG. 7G shows the head-end equipment 400C of a cable television station and the broadcast receiving device 100. FIG. 7H shows an example of the frequency conversion processing in that case.
When the pass-through transmission of the second method is applied to the advanced terrestrial digital broadcasting service of the dual-polarization transmission method according to the embodiment of the present invention, the broadcast signal transmitted with horizontal polarization undergoes signal band extraction and level adjustment at the head-end equipment 400C of the cable television station and is sent out after frequency conversion to the frequency set by the CATV facility manager. The broadcast signal transmitted with vertical polarization undergoes signal band extraction and level adjustment at the head-end equipment 400C and is sent out after frequency conversion similar to that described for FIG. 7D (conversion of the vertically polarized broadcast signal to a frequency band above the 470 MHz to 770 MHz band of UHF channels 13 to 62). Unlike FIG. 7F, the frequency conversion shown in FIG. 7H rearranges the broadcast signal transmitted with horizontal polarization over the wider range of 90 MHz to 770 MHz, extending below the 470 MHz to 770 MHz band of UHF channels 13 to 62. As a result, the frequency bands of the horizontally and vertically polarized broadcast signals no longer overlap, so the signals can be carried over a single coaxial cable (or optical fiber cable). The transmitted signal can be received by the broadcast receiving device 100 of this embodiment. The processing by which the broadcast receiving device 100 of this embodiment receives and demodulates the horizontally and vertically polarized broadcast signals contained in that signal is the same as described for FIG. 7D and is therefore not explained again.
As another modification of the frequency conversion processing in the head-end equipment 400C of the cable television station in FIG. 7G, the broadcast signal of the pass-through output after frequency conversion may be changed from the state shown in FIG. 7H to the state shown in FIG. 7I. In this case, both the horizontally and vertically polarized broadcast signals may undergo signal band extraction and level adjustment and be sent out after frequency conversion to the frequencies set by the CATV facility manager. In the example of FIG. 7I, both the horizontally and vertically polarized broadcast signals are rearranged within the range of 90 MHz to 770 MHz (the range from VHF channel 1 to UHF channel 62); since no frequency band above UHF channel 62 is used, the frequency band utilization efficiency of the broadcast signals is higher than in FIG. 7H.
Furthermore, since the band into which the broadcast signals are rearranged is wider than the 470 MHz to 710 MHz band of UHF channels 13 to 52 used at antenna reception, the horizontally and vertically polarized broadcast signals can be rearranged alternately, as shown in the example of FIG. 7I. If, as in the example of FIG. 7I, each pair consisting of the horizontally and vertically polarized broadcast signals that occupied the same physical channel at antenna reception is rearranged alternately, in the order of the physical channels at antenna reception, then when the broadcast receiving device 100 of this embodiment performs an initial scan from the low-frequency side, it can carry out the initial setup sequentially, pair by pair in units of the original physical channels, so that the initial scan can be performed efficiently.
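The alternating rearrangement of FIG. 7I can be sketched as follows: H/V signals that shared a physical channel at antenna reception are placed adjacently on a 6 MHz grid starting at 90 MHz (VHF channel 1). The function and its output format are ours, shown only to illustrate why a low-to-high initial scan then encounters each channel's pair back to back.

```python
# Sketch of the FIG. 7I interleaved frequency plan (illustrative only).
def interleave_pairs(phys_channels, start_mhz=90, ch_bw_mhz=6):
    """Return [(phys_ch, polarization, lower_band_edge_mhz)] in scan order,
    with the H and V halves of each original physical channel adjacent."""
    plan, f = [], start_mhz
    for ch in sorted(phys_channels):
        for pol in ('H', 'V'):
            plan.append((ch, pol, f))
            f += ch_bw_mhz
    return plan
```

Scanning such a plan from the low-frequency side completes one channel pair before moving to the next, matching the initial-scan behavior described above.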
Although the examples of FIGS. 7E, 7F, 7G, 7H, and 7I all assume that horizontal polarization is the main polarization, horizontal and vertical polarization may be interchanged depending on the operation.
The broadcast waves of digital terrestrial broadcasting of the dual-polarization transmission method subjected to the pass-through transmission described above can likewise be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving device 100, and can also be received by the first tuner/demodulator 130C. When the first tuner/demodulator 130C receives such a broadcast wave, the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer of the current digital terrestrial broadcasting service are reproduced.
 [Transmission method 2 for the advanced terrestrial digital broadcasting service]
 In order to realize 4K broadcasting while maintaining the viewing environment of the current digital terrestrial broadcasting service, a single-polarization transmission method is described as another example, different from the above, of a transmission method for the advanced terrestrial digital broadcasting service according to an embodiment of the present invention. The single-polarization transmission method according to the embodiment of the present invention shares part of its specifications with the current digital terrestrial broadcasting system and transmits data by SISO (Single-Input Single-Output) technology using either the horizontally polarized signal or the vertically polarized signal. For example, the 13 segments within the approximately 6 MHz band corresponding to one physical channel are divided so that 8 segments are allocated to the transmission of 2K broadcast programs, 4 segments to the transmission of 4K broadcast programs, and 1 segment to mobile reception. For 2K broadcast programs, image quality is maintained through optimization of the latest MPEG-2 Video compression technology and the like, so that they remain receivable by current television receivers; for 4K broadcast programs, image quality is secured by adopting HEVC or VVC compression technology, which is more efficient than MPEG-2 Video, together with higher-order modulation, NUC, and other techniques. The number of segments allocated to each service may differ from the above.
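The example split above can be written out as a small consistency check. The 8/4/1 allocation comes from the text; the per-segment bandwidth of 6/14 MHz is our assumption, following the usual ISDB-T convention (13 data segments plus guard within a 6 MHz channel), and is not stated in the text.

```python
# Illustrative 13-segment budget for one physical channel (allocation from
# the text; the 6/14 MHz segment bandwidth is an assumption, ISDB-T style).
SEGMENTS_PER_CHANNEL = 13
SEG_BW_MHZ = 6 / 14  # ~0.429 MHz per segment (assumed)

allocation = {'2K': 8, '4K': 4, 'mobile': 1}
assert sum(allocation.values()) == SEGMENTS_PER_CHANNEL

approx_4k_bw_mhz = allocation['4K'] * SEG_BW_MHZ  # rough spectrum for the 4K layer
```

Under that assumption the 4K layer occupies roughly 1.7 MHz of the channel, which is why the more efficient codecs and higher-order modulation mentioned above are needed to fit a 4K program into it.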
FIG. 7J shows an example of the single-polarization transmission method in the advanced terrestrial digital broadcasting service according to the embodiment of the present invention. The frequency band of 470 MHz to 710 MHz is used to transmit the broadcast waves of the digital terrestrial broadcasting service. This band contains 40 physical channels, channels 13 to 52, each with a bandwidth of 6 MHz. In the single-polarization transmission method according to the embodiment of the present invention, a 2K broadcasting service and a 4K broadcasting service are transmitted simultaneously within one physical channel.
FIG. 7J shows two examples, (1) and (2), of how the 13 segments may be allocated. In example (1), segments 1 to 4 (layer B) are used to transmit a 4K broadcast program and segments 5 to 12 (layer C) are used to transmit a 2K broadcast program. The 4K broadcast program transmitted on layer B and the 2K broadcast program transmitted on layer C may form a simulcast in which the same program content is transmitted at different resolutions, or they may carry different content. Example (2) is a variation of (1): segments 1 to 8 (layer B) are used to transmit a 2K broadcast program and segments 9 to 12 (layer C) are used to transmit a 4K broadcast program.
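The two FIG. 7J allocations can be encoded as data for reference. The segment indices and layer names are taken from the text; the data structure and lookup helper are ours, shown only as a compact restatement.

```python
# Illustrative encoding of the two FIG. 7J segment-allocation examples.
ALLOCATIONS = {
    1: {'B': {'service': '4K', 'segments': range(1, 5)},    # segments 1-4
        'C': {'service': '2K', 'segments': range(5, 13)}},  # segments 5-12
    2: {'B': {'service': '2K', 'segments': range(1, 9)},    # segments 1-8
        'C': {'service': '4K', 'segments': range(9, 13)}},  # segments 9-12
}

def segments_for(example, service):
    """Segments carrying the given service ('2K' or '4K') in example 1 or 2."""
    for layer in ALLOCATIONS[example].values():
        if layer['service'] == service:
            return list(layer['segments'])
```

Note that the two examples simply swap which layer gets the wider 8-segment share, trading 2K capacity against 4K capacity.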
FIG. 7K shows an example of the configuration of a broadcasting system for the advanced terrestrial digital broadcasting service using the single-polarization transmission method according to an embodiment of the present invention, showing both the transmitting-side and the receiving-side systems. The configuration of this broadcasting system is basically the same as that of the broadcasting system shown in FIG. 1, except that the radio tower 300S, which is equipment of the broadcast station, is a single-polarization transmitting antenna capable of sending either a horizontally polarized signal or a vertically polarized signal. In the example of FIG. 7K, only the tuning/detection section 131H of the second tuner/demodulator 130T of the broadcast receiving device 100 is shown; the other functional sections are omitted.
The single-polarized signal sent from the radio tower 300S is received by the antenna 200S, a single-polarization receiving antenna, and is input via the coaxial cable 202S from the connector section 100F3 to the tuning/detection section 131H. An F-type connector is generally used for the connector section that joins the antenna (coaxial cable) and the television receiver. In the configuration of the broadcasting system for the advanced terrestrial digital broadcasting service using the single-polarization transmission method, the antenna 200S and the broadcast receiving device 100 can be connected by a single coaxial cable 202S, and no frequency conversion processing (conversion unit) is required, which is advantageous.
The broadcast waves of digital terrestrial broadcasting transmitted by the single-polarization transmission method described above can, as noted earlier, be received and reproduced by the second tuner/demodulator 130T of the broadcast receiving device 100, and can also be received by the first tuner/demodulator 130C. When the first tuner/demodulator 130C receives such a broadcast wave, the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer of the current digital terrestrial broadcasting service are reproduced.
As described above, among the broadcast waves of digital terrestrial broadcasting transmitted by the single-polarization transmission method, the broadcast signals transmitted in the layer of the current digital terrestrial broadcasting service (the layer carrying the 2K broadcast in FIG. 7J) can also be received by the first tuner/demodulator 130C of the broadcast receiving device 100. Therefore, by adopting a double-tuner configuration in which the second tuner/demodulator 130T and the first tuner/demodulator 130C are used simultaneously, it becomes possible to receive and reproduce at the same time both the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service and those transmitted in the layer of the current digital terrestrial broadcasting service.
FIG. 7L shows an example of the configuration of a broadcasting system for the advanced terrestrial digital broadcasting service using the single-polarization transmission method according to an embodiment of the present invention, realizing the double-tuner configuration described above, and shows both the transmitting-side and the receiving-side systems. The configuration of this broadcasting system is basically the same as that of the broadcasting system shown in FIG. 1, except that the radio tower 300S, which is equipment of the broadcast station, is a single-polarization transmitting antenna capable of sending either a horizontally polarized signal or a vertically polarized signal. In the example of FIG. 7L, only the tuning/detection section 131C of the first tuner/demodulator 130C and the tuning/detection section 131H of the second tuner/demodulator 130T of the broadcast receiving device 100 are shown; the other functional sections are omitted.
 The single-polarized signal transmitted from the radio tower 300S is received by the antenna 200S, which is a single-polarization receiving antenna, and is input to the broadcast receiving device 100 through the connector unit 100F3 via the coaxial cable 202S. The single-polarized signal input to the broadcast receiving device 100 is split and fed to the tuning/detection unit 131C and the tuning/detection unit 131H. The tuning/detection unit 131C performs tuning/detection processing on the broadcast waves of the current terrestrial digital broadcasting service, and the tuning/detection unit 131H performs tuning/detection processing on the broadcast waves of the advanced terrestrial digital broadcasting service.
 With this configuration, in a broadcasting system in which the current terrestrial digital broadcasting service and the advanced terrestrial digital broadcasting service are both provided, the two services can be received simultaneously. In particular, efficient processing becomes possible for channel setting and similar operations. The current terrestrial digital broadcasting service and the advanced terrestrial digital broadcasting service may be transmitted on the same physical channel or on different physical channels. Furthermore, the two services may or may not form a simulcast pair.
 Furthermore, although the example of FIG. 7L concerns reception of a broadcast service of an advanced terrestrial digital broadcasting service using the single-polarization transmission method, a similar configuration can also be applied to reception of a broadcast service of an advanced terrestrial digital broadcasting service using the dual-polarization transmission method. In this case, the dual-polarization signal received by the antenna 200T, which is a dual-polarization receiving antenna, and input to the broadcast receiving device 100 through the connector unit 100F3 via the converter 201T may be split and fed to the tuning/detection unit 131C, the tuning/detection unit 131H, and the tuning/detection unit 131V. The tuning/detection unit 131C performs tuning/detection processing on the broadcast waves of the current terrestrial digital broadcasting service transmitted on either the horizontally polarized signal or the vertically polarized signal, while the tuning/detection unit 131H and the tuning/detection unit 131V perform tuning/detection processing on the broadcast waves of the advanced terrestrial digital broadcasting service transmitted on the horizontally polarized signal and the vertically polarized signal.
 [Transmission method 3 of the advanced terrestrial digital broadcasting service]
 In order to realize 4K broadcasting while maintaining the viewing environment of the current terrestrial digital broadcasting service, a layered division multiplexing transmission method will be described as an example, different from those described above, of a transmission method for the advanced terrestrial digital broadcasting service according to an embodiment of the present invention. The layered division multiplexing transmission method according to the embodiment of the present invention shares some specifications with the current terrestrial digital broadcasting method. For example, the broadcast waves of a 4K broadcasting service are multiplexed at a low signal level onto the same channel as the broadcast waves of the current 2K broadcasting service and transmitted. For 2K broadcasting, reception is performed as before by keeping the reception level of the 4K broadcasting at or below the required C/N. For 4K broadcasting, while the transmission capacity is expanded through higher-order modulation and the like, reception technology supporting LDM (layered division multiplexing) is used to cancel the 2K broadcast waves and perform reception on the remaining 4K broadcast waves.
 FIG. 8A shows an example of the layered division multiplexing transmission method in the advanced terrestrial digital broadcasting service according to an embodiment of the present invention. The upper layer is composed of the modulated waves of the current 2K broadcasting, the lower layer is composed of the modulated waves of 4K broadcasting, and the upper layer and the lower layer are multiplexed and output as a composite wave in the same frequency band. For example, the upper layer may use 64QAM or the like as its modulation scheme, and the lower layer may use 256QAM or the like. The 2K broadcast program transmitted on the upper layer and the 4K broadcast program transmitted on the lower layer may form a simulcast in which programs with the same content are transmitted at different resolutions, or they may carry programs with different content. Here, the upper layer is transmitted at high power and the lower layer is transmitted at low power. The difference between the modulated-wave level of the upper layer and that of the lower layer (the power difference) is called the injection level (IL). The injection level is a value set on the broadcasting station side, and is generally expressed as the modulated-wave level difference (power difference) in logarithmic form, as a relative ratio in dB.
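 As a supplementary sketch (not part of the embodiment itself), the relationship between the two layers' power levels and the injection level expressed in dB can be written as a one-line calculation; the wattage values below are arbitrary illustrative numbers:

```python
import math

def injection_level_db(upper_power: float, lower_power: float) -> float:
    """Injection level (IL): the power difference between the upper-layer
    (2K) and lower-layer (4K) modulated waves, as a relative ratio in dB."""
    return 10.0 * math.log10(upper_power / lower_power)

# Illustrative numbers only: a lower layer sent at 1/100 of the upper
# layer's power corresponds to an injection level of 20 dB.
print(injection_level_db(100.0, 1.0))  # → 20.0
```

 An injection level of 20 dB thus means the lower layer is transmitted at one hundredth of the upper layer's power.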
 FIG. 8B shows an example of the configuration of a broadcasting system for an advanced terrestrial digital broadcasting service using the layered division multiplexing transmission method according to an embodiment of the present invention. The configuration of this broadcasting system is basically the same as that of the broadcasting system shown in FIG. 1, except that the radio tower 300L, which is equipment of the broadcasting station, is a transmitting antenna that sends out a broadcast signal in which the 2K broadcasting of the upper layer and the 4K broadcasting of the lower layer are multiplexed. In the example of FIG. 8B, only the tuning/detection unit 131L of the third tuner/demodulator 130L is shown for the broadcast receiving device 100, and the other functional units are omitted.
 The broadcast signal received by the antenna 200L is input from the connector unit 100F4 to the tuning/detection unit 131L via the converter 201L and the coaxial cable 202L. In this configuration, when the broadcast signal is transmitted from the antenna 200L to the broadcast receiving device 100, the converter 201L may apply frequency conversion and amplification processing to the broadcast signal, as shown in FIG. 8C. That is, when the antenna 200L is installed on the roof of an apartment building or the like and the broadcast signal is carried to the broadcast receiving device 100 in each room over a long coaxial cable 202L, the broadcast signal is attenuated, and a problem may arise in which the tuning/detection unit 131L cannot correctly receive the 4K broadcast waves of the lower layer in particular.
 Therefore, to prevent the above problem, the converter 201L performs frequency conversion and amplification processing on the 4K broadcast signal of the lower layer. The frequency conversion and amplification processing converts the frequency band of the lower-layer 4K broadcast signal from the 470-710 MHz band (the band corresponding to UHF channels 13-52) to, for example, the 770-1010 MHz band, which is above the band corresponding to UHF channel 62. Furthermore, the lower-layer 4K broadcast signal is amplified to a signal level at which attenuation in the cable is no longer a problem. By performing such processing, interference between the 2K broadcast signal and the 4K broadcast signal can be avoided while also avoiding the effect of attenuation of the broadcast signal during transmission over the coaxial cable. When the effect of attenuation is not a problem, such as when the coaxial cable 202L is short, the converter 201L and the frequency conversion and amplification processing may be omitted.
 Further, as shown in FIG. 8D, the tuning/detection unit provided in the third tuner/demodulator 130L of the broadcast receiving device 100 may be composed of a tuning/detection unit 131L1 that performs tuning, detection, and other processing on the modulated waves of the upper layer (2K broadcasting) and a tuning/detection unit 131L2 that performs tuning, detection, and other processing on the modulated waves of the lower layer (4K broadcasting). With this configuration, tuning, detection, and other processing can be performed simultaneously, for a signal that has undergone frequency conversion and amplification processing in the converter 201L, on the 2K broadcast signal and the 4K broadcast signal sent from the broadcasting station on the same physical channel, which is particularly suitable for simulcasting and the like.
 The following points are the same as in the description of frequency conversion already given for this embodiment, and a repeated description is therefore omitted: the frequency band after conversion by the frequency conversion and amplification processing is preferably between 710 and 1032 MHz, above the band corresponding to UHF channel 52, or between 770 and 1032 MHz, above the band corresponding to UHF channel 62 (in the case of retransmission by a cable television station or the like); the bandwidth of the region between the pre-conversion frequency band and the post-conversion frequency band is preferably set to an integer multiple of the bandwidth of one physical channel (6 MHz); and the frequency conversion and amplification processing may be performed only on physical channels that use signal transmission by the layered division multiplexing transmission method.
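 The band arithmetic above can be illustrated with a small sketch. The channel-to-frequency mapping assumes the Japanese UHF plan of 6 MHz channels with channel 13 starting at 470 MHz, and the 300 MHz offset is merely one value consistent with the 470-710 MHz to 770-1010 MHz example; neither is prescribed by this code:

```python
CH_BW_MHZ = 6          # bandwidth of one physical channel
UHF_CH13_LO_MHZ = 470  # lower band edge of UHF channel 13

def channel_band(ch: int) -> tuple[int, int]:
    """Lower/upper band edges (MHz) of UHF physical channel ch (13-62)."""
    lo = UHF_CH13_LO_MHZ + (ch - 13) * CH_BW_MHZ
    return lo, lo + CH_BW_MHZ

def convert_up(freq_mhz: int, offset_mhz: int = 300) -> int:
    """Shift a lower-layer frequency upward. The offset must be an integer
    multiple of the 6 MHz channel bandwidth so channel boundaries align."""
    if offset_mhz % CH_BW_MHZ != 0:
        raise ValueError("offset must be a multiple of the channel bandwidth")
    return freq_mhz + offset_mhz

# Channel 13 occupies 470-476 MHz; a 300 MHz offset moves it to 770-776 MHz,
# above the band corresponding to UHF channel 62 (which ends at 770 MHz).
lo, hi = channel_band(13)
print(convert_up(lo), convert_up(hi))  # → 770 776
```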
 The broadcast receiving device 100 of this embodiment can identify whether a received broadcast signal is a broadcast signal transmitted in the lower layer or in the upper layer by using the upper/lower layer identification bit of the TMCC information described with reference to FIG. 5H. The broadcast receiving device 100 of this embodiment can also identify whether a received broadcast signal has undergone frequency conversion after antenna reception by using the frequency conversion processing identification bit of the TMCC information described with reference to FIG. 5F, and whether a received broadcast signal carries a 4K program in the lower layer by using the 4K signal transmission layer identification bit of the TMCC information described with reference to FIG. 5I. While it is not impossible to perform these identification processes by demodulating the data carriers and referring to the control information contained in the stream, doing so requires demodulation of the data carriers and complicates the processing. Identification by referring to the TMCC parameters described above is simpler and faster; for example, it makes it possible to speed up the initial scan of the broadcast receiving device 100.
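 As a purely hypothetical sketch of the bit-level identification described above: the bit offsets used here are placeholders (the actual TMCC bit assignments are those shown in FIGS. 5F, 5H, and 5I), but the code illustrates how the three flags could be read from an already-decoded TMCC word without demodulating the data carriers:

```python
# Placeholder bit offsets within an already-decoded TMCC word. The real
# assignments are those of FIGS. 5F, 5H and 5I; these values are invented
# for illustration only.
FREQ_CONVERTED_BIT = 0  # frequency conversion processing identification bit
LOWER_LAYER_BIT = 1     # upper/lower layer identification bit
LAYER_4K_BIT = 2        # 4K signal transmission layer identification bit

def read_tmcc_flags(tmcc_word: int) -> dict:
    """Read the three identification flags directly from TMCC bits,
    without demodulating any data carriers."""
    return {
        "frequency_converted": bool((tmcc_word >> FREQ_CONVERTED_BIT) & 1),
        "lower_layer": bool((tmcc_word >> LOWER_LAYER_BIT) & 1),
        "carries_4k": bool((tmcc_word >> LAYER_4K_BIT) & 1),
    }

# 0b110: a lower-layer signal carrying 4K, with no frequency conversion
print(read_tmcc_flags(0b110))
```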
 Since the tuning/detection unit 131L of the third tuner/demodulator 130L of the broadcast receiving device 100 according to the embodiment of the present invention has, as already explained, a reception function supporting LDM (layered division multiplexing) technology, the converter 201L shown in FIG. 8B is not necessarily required between the antenna 200L and the broadcast receiving device 100.
 As described above, the broadcast waves of terrestrial digital broadcasting transmitted by the layered division multiplexing transmission method explained above can be received and reproduced by the third tuner/demodulator 130L of the broadcast receiving device 100, but they can also be received by the first tuner/demodulator 130C of the broadcast receiving device 100. When the first tuner/demodulator 130C receives such broadcast waves, among the broadcast signals of those broadcast waves, the broadcast signals transmitted in the layer of the advanced terrestrial digital broadcasting service are ignored, while the broadcast signals transmitted in the layer of the current terrestrial digital broadcasting service are reproduced.
 [MPEG-2 TS method]
 The broadcasting system of this embodiment supports MPEG-2 TS, which is adopted by the current terrestrial digital broadcasting service and other services, as a media transport method for transmitting data such as video and audio. Specifically, the stream transmitted by the OFDM transmission wave of FIG. 4D(1) uses MPEG-2 TS, and among the OFDM transmission waves of FIG. 4D(2) and FIG. 4D(3), the stream transmitted in the layer carrying the current terrestrial digital broadcasting service uses MPEG-2 TS. Likewise, the stream obtained by demodulating the transmission wave in the first tuner/demodulator 130C of the broadcast receiving device 100 of FIG. 2 uses MPEG-2 TS; among the streams obtained by demodulating the transmission wave in the second tuner/demodulator 130T, the stream corresponding to the layer carrying the current terrestrial digital broadcasting service uses MPEG-2 TS; and similarly, among the streams obtained by demodulating the transmission wave in the third tuner/demodulator 130L, the stream corresponding to the layer carrying the current terrestrial digital broadcasting service uses MPEG-2 TS.
 MPEG-2 TS is characterized by multiplexing the components that make up a program, such as video and audio, into a single packet stream together with control signals and clock information. Because MPEG-2 TS handles everything, including the clock, as a single packet stream, it is well suited to transmitting one piece of content over one transmission path with guaranteed transmission quality, and it is adopted by many current digital broadcasting systems. With MPEG-2 TS, it is also possible to realize two-way communication via a bidirectional network such as a fixed or mobile network. That is, MPEG-2 TS can support a broadcast-broadband cooperation system that links functions using a broadband network with the digital broadcasting service, combining with it the acquisition of additional content via the broadband network, arithmetic processing on a server device, presentation processing in cooperation with mobile terminal devices, and the like.
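 To make the single-packet-stream structure concrete, the following minimal sketch parses the 4-byte header of a standard 188-byte MPEG-2 TS packet (sync byte 0x47, 13-bit PID); it reflects the generic MPEG-2 Systems packet layout, not any receiver-specific implementation:

```python
def parse_ts_header(packet: bytes) -> dict:
    """Parse the 4-byte header of a standard 188-byte MPEG-2 TS packet."""
    if len(packet) != 188 or packet[0] != 0x47:
        raise ValueError("expected a 188-byte packet starting with sync 0x47")
    return {
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],  # 13-bit packet ID
        "payload_unit_start": bool(packet[1] & 0x40),  # section/PES start
        "continuity_counter": packet[3] & 0x0F,        # 4-bit counter
    }

# A PAT section always travels in packets with PID 0x0000.
pkt = bytes([0x47, 0x40, 0x00, 0x10]) + bytes(184)
print(parse_ts_header(pkt))  # pid 0, payload_unit_start True
```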
 FIG. 9A shows an example of the protocol stack of the transmission signals in a broadcasting system using MPEG-2 TS. In MPEG-2 TS, PSI, SI, and other control signals are transmitted in section format.
 [Control signals of a broadcasting system using the MPEG-2 TS method]
 The control information of the MPEG-2 TS method mainly consists of tables used in the program sequence information and tables used outside the program sequence information. Tables are transmitted in section format, and descriptors are placed within the tables.
 <Tables used in the program sequence information>
 FIG. 9B shows a list of the tables used in the program sequence information of a broadcasting system using the MPEG-2 TS method. In this embodiment, the tables shown below are used in the program sequence information.
(1) PAT (Program Association Table)
(2) CAT (Conditional Access Table)
(3) PMT (Program Map Table)
(4) NIT (Network Information Table)
(5) SDT (Service Description Table)
(6) BAT (Bouquet Association Table)
(7) EIT (Event Information Table)
(8) RST (Running Status Table)
(9) TDT (Time and Date Table)
(10) TOT (Time Offset Table)
(11) LIT (Local Event Information Table)
(12) ERT (Event Relation Table)
(13) ITT (Index Transmission Table)
(14) PCAT (Partial Content Announcement Table)
(15) ST (Stuffing Table)
(16) BIT (Broadcaster Information Table)
(17) NBIT (Network Board Information Table)
(18) LDT (Linked Description Table)
(19) AMT (Address Map Table)
(20) INT (IP/MAC Notification Table)
(21) Table set by the operator
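 As an illustrative aside, the tables listed above arrive as sections identified by a table_id in the first byte; the following sketch uses the commonly assigned table_id values (e.g., 0x00 for PAT) to show how a receiver might dispatch incoming sections. The mapping is quoted for illustration and is not defined by this embodiment:

```python
# Commonly assigned table_id values for some of the tables listed above
# (quoted for illustration; the normative values are in the relevant
# MPEG-2 Systems / ARIB specifications).
TABLE_IDS = {
    0x00: "PAT",  # Program Association Table
    0x01: "CAT",  # Conditional Access Table
    0x02: "PMT",  # Program Map Table
    0x40: "NIT",  # Network Information Table (actual network)
    0x42: "SDT",  # Service Description Table (actual TS)
    0x4E: "EIT",  # Event Information Table (present/following, actual TS)
    0x70: "TDT",  # Time and Date Table
    0x73: "TOT",  # Time Offset Table
}

def dispatch_section(section: bytes) -> str:
    """Classify a section by its first byte, the table_id."""
    return TABLE_IDS.get(section[0], "other/operator-defined")

print(dispatch_section(bytes([0x00, 0xB0, 0x0D])))  # → PAT
```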
 <Tables used in digital broadcasting>
 FIG. 9C shows a list of the tables used outside the program sequence information of a broadcasting system using the MPEG-2 TS method. In this embodiment, the tables shown below are used outside the program sequence information.
(1) ECM (Entitlement Control Message)
(2) EMM (Entitlement Management Message)
(3) DCT (Download Control Table)
(4) DLT (Download Table)
(5) DIT (Discontinuity Information Table)
(6) SIT (Selection Information Table)
(7) SDTT (Software Download Trigger Table)
(8) CDT (Common Data Table)
(9) DSM-CC section
(10) AIT (Application Information Table)
(11) DCM (Download Control Message)
(12) DMM (Download Management Message)
(13) Table set by the operator
 <Descriptors used in the program sequence information>
 FIGS. 9D, 9E, and 9F show a list of the descriptors used in the program sequence information of a broadcasting system using the MPEG-2 TS method. In this embodiment, the descriptors shown below are used in the program sequence information.
(1) Conditional Access Descriptor
(2) Copyright Descriptor
(3) Network Name Descriptor
(4) Service List Descriptor
(5) Stuffing Descriptor
(6) Satellite Delivery System Descriptor
(7) Terrestrial Delivery System Descriptor
(8) Bouquet Name Descriptor
(9) Service Descriptor
(10) Country Availability Descriptor
(11) Linkage Descriptor
(12) NVOD Reference Descriptor
(13) Time Shifted Service Descriptor
(14) Short Event Descriptor
(15) Extended Event Descriptor
(16) Time Shifted Event Descriptor
(17) Component Descriptor
(18) Mosaic Descriptor
(19) Stream Identifier Descriptor
(20) CA Identifier Descriptor
(21) Content Descriptor
(22) Parental Rating Descriptor
(23) Hierarchical Transmission Descriptor
(24) Digital Copy Control Descriptor
(25) Emergency Information Descriptor
(26) Data Component Descriptor
(27) System Management Descriptor
(28) Local Time Offset Descriptor
(29) Audio Component Descriptor
(30) Target Region Descriptor
(31) Hyperlink Descriptor
(32) Data Content Descriptor
(33) Video Decode Control Descriptor
(34) Basic Local Event Descriptor
(35) Reference Descriptor
(36) Node Relation Descriptor
(37) Short Node Information Descriptor
(38) STC Reference Descriptor
(39) Partial Reception Descriptor
(40) Series Descriptor
(41) Event Group Descriptor
(42) SI Transmission Parameter Descriptor
(43) Broadcaster Name Descriptor
(44) Component Group Descriptor
(45) SI Prime TS Descriptor
(46) Board Information Descriptor
(47) LDT Linkage Descriptor
(48) Connected Transmission Descriptor
(49) TS Information Descriptor
(50) Extended Broadcaster Descriptor
(51) Logo Transmission Descriptor
(52) Content Availability Descriptor
(53) Carousel Compatible Composite Descriptor
(54) Conditional Playback Descriptor
(55) AVC Video Descriptor
(56) AVC Timing and HRD Descriptor
(57) Service Group Descriptor
(58) MPEG-4 Audio Descriptor
(59) MPEG-4 Audio Extension Descriptor
(60) Registration Descriptor
(61) Data Broadcast Id Descriptor
(62) Access Control Descriptor
(63) Area Broadcasting Information Descriptor
(64) Material Information Descriptor
(65) HEVC Video Descriptor
(66) Hierarchy Descriptor
(67) Hybrid Information Descriptor
(68) Scrambler Descriptor
(69) Descriptor set by the operator
 <Descriptors used in digital broadcasting>
 FIG. 9G shows a list of the descriptors used outside the program sequence information of a broadcasting system using the MPEG-2 TS method. In this embodiment, the descriptors shown below are used outside the program sequence information.
(1) Partial Transport Stream Descriptor
(2) Network Identification Descriptor
(3) Partial Transport Stream Time Descriptor
(4) Download Content Descriptor
(5) CA EMM TS Descriptor
(6) CA Contract Information Descriptor
(7) CA Service Descriptor
(8) Carousel Identifier Descriptor
(9) Association Tag Descriptor
(10) Deferred Association tags Descriptor
(11) Network Download Content Descriptor
(12) Download Protection Descriptor
(13) CA Startup Descriptor
(14) Descriptor set by the operator
 <Descriptors used in the INT>
 FIG. 9H shows a list of the descriptors used in the INT of a broadcasting system using the MPEG-2 TS method. In this embodiment, the descriptors shown below are used in the INT. Note that the descriptors used in the program sequence information and the descriptors used outside the program sequence information described above are not used in the INT.
(1) Target Smartcard Descriptor
(2) Target IP Address Descriptor
(3) Target IPv6 Address Descriptor
(4) IP/MAC Platform Name Descriptor
(5) IP/MAC Platform Provider Name Descriptor
(6) IP/MAC Stream Location Descriptor
(7) Descriptor set by the operator
<Descriptors used in AIT>
FIG. 9I shows a list of descriptors used in the AIT of the MPEG-2 TS broadcasting system. In this embodiment, the descriptors listed below are used in the AIT. Note that the descriptors used in the above-mentioned program sequence information and the descriptors used outside the program sequence information are not used in the AIT.
(1) Application Descriptor
(2) Transport Protocol Descriptor
(3) Simple Application Location Descriptor
(4) Application Boundary and Permission Descriptor
(5) Autostart Priority Descriptor
(6) Cache Control Info Descriptor
(7) Randomized Latency Descriptor
(8) External Application Control Descriptor
(9) Playback Application Descriptor
(10) Simple Playback Application Location Descriptor
(11) Application Expiration Descriptor
(12) Descriptor set by the operator
[MMT method]
The broadcasting system of this embodiment can also support the MMT system as a media transport system for transmitting data such as video and audio. Specifically, among the streams corresponding to the OFDM transmission waves of FIG. 4D(2) and FIG. 4D(3), the format of the stream transmitted in the layer that carries advanced digital terrestrial broadcasting services is, in principle, the MMT format. Likewise, among the streams obtained by demodulating the transmission wave in the second tuner/demodulator 130T of the broadcast receiving apparatus 100 in FIG. 2, the format of the stream corresponding to the layer that carries advanced digital terrestrial broadcasting services is, in principle, the MMT format. Similarly, among the streams obtained by demodulating the transmission wave in the third tuner/demodulator 130L, the format of the stream corresponding to the layer that carries advanced digital terrestrial broadcasting services is, in principle, the MMT format. The format of the stream obtained by demodulating the transmission wave in the fourth tuner/demodulator 130B is also the MMT format. As a modification, an MPEG-2 TS stream may be operated in an advanced digital terrestrial broadcasting service.
 The MMT system is a newly developed media transport system, created because the functions of the MPEG-2 TS system have reached their limits in the face of recent changes in the content distribution environment, such as the diversification of content, of the devices that use content, of the transmission paths that distribute content, and of content storage environments.
 The video and audio signals of a broadcast program are encoded as MFU (Media Fragment Unit)/MPU (Media Processing Unit) units, placed in an MMTP (MMT Protocol) payload, packetized into MMTP packets, and transmitted as IP packets. Data content and subtitle signals related to the broadcast program are likewise put into MFU/MPU form, placed in an MMTP payload, packetized into MMTP packets, and transmitted as IP packets.
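The MFU/MPU to MMTP to IP layering described above can be sketched as follows. This is an illustrative model only, assuming simplified header layouts; the class names, field names, and byte widths are placeholders, not the actual MMTP packet syntax.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class MFU:
    """Media Fragment Unit: one fragment of encoded media data."""
    data: bytes


@dataclass
class MPU:
    """Media Processing Unit: a self-contained group of MFUs."""
    sequence_number: int
    fragments: List[MFU]


def build_mmtp_packet(packet_id: int, mpu: MPU) -> bytes:
    """Place an MPU in a (simplified) MMTP packet: toy header + payload."""
    payload = b"".join(f.data for f in mpu.fragments)
    header = packet_id.to_bytes(2, "big") + mpu.sequence_number.to_bytes(4, "big")
    return header + payload


def build_ip_datagram(mmtp_packet: bytes) -> bytes:
    """Wrap the MMTP packet for transmission as a (simplified) UDP/IP datagram."""
    length_field = len(mmtp_packet).to_bytes(2, "big")
    return length_field + mmtp_packet


mpu = MPU(sequence_number=1, fragments=[MFU(b"\x00\x01"), MFU(b"\x02")])
datagram = build_ip_datagram(build_mmtp_packet(packet_id=0x0100, mpu=mpu))
```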
 For the transmission of MMTP packets, UDP/IP (User Datagram Protocol/Internet Protocol) is used on the broadcast transmission path, and UDP/IP or TCP/IP (Transmission Control Protocol/Internet Protocol) is used on communication lines. On the broadcast transmission path, a TLV multiplexing scheme may also be used for efficient transmission of IP packets.
 FIG. 10A shows the MMT protocol stack on the broadcast transmission path, and FIG. 10B shows the MMT protocol stack on a communication line. The MMT system provides a mechanism for transmitting two types of control information: MMT-SI and TLV-SI. MMT-SI is control information indicating the structure of broadcast programs and the like; it takes the form of MMT control messages, which are placed in MMTP payloads, packetized into MMTP packets, and transmitted as IP packets. TLV-SI is control information concerning the multiplexing of IP packets, and provides information for channel selection and the correspondence between IP addresses and services.
[Control signal for broadcasting system using MMT system]
As described above, in the MMT method, TLV-SI and MMT-SI are prepared as control information. TLV-SI consists of tables and descriptors. Tables are transmitted in section format, and descriptors are placed within the tables. MMT-SI consists of three layers: messages that store tables and descriptors, tables that have elements and attributes that indicate specific information, and descriptors that indicate more detailed information.
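The three-layer MMT-SI structure described above (messages holding tables, tables holding descriptors) can be modeled as nested records. This is a minimal sketch; the identifiers and example values below are hypothetical placeholders, not actual MMT-SI syntax.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Descriptor:
    """Bottom layer: detailed information (e.g. an MPU Timestamp Descriptor)."""
    tag: int
    data: bytes = b""


@dataclass
class Table:
    """Middle layer: elements/attributes plus descriptors (e.g. an MPT)."""
    table_id: int
    descriptors: List[Descriptor] = field(default_factory=list)


@dataclass
class Message:
    """Top layer: a message that stores tables (e.g. a PA message)."""
    message_id: int
    tables: List[Table] = field(default_factory=list)


# A PA message carrying one table that in turn carries one descriptor.
pa = Message(
    message_id=0x0000,
    tables=[Table(table_id=0x20, descriptors=[Descriptor(tag=0x0001)])],
)
```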
<Tables used in TLV-SI>
FIG. 10C shows a list of tables used in the TLV-SI of the MMT broadcasting system. In this embodiment, the tables listed below are used as TLV-SI tables. Tables having the same meanings as the tables shown in FIGS. 9B and 9C may also be used.
(1) Network Information Table for TLV
(2) Address Map Table
(3) Table set by the operator
<Descriptors used in TLV-SI>
FIG. 10D shows a list of descriptors used in the TLV-SI of the MMT broadcasting system. In this embodiment, the descriptors listed below are used as TLV-SI descriptors. Descriptors having the same meanings as the descriptors shown in FIGS. 9D, 9E, 9F, 9G, 9H, and 9I may also be used.
(1) Service List Descriptor
(2) Satellite Delivery System Descriptor
(3) System Management Descriptor
(4) Network Name Descriptor
(5) Remote Control Key Descriptor
(6) Descriptor set by the operator
<Messages used in MMT-SI>
FIG. 10E shows a list of messages used in MMT-SI of the MMT broadcasting system. In this embodiment, the following messages are used as MMT-SI messages.
(1) PA (Package Access) message
(2) M2 section message
(3) CA message
(4) M2 short section message
(5) Data transmission message
(6) Message set by the operator
<Tables used in MMT-SI>
FIG. 10F shows a list of tables used in the MMT-SI of the MMT broadcasting system. In this embodiment, the tables listed below are used as MMT-SI tables. Tables having the same meanings as the tables shown in FIGS. 9B and 9C may also be used.
(1) MPT (MMT Package Table)
(2) PLT (Package List Table)
(3) LCT (Layout Configuration Table)
(4) ECM (Entitlement Control Message)
(5) EMM (Entitlement Management Message)
(6) CAT (MH) (Conditional Access Table (MH))
(7) DCM (Download Control Message)
(8) DMM (Download Management Message)
(9) MH-EIT (MH-Event Information Table)
(10) MH-AIT (MH-Application Information Table)
(11) MH-BIT (MH-Broadcaster Information Table)
(12) MH-SDTT (MH-Software Download Trigger Table)
(13) MH-SDT (MH-Service Description Table)
(14) MH-TOT (MH-Time Offset Table)
(15) MH-CDT (MH-Common Data Table)
(16) MH-DIT (MH-Discontinuity Information Table)
(17) MH-SIT (MH-Selection Information Table)
(18) DDM table (Data Directory Management Table)
(19) DAM table (Data Asset Management Table)
(20) DCC table (Data Content Configuration Table)
(21) EMT (Event Message Table)
(22) Table set by the operator
<Descriptors used in MMT-SI>
FIGS. 10G, 10H, and 10I show a list of descriptors used in the MMT-SI of the MMT broadcasting system. In this embodiment, the descriptors listed below are used as MMT-SI descriptors. Descriptors having the same meanings as the descriptors shown in FIGS. 9D, 9E, 9F, 9G, 9H, and 9I may also be used.
(1) Asset Group Descriptor
(2) Event Package Descriptor
(3) Background Color Descriptor
(4) MPU Presentation Region Descriptor
(5) MPU Timestamp Descriptor
(6) Dependency Descriptor
(7) Access Control Descriptor
(8) Scrambler Descriptor
(9) Message Authentication Method Descriptor
(10) Emergency Information Descriptor
(11) MH-MPEG-4 Audio Descriptor
(12) MH-MPEG-4 Audio Extension Descriptor
(13) MH-HEVC Descriptor
(14) MH-Linkage Descriptor
(15) MH-Event Group Descriptor
(16) MH-Service List Descriptor
(17) MH-Short Event Descriptor
(18) MH-Extended Event Descriptor
(19) Video Component Descriptor
(20) MH-Stream Identifier Descriptor
(21) MH-Content Descriptor
(22) MH-Parental Rating Descriptor
(23) MH-Audio Component Descriptor
(24) MH-Target Region Descriptor
(25) MH-Series Descriptor
(26) MH-SI Parameter Descriptor
(27) MH-Broadcaster Name Descriptor
(28) MH-Service Descriptor
(29) IP Data Flow Descriptor
(30) MH-CA Startup Descriptor
(31) MH-Type Descriptor
(32) MH-Info Descriptor
(33) MH-Expire Descriptor
(34) MH-Compression Type Descriptor
(35) MH-Data Component Descriptor
(36) UTC-NPT Reference Descriptor
(37) Event Message Descriptor
(38) MH-Local Time Offset Descriptor
(39) MH-Component Group Descriptor
(40) MH-Logo Transmission Descriptor
(41) MPU Extended Timestamp Descriptor
(42) MPU Download Content Descriptor
(43) MH-Network Download Content Descriptor
(44) MH-Application Descriptor
(45) MH-Transport Protocol Descriptor
(46) MH-Simple Application Location Descriptor
(47) MH-Application Boundary and Permission Descriptor
(48) MH-Autostart Priority Descriptor
(49) MH-Cache Control Info Descriptor
(50) MH-Randomized Latency Descriptor
(51) Linked PU Descriptor
(52) Locked Cache Descriptor
(53) Unlocked Cache Descriptor
(54) MH-DL Protection Descriptor
(55) Application Service Descriptor
(56) MPU Node Descriptor
(57) PU Structure Descriptor
(58) MH-Hierarchy Descriptor
(59) Content Copy Control Descriptor
(60) Content Usage Control Descriptor
(61) Emergency News Descriptor
(62) MH-CA Contract Info Descriptor
(63) MH-CA Service Descriptor
(64) MH-External Application Control Descriptor
(65) MH-Playback Application Descriptor
(66) MH-Simple Playback Application Location Descriptor
(67) MH-Application Expiration Descriptor
(68) Related Broadcaster Descriptor
(69) Multimedia Service Information Descriptor
(70) MH-Stuffing Descriptor
(71) MH-Broadcast ID Descriptor
(72) MH-Network Identification Descriptor
(73) Descriptor set by the operator
<Relationship between data transmission and each control information in MMT method>
FIG. 10J shows the relationship between data transmission and typical tables in an MMT broadcasting system.
 In an MMT broadcasting system, data can be transmitted over multiple routes, such as a TLV stream via the broadcast transmission path and an IP data flow via a communication line. A TLV stream contains TLV-SI such as TLV-NIT and AMT, together with an IP data flow, which is a flow of IP packets. The IP data flow contains video assets, each consisting of a series of video MPUs, and audio assets, each consisting of a series of audio MPUs. The IP data flow may further contain subtitle assets consisting of a series of subtitle MPUs, superimposed-text assets consisting of a series of superimposed-text MPUs, data assets consisting of a series of data MPUs, and so on. These various assets are associated on a package basis by the MPT (MMT Package Table), which is stored in a PA message and transmitted. Specifically, the package ID and the asset ID of each asset included in the package are written in the MPT in association with each other.
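The package/asset association carried by the MPT can be illustrated with a simple in-memory model. The class and field names below are hypothetical stand-ins, not actual MPT syntax; only the idea of recording a package ID together with the asset IDs, and a reference destination per asset, is taken from the text.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AssetEntry:
    asset_id: str     # e.g. "video-asset-1"
    asset_type: str   # e.g. "hev1" for HEVC video, "mp4a" for audio
    location: str     # reference destination of the asset (simplified)


@dataclass
class MPTEntry:
    package_id: str
    assets: List[AssetEntry]


mpt = MPTEntry(
    package_id="package-1",
    assets=[
        AssetEntry("video-asset-1", "hev1", "same IP data flow as the MPT"),
        AssetEntry("audio-asset-1", "mp4a", "same IP data flow as the MPT"),
        AssetEntry("data-asset-1", "app", "https://example.com/data"),
    ],
)


def assets_of(mpt: MPTEntry, asset_type: str) -> List[str]:
    """Resolve all asset IDs of a given type within the package."""
    return [a.asset_id for a in mpt.assets if a.asset_type == asset_type]
```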
The assets that make up a package can be limited to assets in the TLV stream, but as shown in FIG. 10J, they can also include assets transmitted in the IP data flow of a communication line. This can be realized by including the location information of each asset of the package in the MPT, so that the broadcast receiving device 100 can identify the reference destination of each asset. The location information of each asset can designate:
(1) Data multiplexed in the same IP data flow as the MPT
(2) Data multiplexed in an IPv4 data flow
(3) Data multiplexed in an IPv6 data flow
(4) Data multiplexed in a broadcast MPEG-2 TS
(5) Data multiplexed in MPEG-2 TS format within an IP data flow
(6) Data at a specified URL
In this way, various kinds of data transmitted over various transmission routes can be specified.
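The six kinds of location information above can be sketched as an enumeration plus a dispatch function that decides which route a receiver would read an asset from. The numeric codes and the `fetch_route()` helper are illustrative assumptions, not actual location_type values from the specification.

```python
from enum import Enum


class LocationType(Enum):
    SAME_IP_DATA_FLOW = 0   # same IP data flow as the MPT
    IPV4_DATA_FLOW = 1
    IPV6_DATA_FLOW = 2
    BROADCAST_MPEG2_TS = 3
    MPEG2_TS_IN_IP = 4
    URL = 5


def fetch_route(loc: LocationType) -> str:
    """Return which transmission route a receiver would read the asset from."""
    if loc in (LocationType.SAME_IP_DATA_FLOW, LocationType.BROADCAST_MPEG2_TS):
        return "broadcast transmission path"
    if loc is LocationType.URL:
        return "communication line (HTTP)"
    return "communication line (IP data flow)"
```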
 The MMT broadcasting system additionally has the concept of an event. An event is a concept representing a so-called program, and is handled by the MH-EIT, which is transmitted in an M2 section message. Specifically, an event comprises the series of data that, within the package indicated by the event package descriptor stored in the MH-EIT, falls in the period of the stated duration starting from the start time stored in the MH-EIT. The broadcast receiving device 100 can use the MH-EIT for various kinds of event-based processing (for example, program guide generation, control of recording reservations and viewing reservations, and copyright management processing such as temporary storage).
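The event time window handled via the MH-EIT, a start time plus a duration, amounts to a simple containment check. This sketch uses ordinary datetime values and ignores the MJD/BCD encoding actually used for times in SI tables; the date values are arbitrary examples.

```python
from datetime import datetime, timedelta


def belongs_to_event(start: datetime, duration: timedelta, t: datetime) -> bool:
    """True if timestamp t falls within [start, start + duration)."""
    return start <= t < start + duration


start = datetime(2024, 8, 1, 21, 0)
# A data unit at 21:30 belongs to a one-hour event starting at 21:00;
# one at exactly 22:00 belongs to the next event.
in_event = belongs_to_event(start, timedelta(hours=1), datetime(2024, 8, 1, 21, 30))
after_event = belongs_to_event(start, timedelta(hours=1), datetime(2024, 8, 1, 22, 0))
```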
[Broadcast receiving device channel setting process]
<Initial scan>
In current digital terrestrial broadcasting, the network ID differs for each transmission master, and information about other stations is generally not described in the NIT. Therefore, the broadcast receiving apparatus 100 of the embodiment of the present invention, which is compatible with current digital terrestrial broadcasting, needs to have a function of searching (scanning) all receivable channels at the reception point and creating a service list (receivable frequency table) based on service IDs for the digital terrestrial broadcasting of the embodiment of the present invention (advanced digital terrestrial broadcasting, or digital terrestrial broadcasting in which advanced digital terrestrial broadcasting and current digital terrestrial broadcasting are transmitted simultaneously on separate layers). In areas where the same network ID can be received on different physical channels due to an MFN (Multi Frequency Network), the apparatus should basically select the channel with the better reception C/N or BER (Bit Error Rate) and store it in the service list.
 Note that for advanced BS digital broadcasting or advanced CS digital broadcasting received by the fourth tuner/demodulator 130B of the broadcast receiving apparatus 100 of the embodiment of the present invention, the broadcast receiving apparatus 100 only needs to acquire and store the service list contained in the TLV-NIT; there is no need to create a service list. Therefore, for advanced BS digital broadcasting or advanced CS digital broadcasting received by the fourth tuner/demodulator 130B, the initial scan and the rescan described later are unnecessary.
<Rescan>
The broadcast receiving apparatus 100 according to the embodiment of the present invention has a rescan function in preparation for the opening of a new station, installation of a new relay station, change of reception point of a television receiver, and the like. When changing the preset information, the broadcast receiving apparatus 100 can notify the user to that effect.
<Example of operation during initial scan/rescan>
FIG. 11A shows an example of an operation sequence of channel setting processing (initial scan/rescan) of the broadcast receiving apparatus 100 according to the embodiment of the present invention. Note that although the figure shows an example where MPEG-2 TS is adopted as the media transport method, the processing is basically the same when the MMT method is adopted.
 In the channel setting process, the reception function control unit 1102 first sets the residential area (selects the area where the broadcast receiving apparatus 100 is installed) based on the user's instruction (S101). At this time, instead of a user instruction, the residential area may be set automatically based on installation position information of the broadcast receiving apparatus 100 acquired through predetermined processing. As examples of the acquisition of installation position information, the information may be acquired from a network to which the LAN communication unit 121 is connected, or information on the installation position may be acquired from an external device to which the digital interface unit 125 is connected. Next, the reception function control unit 1102 sets the initial value of the frequency range to be scanned, and instructs the tuner/demodulator (this term is used when the first tuner/demodulator 130C, the second tuner/demodulator 130T, and the third tuner/demodulator 130L are not distinguished; the same applies hereinafter) to tune to the set frequency (S102).
 The tuner/demodulator executes tuning based on the instruction (S103). If it succeeds in locking to the set frequency (S103: Yes), the process proceeds to S104; if the lock is not successful (S103: No), the process proceeds to S111. In the process of S104, the C/N is checked (S104); if a C/N equal to or greater than a predetermined value is obtained (S104: Yes), the process proceeds to S105 and reception confirmation processing is performed. If a C/N equal to or greater than the predetermined value is not obtained (S104: No), the process proceeds to S111.
 In the reception confirmation processing, the reception function control unit 1102 first acquires the BER of the received broadcast wave (S105). Next, it acquires and checks the NIT to confirm whether the NIT is valid data (S106). If the NIT acquired in the process of S106 is valid data, the reception function control unit 1102 acquires information such as the transport stream ID and the original network ID from the NIT. It also acquires, from the terrestrial distribution system descriptor, distribution system information on the physical conditions of the broadcast transmission path corresponding to each transport stream ID/original network ID, and acquires a list of service IDs from the service list descriptor.
 Next, the reception function control unit 1102 checks the service list stored in the receiving device to confirm whether the transport stream ID acquired in the process of S106 has already been acquired (S107). If the transport stream ID acquired in the process of S106 has not yet been acquired (S107: No), the various information acquired in the process of S106 is associated with the transport stream ID and added to the service list (S108). If the transport stream ID acquired in the process of S106 has already been acquired (S107: Yes), the BER acquired in the process of S105 is compared with the BER measured when the transport stream ID already recorded in the service list was acquired (S109). If the BER acquired in the process of S105 is better (S109: Yes), the service list is updated with the various information acquired in the process of S106 (S110). If the BER acquired in the process of S105 is not better (S109: No), the various information acquired in the process of S106 is discarded.
 Furthermore, during the above-described service list creation (addition/update) processing, the remote control key ID may be acquired from the TS information descriptor and associated with a representative service of each transport stream. This processing enables the one-touch channel selection described later.
 When the reception confirmation processing is finished, the reception function control unit 1102 checks whether the current frequency setting is the final value of the frequency range to be scanned (S111). If the current frequency setting is not the final value of the frequency range to be scanned (S111: No), the frequency value set in the tuner/demodulator is stepped up (S112), and the processes of S103 to S110 are repeated. If the current frequency setting is the final value of the frequency range to be scanned (S111: Yes), the process proceeds to S113.
 In the process of S113, the service list created (added/updated) by the above processing is presented to the user as the result of the channel setting processing (S113). If there is a duplicated remote control key or the like, the user may be notified of this fact and prompted to change the remote control key setting (S114). The service list created/updated by the above processing is stored in a nonvolatile memory such as the ROM 103 or the storage (accumulation) unit 110 of the broadcast receiving apparatus 100.
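The scan sequence S101 to S114 can be summarized as a minimal loop sketch. The helper callables (tune, measure_cn, measure_ber, parse_nit) are hypothetical stand-ins for the tuner/demodulator and SI-parsing operations described in the text, and a lower BER is treated as better.

```python
def channel_scan(freqs, tune, measure_cn, measure_ber, parse_nit,
                 service_list, cn_threshold):
    """Build/update a service list keyed by transport stream ID."""
    for freq in freqs:                       # S102/S111/S112: step through range
        if not tune(freq):                   # S103: lock check
            continue
        if measure_cn(freq) < cn_threshold:  # S104: C/N check
            continue
        ber = measure_ber(freq)              # S105: acquire BER
        nit = parse_nit(freq)                # S106: NIT validity check
        if nit is None:
            continue
        ts_id = nit["transport_stream_id"]
        entry = service_list.get(ts_id)
        if entry is None:                    # S107/S108: new TS, add it
            service_list[ts_id] = {"ber": ber, "info": nit}
        elif ber < entry["ber"]:             # S109/S110: better BER, update
            service_list[ts_id] = {"ber": ber, "info": nit}
    return service_list                      # S113: result presented to the user


# Example: the same transport stream received on two frequencies (MFN case);
# the entry with the better (lower) BER wins.
scanned = channel_scan(
    freqs=[473, 479],
    tune=lambda f: True,
    measure_cn=lambda f: 30,
    measure_ber=lambda f: {473: 1e-4, 479: 1e-6}[f],
    parse_nit=lambda f: {"transport_stream_id": 1, "frequency": f},
    service_list={},
    cn_threshold=20,
)
```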
 FIG. 11B shows an example of the data structure of the NIT. In the figure, "transport_stream_id" corresponds to the above-mentioned transport stream ID, and "original_network_id" corresponds to the original network ID. FIG. 11C shows an example of the data structure of the terrestrial distribution system descriptor; "guard_interval", "transmission_mode", "frequency", and the like in the figure correspond to the above-mentioned distribution system information. FIG. 11D shows an example of the data structure of the service list descriptor; "service_id" in the figure corresponds to the above-mentioned service ID. FIG. 11E shows an example of the data structure of the TS information descriptor; "remote_control_key_id" in the figure corresponds to the above-mentioned remote control key ID.
 Note that the broadcast receiving device 100 may be controlled so as to change the frequency range to be scanned as appropriate according to the broadcast service being received. For example, when the broadcast receiving device 100 is receiving broadcast waves of the current digital terrestrial broadcasting service, it is controlled to scan the frequency range of 470 MHz to 770 MHz (corresponding to physical channels 13 to 62). That is, the initial value of the frequency range is set to 470 MHz to 476 MHz (center frequency 473 MHz), the final value of the frequency range is set to 764 MHz to 770 MHz (center frequency 767 MHz), and in the processing of S112 the frequency value is increased by +6 MHz.
 When the broadcast receiving device 100 is receiving broadcast waves that include the advanced digital terrestrial broadcasting service, it is controlled to scan the frequency range of 470 MHz to 1010 MHz (because the frequency conversion processing shown in FIG. 7D or the frequency conversion amplification processing shown in FIG. 8C may have been performed). That is, the initial value of the frequency range is set to 470 MHz to 476 MHz (center frequency 473 MHz), the final value of the frequency range is set to 1004 MHz to 1010 MHz (center frequency 1007 MHz), and in the processing of S112 the frequency value is increased by +6 MHz. Even when the broadcast receiving device 100 is receiving the advanced digital terrestrial broadcasting service, if it is determined that the aforementioned frequency conversion processing or frequency conversion amplification processing has not been performed, it suffices to scan only the 470 MHz to 770 MHz range. The broadcast receiving device 100 can control the selection of the frequency range to be scanned based on the system identification, the frequency conversion processing identification, and the like in the TMCC information.
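By way of illustration, the scan loop of S103 to S112 with the service-dependent choice of frequency range can be sketched as follows. This is a minimal sketch under the assumptions stated in the comments; the function and variable names (`scan_channels`, `tune_and_check`, etc.) are illustrative and not part of the embodiment.

```python
# Illustrative sketch of the initial-scan loop (S103-S112): step the tuner
# center frequency in +6 MHz increments over a range chosen by service type.
# Hypothetical names; tune_and_check stands in for the S103-S110 processing.

CH_WIDTH_MHZ = 6

def scan_range(advanced_with_conversion: bool):
    """Return (first, last) center frequencies in MHz for the scan."""
    # Current terrestrial service: 470-770 MHz (physical channels 13-62).
    # Advanced service with frequency conversion/amplification: up to 1010 MHz.
    last = 1007 if advanced_with_conversion else 767
    return 473, last

def scan_channels(tune_and_check, advanced_with_conversion=False):
    """Run the scan loop; tune_and_check(freq) returns found services or []."""
    freq, last = scan_range(advanced_with_conversion)
    service_list = []
    while True:
        service_list.extend(tune_and_check(freq))  # S103-S110 at this frequency
        if freq >= last:                           # S111: final value reached?
            break
        freq += CH_WIDTH_MHZ                       # S112: +6 MHz step
    return service_list
```

With the current-service range this visits 50 center frequencies (channels 13 to 62); with the extended range it visits 90.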
 Furthermore, when the broadcasting system of the embodiment of the present invention has, for example, the configuration shown in FIG. 7C and the broadcast receiving device 100 is receiving an advanced digital terrestrial broadcasting service using the dual-polarization transmission method, one of the channel selection/detection units 131H and 131V may scan the 470 MHz to 770 MHz range while the other scans the 770 MHz to 1010 MHz range (in the case where frequency conversion processing has been applied to the transmission wave of the polarization detected by that other channel selection/detection unit). Controlling in this way based on the system identification and the frequency conversion processing identification in the TMCC information makes it possible to omit scanning of unnecessary frequency ranges and to reduce the time required for channel setting. Furthermore, in this case, the operation sequence of FIG. 11A may be advanced in parallel in both the channel selection/detection unit 131H and the channel selection/detection unit 131V, with their loops of the frequency-up step S112 synchronized. If, in the same iteration of the frequency-up loop of the operation sequence of FIG. 11A, the pair of horizontally polarized and vertically polarized signals transmitted on the same physical channel is received in parallel, the control information and the like inside the packet stream of the advanced digital terrestrial service transmitted by that pair of signals can be decoded and acquired during the loop processing. This is preferable because the scan and the creation of the service list proceed efficiently.
 Similarly, when the broadcast receiving device 100 has the configuration shown in FIG. 8B further provided with a plurality of tuner/demodulation units (channel selection/detection units), a so-called double-tuner configuration (for example, a configuration with a plurality of third tuner/demodulation units 130L; the configuration shown in FIG. 8D may also be used), and is receiving an advanced digital terrestrial broadcasting service using the layered division multiplexing transmission method, one of the double tuners may scan the 470 MHz to 770 MHz range and the other the 770 MHz to 1010 MHz range (in the case where frequency conversion amplification processing has been performed). Controlling in this way makes it possible to reduce the time required for channel setting, as described above.
 As explained with reference to FIGS. 8A, 8B, and 8C, in the configuration shown in FIG. 8B, the digital terrestrial broadcasting service transmitted in either the upper layer or the lower layer is the current digital terrestrial broadcasting service. Therefore, for example, of the 470 MHz to 770 MHz range and the 770 MHz to 1010 MHz range, the first tuner/demodulation unit 130C may scan the range in which the current digital terrestrial broadcasting service is transmitted, while the third tuner/demodulation unit 130L scans the other range in parallel. In this case as well, as with the parallel scan by the double tuner of the third tuner/demodulation unit 130L described above, the time required for channel setting can be reduced. Which of the 470 MHz to 770 MHz range and the 770 MHz to 1010 MHz range carries the current digital terrestrial broadcasting service and which carries the advanced digital terrestrial broadcasting service can be identified before starting the initial scan/rescan operation sequence by receiving, with the third tuner/demodulation unit 130L, one point in each range, two points in total (for example, 470 MHz to 476 MHz (center frequency 473 MHz) and 770 MHz to 776 MHz (center frequency 773 MHz)), acquiring the TMCC information transmitted on each frequency, and referring to the parameters stored in that TMCC information (for example, the system identification parameter).
 Note that in an advanced digital terrestrial broadcasting service using the dual-polarization transmission method, in the case of a channel that has a broadcast program transmitted using both horizontally polarized and vertically polarized signals, such as the C-layer 4K broadcast program shown in layer division example (1) of FIG. 7A, the same transport ID is detected in both the 470 MHz to 770 MHz scan and the 770 MHz to 1010 MHz scan, but this is entered in the service list as a single channel. In the case of the B-layer 2K broadcast program shown in the same figure, if the same broadcast program is transmitted on the B layer of the horizontally polarized signal and the B layer of the vertically polarized signal, it suffices to store it in the service list as a single channel even if the same transport ID is detected. That is, when the same broadcast program is transmitted in the same layer on different polarizations, it is recognized as merged into one channel and is not recognized as separate channels. In this way, in channel selection processing using the service list, user confusion caused by exactly the same broadcast program existing on separate channels can be avoided.
 In contrast, in an advanced digital terrestrial broadcasting service using the dual-polarization transmission method, when different broadcast programs are transmitted on the B layer of the horizontally polarized signal and the B layer of the vertically polarized signal (when the B layer of the vertically polarized signal is treated as a virtual D layer), they are stored in the service list as different channels. Whether the same broadcast program is transmitted on the B layer of the horizontally polarized signal and the B layer of the vertically polarized signal can be identified in the broadcast receiving device 100 by referring to the additional layer transmission identification parameter or the like in the TMCC information.
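The merging rule above can be sketched as follows. This is an illustrative sketch only; the record fields and function names are assumptions, and the decision input (`different_program_on_v`) stands in for the result of checking the additional layer transmission identification parameter of the TMCC information.

```python
# Illustrative sketch of merging dual-polarization scan results into a service
# list: a service found with the same transport ID and service ID on both
# polarizations is merged into one channel, unless the TMCC additional-layer
# flag indicates the vertical B layer carries a different program (virtual D
# layer), in which case the polarization is kept as part of the channel key.
# All field names here are hypothetical.

def merge_services(found, different_program_on_v=False):
    """found: list of dicts with 'transport_id', 'service_id', 'polarization'."""
    merged = {}
    for svc in found:
        key = (svc["transport_id"], svc["service_id"])
        if different_program_on_v:
            # Treat the vertical-polarization entry as a separate channel.
            key = key + (svc["polarization"],)
        merged.setdefault(key, svc)  # first occurrence wins; duplicates merge
    return list(merged.values())
```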
 [Channel selection processing of the broadcast receiving device]
 The broadcast receiving device 100 of the embodiment of the present invention has, as program channel selection functions, one-touch selection using the one-touch keys of the remote control, channel up/down selection using the channel up/down keys of the remote control, and direct selection by direct input of a three-digit number using the numeric keys of the remote control. Any of these selection functions may be performed using the information stored in the service list generated by the initial scan/rescan described above. After channel selection, information on the selected channel (the three-digit number used for direct selection, branch number, TS name, service name, logo, video resolution information (distinction among UHD, HD, SD, etc.), presence or absence of video resolution up/down conversion, number of audio channels, presence or absence of audio downmix, etc.) is displayed by a banner display or the like. In this way, the user can visually obtain information on the channel after selection and confirm whether the desired channel has been selected. An example of the processing for each channel selection method is described below.
 <Processing example of one-touch channel selection>
(1) When a one-touch key of the remote control is pressed, the service with the "service_id" specified by "remote_control_key_id" is selected.
(2) The last mode is set and the channel information after selection is displayed.
 <Processing example of up/down channel selection using the channel up/down keys>
(1) When the channel up/down key of the remote control is pressed, selection is performed in the order of the three-digit numbers used for direct selection.
(1-1) When the up key is pressed, the service adjacent above in three-digit-number order is selected. However, if the current three-digit number is the maximum value in the service list, the service with the minimum number is selected.
(1-2) When the down key is pressed, the service adjacent below in three-digit-number order is selected. However, if the current three-digit number is the minimum value in the service list, the service with the maximum number is selected.
(2) The last mode is set and the channel information after selection is displayed.
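The wrap-around rule of the up/down selection can be sketched as follows (a minimal illustration; the function name is not from the embodiment):

```python
# Minimal sketch of up/down selection: move through the service list in
# three-digit-number order, wrapping from maximum to minimum and vice versa.

def select_up_down(numbers, current, up=True):
    """numbers: sorted list of three-digit service numbers; returns the next one."""
    i = numbers.index(current)
    step = 1 if up else -1
    return numbers[(i + step) % len(numbers)]  # modulo gives the wrap-around
```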
 <Processing example of direct channel selection>
(1) When direct channel selection is selected, the device waits for input of a three-digit number.
(2-1) If input of the three-digit number is not completed within a predetermined time (about 5 seconds), the device returns to the normal mode and displays the channel information of the currently selected service.
(2-2) When input of the three-digit number is completed, it is determined whether the channel exists in the service list of the receivable frequency table; if not, a message such as "This channel does not exist" is displayed.
(3) If the channel exists, channel selection processing is performed, the last mode is set, and the channel information after selection is displayed.
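The branching of the direct-selection flow can be sketched as follows. This is an illustrative sketch: the timeout detection is abstracted into a flag, and the returned action strings are placeholders for the actual device behavior.

```python
# Illustrative sketch of direct selection: after waiting up to ~5 s for three
# digits, look the number up in the service list and branch accordingly.
# Names and return values are hypothetical.

def direct_select(digits, service_numbers, timed_out=False):
    """digits: digits entered so far; service_numbers: valid 3-digit numbers."""
    if timed_out or len(digits) < 3:
        return "return_to_normal_mode"                 # (2-1) input incomplete
    number = int("".join(str(d) for d in digits[:3]))
    if number not in service_numbers:
        return "message:This channel does not exist"   # (2-2) not in list
    return "tune:%d" % number                          # (3) tune, set last mode
```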
 Note that the channel selection operation is performed based on SI, and when it is determined that broadcasting is suspended, the device may also have a function of displaying a message to that effect to notify the user.
 <Remote control of the broadcast receiving device>
 FIG. 12A shows an example of an external view of a remote control (remote controller) used to input operation instructions to the broadcast receiving device 100 of the embodiment of the present invention.
 The remote control 180R includes a power key 180R1 for turning the power of the broadcast receiving device 100 on/off (standby on/off), cursor keys (up, down, left, right) 180R2 for moving the cursor up, down, left, and right, an enter key 180R3 for confirming the item at the cursor position as the selected item, and a back key 180R4.
 The remote control 180R also includes network switching keys (advanced terrestrial digital, terrestrial digital, advanced BS, BS, CS) 180R5 for switching the broadcast network received by the broadcast receiving device 100. The remote control 180R further includes one-touch keys (1 to 12) 180R6 used for one-touch selection, channel up/down keys 180R7 used for channel up/down selection, and numeric keys used for inputting the three-digit number for direct selection. In the example shown in the figure, the numeric keys double as the one-touch keys 180R6; for direct selection, a three-digit number can be input by operating the one-touch keys 180R6 after pressing the direct key 180R8.
 The remote control 180R also includes an EPG key 180R9 for displaying the program guide and a menu key 180RA for displaying the system menu. Detailed operation of the program guide and the system menu is possible with the cursor keys 180R2, the enter key 180R3, and the back key 180R4.
 The remote control 180R also includes a d key 180RB used for data broadcasting services, multimedia services, and the like, a link key 180RC for displaying a list of broadcast-broadband cooperation services and their compatible applications, and color keys (blue, red, green, yellow) 180RD. In data broadcasting services, multimedia services, broadcast-broadband cooperation services, and the like, detailed operation is possible with the cursor keys 180R2, the enter key 180R3, the back key 180R4, and the color keys 180RD.
 The remote control 180R also includes a video key 180RE for selecting related video, an audio key 180RF for switching the audio ES and switching between bilingual audio tracks, and a subtitle key 180RG for switching subtitles on/off and switching the subtitle language. The remote control 180R further includes volume keys 180RH for turning the audio output volume up/down and a mute key 180RI for switching the audio output on/off.
 <Processing example of network switching using the advanced terrestrial digital key>
 The remote control 180R of the broadcast receiving device 100 of the embodiment of the present invention includes, as the network switching keys 180R5, an "advanced terrestrial digital key", a "terrestrial digital key", an "advanced BS key", a "BS key", and a "CS key". Here, the "advanced terrestrial digital key" and the "terrestrial digital key" may be configured so that, when an advanced digital terrestrial broadcasting service simulcasts, for example, a 4K broadcast program and a 2K broadcast program in different layers, selection of the 4K broadcast program is prioritized at channel selection while the "advanced terrestrial digital key" is pressed, and selection of the 2K broadcast program is prioritized while the "terrestrial digital key" is pressed. With this control, for example, when reception of a 4K broadcast program is possible but its transmission wave contains many errors, pressing the "terrestrial digital key" makes it possible to forcibly select the 2K broadcast program. Also, when a 4K broadcast program and a 2K broadcast program are simulcast in different layers and reception of the 4K broadcast program is possible but its transmission wave contains many errors, the 2K broadcast program (the simulcast of the currently selected 4K broadcast program) may be selected even while the "advanced terrestrial digital key" is pressed.
 <Example of screen display at channel selection>
 As described above, the broadcast receiving device 100 of the embodiment of the present invention has a function of displaying the information of the selected channel by a banner display or the like when channel selection is executed by one-touch selection, channel up/down selection, direct selection, or the like.
 FIG. 12B shows an example of the banner display at channel selection. Banner display 192A1 is an example of the banner displayed when a 2K broadcast program is selected; for example, it may display the program name, the program start/end times, the network type, the number of the direct selection key on the remote control, the service logo, and the three-digit number. Banner display 192A2 is an example of the banner displayed when a 4K broadcast program is selected; for example, in addition to the same items as banner display 192A1, a mark symbolizing "advanced", indicating that the program being received is a 4K broadcast program, is also displayed. Further, when resolution conversion processing, downmix processing, or the like has been performed, a display indicating this may be shown. In the example of banner display 192A2, it is displayed, as one example, that down-conversion from UHD resolution to HD resolution and downmix from 22.2 ch to 5.1 ch have been performed.
 By providing these displays in the broadcast receiving device 100, when the same content is simultaneously broadcast as programs of different quality, such as a 2K broadcast program and a 4K broadcast program, by simulcasting or the like, the user can readily grasp which broadcast program is being displayed.
 According to a system for advanced digital broadcasting services having some or all of the functions of the embodiments of the present invention described above, it is possible to provide more sophisticated transmission and reception technology for advanced digital broadcasting services that also takes compatibility with the current digital broadcasting services into consideration. That is, it is possible to provide technology for more suitably transmitting or receiving advanced digital broadcasting services.
 (Embodiment 2)
 [Advanced audio signals]
 This embodiment relates to the handling of advanced audio signals. The audio signal in the current system is a channel-based signal corresponding to speakers. Channel-based signals include 5.1 ch and 22.2 ch signals (here, "ch" is an abbreviation for "channel"). In contrast, this embodiment handles audio signals that include, in addition to channel-based signals, object-based signals and HOA (Higher Order Ambisonics) signals.
 An object-based signal is an audio signal, such as a narrator's voice, whose playback position can be changed on the receiver side, for example placed on the right or on the left. The playback position is not fixed and may be changed dynamically.
 An HOA signal is a signal in which the sound field is expanded as a sum of spherical harmonics. Since the transmission capacity has an upper limit, the expansion is used up to a finite order. Channel-based signals are basically recorded at microphone positions corresponding to a standard speaker arrangement, and are therefore suited to audio reproduction with a group of speakers in the standard arrangement or one close to it. In contrast, since the HOA method records the sound field information of the space independently of any specific speaker arrangement, it is suitable for supporting arbitrary speaker arrangements.
 Examples of standard speaker arrangements are shown in FIGS. 13A, 13B, and 13C. As shown in FIG. 13A, the speakers are divided into three groups, an upper layer, a middle layer, and a lower layer, according to the height of their installation positions. The arrangement of each group is as shown in FIGS. 13B and 13C. FIG. 13B shows the arrangement of a 22.2 ch speaker system, and FIG. 13C shows the arrangement of a 7.1 ch speaker system. The digit after the decimal point in the channel count is the number of channels of low-frequency signals, and the corresponding speakers are LFE1, LFE2, and LFE. The other signal channels are called main channels. Removing the upper-layer speakers from the 7.1 ch speaker system yields the 5.1 ch speaker system.
 In the current system, if the speaker system has the same number of speakers as the number of channels of the channel-based audio signal, the signal is reproduced as is; if the number of speakers differs, format conversion is performed to match the number of speakers of that speaker system before reproduction. In particular, when the number of speakers is smaller than the number of channels of the audio signal, this format conversion is called downmixing. Format conversion is also performed when the speaker positions assumed when the audio signal was created differ from the actual speaker positions: the sound to be output from each actual speaker position is synthesized by weighting and adding the audio signals of the channels. In the standard speaker arrangement, each speaker is placed equidistant from the assumed standard viewing position of the viewer, so no adjustment of the playback time is needed; when the actual speakers are not equidistant from the viewer's position, the playback time may also be adjusted. The format conversion is given by the following Equation 1.
    p^{(ch)}_m(t) = \sum_{n=1}^{N^{(ch)}} g^{(ch)}_{mn} \, s^{(ch)}_n(t - \Delta t_m)    (Equation 1)
 Here, s^{(ch)}_n(t) is a channel-based audio signal transmitted by broadcasting or communication, n is the signal number, and the number of channel-based signals is N^{(ch)}. t is time. p^{(ch)}_m(t) is the audio signal input to a speaker, m is the speaker number, and the number of speakers is M. g^{(ch)}_{mn} is the weighting coefficient for the channel-based signals. Δt_m is the delay-time adjustment corresponding to the deviation of each speaker's distance from R_o, the distance between the standard viewing position and the speaker farthest from it. With R_m the distance between the m-th speaker and the standard viewing position and c the speed of sound, Δt_m is given by the following Equation 2.
    \Delta t_m = (R_o - R_m) / c    (Equation 2)
 Note that when the speakers are not equidistant, the weighting coefficients g^{(ch)}_{mn} may be corrected to (R_m/R_o) times the weighting coefficients g^{(ch)}_{mn} used when the speakers are equidistant, so as to balance the volume among the speakers.
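As a numerical illustration, Equations 1 and 2 can be sketched as follows. This is a simplified sketch under stated assumptions: discrete-time signals, integer-sample delays, and an assumed speed of sound; the function and variable names are illustrative and not part of the embodiment.

```python
# Illustrative sketch of Equations 1 and 2: each speaker feed is a weighted
# sum of the channel-based signals, delayed by dt_m = (Ro - Rm)/c relative to
# the farthest speaker. Integer-sample delays are used for simplicity.

C_SOUND = 340.0  # assumed speed of sound in m/s

def delay_samples(r_m, r_o, fs):
    """Equation 2: delay in samples for a speaker at distance r_m (r_o = max)."""
    return round((r_o - r_m) / C_SOUND * fs)

def format_convert(signals, gains, distances, fs):
    """Equation 1: signals[n] is a sample list; gains[m][n] weights signal n
    into speaker m; distances[m] is the speaker-to-listener distance in m."""
    r_o = max(distances)
    out = []
    for m, row in enumerate(gains):
        d = delay_samples(distances[m], r_o, fs)
        length = len(signals[0])
        feed = [0.0] * length
        for t in range(length):
            src = t - d              # read the source d samples in the past
            if 0 <= src < length:
                feed[t] = sum(g * s[src] for g, s in zip(row, signals))
        out.append(feed)
    return out
```

For instance, with fs = 100 Hz and speakers at 3.4 m and 6.8 m, the nearer speaker's feed is delayed by one sample relative to the farther one.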
 Next, advanced audio signals include object-based signals and HOA signals; as with the format conversion formula for channel-based signals, each is converted into signals input to the speakers by weighting and adding the transmitted signals, as shown in Equations 3 and 4.
    p^{(obj)}_m(t) = \sum_{n=1}^{N^{(obj)}} g^{(obj)}_{mn} \, s^{(obj)}_n(t - \Delta t_m)    (Equation 3)
    p^{(HOA)}_m(t) = \sum_{n=1}^{N^{(HOA)}} g^{(HOA)}_{mn} \, s^{(HOA)}_n(t - \Delta t_m)    (Equation 4)
 Here, the symbols have the same meanings as for the channel-based signals; a superscript (ch) denotes a symbol for a channel-based signal, (obj) a symbol for an object-based signal, and (HOA) a symbol for an HOA signal. Including all of these, in a system handling advanced audio signals, the audio signal p_m(t) input to the speaker system is given by the following Equation 5.
    p_m(t) = p^{(ch)}_m(t) + p^{(obj)}_m(t) + p^{(HOA)}_m(t)    (Equation 5)
 Here, the weighting coefficients g^{(*)}_{mn} are determined by the relationship between the speaker arrangement and the standard viewing position, but the weighting coefficients g^{(obj)}_{mn} for object-based signals are determined also taking into account the playback position of each individual object. Content common to all signals is expressed using the superscript (*).
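Equation 5 itself is a per-speaker, per-sample sum of the three contributions. A minimal sketch, assuming each class has already been converted by its own Equation 1/3/4 step:

```python
# Sketch of Equation 5: the final feed for speaker m is the sample-by-sample
# sum of the channel-based, object-based, and HOA contributions, each produced
# by its own weighted-sum conversion (Equations 1, 3, and 4). Illustrative only.

def mix_classes(p_ch, p_obj, p_hoa):
    """Each argument is a list of per-speaker sample lists of equal shape."""
    return [[a + b + c for a, b, c in zip(ch_m, obj_m, hoa_m)]
            for ch_m, obj_m, hoa_m in zip(p_ch, p_obj, p_hoa)]
```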
 When listening to audio through headphones, the above format conversion is likewise performed with M = 2. However, in the case of headphones, the position of the audio output sections relative to the receiver screen changes with the direction of the viewer's face, so the weighting coefficients g^(*)_mn are determined with this factor also taken into account. FIG. 14A shows the positional relationship with the broadcast receiving device 100 when headphones are used. In this case, the listening position is the midpoint of the line segment connecting the left and right audio output sections of the headphones. Note that in ordinary headphones the audio output sections are built symmetrically, so Δt_m = 0.
 The sound field created by the audio signal presupposes a reference coordinate system whose reference direction is toward the center of the receiver screen. The audio output sections of the headphones, on the other hand, change their position within this reference coordinate system as the user's head rotates (FIG. 14B). Therefore, the weighting coefficients g^(*)_mn for synthesizing the input signals to the headphone audio output sections are calculated taking into account the position of those output sections within the reference coordinate system at that moment. The position of the headphone audio output sections can be obtained, for example, by recording, via user input, the position at which the user faces the center of the receiver screen, and thereafter detecting changes in the direction of the user's face with a gyro sensor or the like mounted on the headphones. Although FIG. 14B shows an arrangement in a plane, the position in the height direction may also be taken into account.
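The head-tracking compensation described here can be sketched as re-expressing each source direction in the listener's head frame before the weighting coefficients are computed. The sign convention (positive yaw = head turned left, matching the document's leftward-positive azimuth) is an assumption:

```python
def relative_azimuth(source_az_deg, head_yaw_deg):
    """Azimuth of a source in the listener's head frame: the head yaw
    reported by the headphone's gyro sensor is subtracted from the source
    azimuth in the screen-referenced frame, wrapped to (-180, 180]."""
    return (source_az_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# Viewer turns 30 degrees to the left; a source at screen center (0 deg)
# now lies 30 degrees to the viewer's right.
assert relative_azimuth(0.0, 30.0) == -30.0
```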
 Next, configuration examples of the audio decoder are shown. Within the overall configuration, this audio decoder is incorporated in the audio decoders 146S and 146U of FIGS. 2F and 2G. FIG. 15A shows a configuration example of the audio decoder 10000 for the case where only channel-based signals are transmitted. First, the core decoder 10001 decodes the audio bitstream, multiplexed and transmitted by broadcasting or communication, into per-channel signals. Next, the format converter 10002 performs the format conversion described above and outputs an audio signal for the speakers and an audio signal for the headphones. Output to external equipment may be performed wirelessly.
 FIG. 15B is a configuration example of an audio decoder 10100 that supports advanced audio signals. In the case of advanced audio signals, the core decoder 10101 first decodes the audio bitstream, multiplexed and transmitted by broadcasting or communication, into the individual signals. Here, the individual signals are the channel-based signals, the object-based signals, and the HOA signals. In the case of advanced audio signals as well, output to external equipment may be performed wirelessly.
 [Channel-based signal processing]
 First, the processing of channel-based signals will be explained. As with current audio signals, a channel-based signal is converted by the format converter 10102 into a signal for each speaker according to Equation 1, in accordance with the speaker arrangement. At the same time, the channel-based signal is also converted into a signal for headphones. For the speaker arrangement information, the arrangement information stored in the receiver is used.
 FIG. 16 shows an example of speaker arrangement information. The arrangement information consists of, for each speaker number, the type of speaker (for a main channel or for a low-frequency channel), the azimuth position, the position in the height direction (elevation or depression angle), and the distance from the viewer's head position. Here, the azimuth position is the angular position with the frontal direction as seen from the viewing position taken as 0°, where positive values denote turning to the left and negative values turning to the right. The position in the height direction takes the horizontal direction as seen from the viewer's head position as 0°, with positive values representing elevation angles and negative values depression angles. From this information and the configuration of the channel-based signal, the above-described weighting coefficients g^(ch)_mn are set. The delay time adjustment Δt_m is performed based on the distance information; when no distance information is available, no delay time adjustment is performed. Although the speaker arrangement information here is expressed in polar coordinates, it may also be expressed in orthogonal coordinates.
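One common way to realize the Δt_m delay adjustment from the distance column of FIG. 16 is time-of-flight alignment, in which nearer speakers are delayed so that all arrivals coincide at the listening position. The document only states that Δt_m is derived from the distance information, so the scheme below is an assumption:

```python
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def delay_compensation(distances_m):
    """Per-speaker delay dt_m aligning arrival times: the farthest speaker
    gets zero delay, nearer speakers are delayed by their path-length
    difference divided by the speed of sound."""
    r_max = max(distances_m)
    return [(r_max - r) / SPEED_OF_SOUND for r in distances_m]

delays = delay_compensation([2.0, 3.0, 2.5])
# The speaker at 3.0 m is not delayed; the one at 2.0 m is delayed ~2.9 ms.
```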
 This speaker arrangement information may be standard arrangement information such as that of a 5.1ch speaker system, or arrangement information specific to the receiver. It may also be the arrangement information of a speaker system customized by the receiver user. In that case, user-customized arrangement information is registered before viewing a program, and the user can set which arrangement information to use. It may also be possible to switch between the speaker system built into the receiver and a user-customized speaker system. Furthermore, the speaker system to be used may be reserved for each program, or a speaker system may be set for each program genre, time slot, or viewer. This enables audio reproduction suited to the program content and the viewing environment at the time, improving convenience for the user.
 Here is an example of the weighting coefficients for the case where a 22.2ch channel-based signal, a standard signal configuration, is output to a 5.1ch speaker system, a standard speaker arrangement. The format conversion formulas are expressed as follows.
  C'  = FC + g1*FLc + g1*FRc + g3*(TpFC + g4*TpC + BtFC)               (Formula 6)
  L'  = FL + g1*FLc + g2*SiL + g3*(TpFL + g2*TpSiL + BtFL)             (Formula 7)
  R'  = FR + g1*FRc + g2*SiR + g3*(TpFR + g2*TpSiR + BtFR)             (Formula 8)
  Ls' = BL + g5*BC + g2*SiL + g3*(TpBL + g5*TpBC + g2*TpSiL + g4*TpC)  (Formula 9)
  Rs' = BR + g5*BC + g2*SiR + g3*(TpBR + g5*TpBC + g2*TpSiR + g4*TpC)  (Formula 10)
  LFE' = g6*(LFE1 + LFE2)                                              (Formula 11)
 The above g1, g2, g3, g4, g5, and g6 are the weighting coefficients (downmix coefficients), whose default values are shown in FIG. 17A. These downmix coefficients are transmitted as metadata of the audio signal, but the default values are used until they are received. These conversion formulas and the default values of the weighting coefficients are the same as those used in a system that handles audio consisting only of channel-based signals. Sharing the processing between the systems also makes it possible to share the signal processing section, which, in the case of a shared receiver, leads to a reduction in the scale of the overall system.
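Formulas 6 through 11 can be implemented as a straightforward per-sample mix. In the sketch below, the channel labels follow the formulas, while the coefficient values are caller-supplied placeholders (the actual defaults are in FIG. 17A and arrive as metadata):

```python
def downmix_222_to_51(ch, g1, g2, g3, g4, g5, g6):
    """Downmix one 22.2ch sample frame to 5.1ch per Formulas 6-11.
    `ch` maps the 22.2ch channel labels used in the formulas to sample
    values; g1..g6 are the downmix coefficients."""
    return {
        "C":  ch["FC"] + g1*ch["FLc"] + g1*ch["FRc"]
              + g3*(ch["TpFC"] + g4*ch["TpC"] + ch["BtFC"]),
        "L":  ch["FL"] + g1*ch["FLc"] + g2*ch["SiL"]
              + g3*(ch["TpFL"] + g2*ch["TpSiL"] + ch["BtFL"]),
        "R":  ch["FR"] + g1*ch["FRc"] + g2*ch["SiR"]
              + g3*(ch["TpFR"] + g2*ch["TpSiR"] + ch["BtFR"]),
        "Ls": ch["BL"] + g5*ch["BC"] + g2*ch["SiL"]
              + g3*(ch["TpBL"] + g5*ch["TpBC"] + g2*ch["TpSiL"] + g4*ch["TpC"]),
        "Rs": ch["BR"] + g5*ch["BC"] + g2*ch["SiR"]
              + g3*(ch["TpBR"] + g5*ch["TpBC"] + g2*ch["TpSiR"] + g4*ch["TpC"]),
        "LFE": g6*(ch["LFE1"] + ch["LFE2"]),
    }

# Silent frame except the front-center channel: it passes through to C'.
ch = {name: 0.0 for name in (
    "FC", "FLc", "FRc", "FL", "FR", "SiL", "SiR", "BL", "BC", "BR",
    "TpFC", "TpC", "TpFL", "TpFR", "TpSiL", "TpSiR", "TpBL", "TpBC", "TpBR",
    "BtFC", "BtFL", "BtFR", "LFE1", "LFE2")}
ch["FC"] = 1.0
out = downmix_222_to_51(ch, g1=0.7, g2=0.7, g3=0.7, g4=0.7, g5=0.7, g6=0.7)
```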
 When converting an audio signal having more channels than 5.1ch into a signal to be output to a 2ch speaker system, the signal is first downmixed to a 5.1ch signal and then downmixed to a 2ch signal. An example of the conversion formulas for this downmix from 5.1ch to 2ch is shown below.
  Lt' = L + g7*C + g8*Ls  (Formula 12)
  Rt' = R + g7*C + g8*Rs  (Formula 13)
The above g7 and g8 are the weighting coefficients (downmix coefficients), whose default values are shown in FIG. 17B. These downmix coefficients are transmitted as metadata of the audio signal, but the default values are used until they are received. These conversion formulas and the default values of the weighting coefficients are the same as those used in a system that handles audio consisting only of channel-based signals. Sharing the processing between the systems also makes it possible to share the signal processing section, which, in the case of a shared receiver, leads to a reduction in the scale of the overall system.
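Formulas 12 and 13 are a conventional fold-down of the center and surround channels into the left/right totals. A minimal sketch with illustrative coefficient values (the actual defaults are in FIG. 17B):

```python
def downmix_51_to_2(l, r, c, ls, rs, g7, g8):
    """Formulas 12-13: fold center (via g7) and surrounds (via g8)
    into the 2ch left/right totals Lt', Rt'."""
    lt = l + g7 * c + g8 * ls
    rt = r + g7 * c + g8 * rs
    return lt, rt

# Illustrative values only; g7 and g8 normally come from metadata.
lt, rt = downmix_51_to_2(1.0, 0.0, 0.5, 0.2, 0.0, g7=0.707, g8=0.707)
```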
 Next, the method of setting the speaker system used during viewing will be explained. Usable speaker systems include the built-in speakers of the receiver, wired external speakers, wireless external speakers, and so on. The speakers to be used may be selected by remote control operation (for example, selection with the arrow buttons) or with a linked terminal such as a smartphone. FIG. 18 shows an example of the selection menu for the speaker settings; in the figure, the "external speaker 1" system is being selected. Connections to external speaker systems may be a mixture of wired and wireless connections. "User defined" refers to a speaker system combining speakers of the individual systems, for example a system combining the built-in speakers with external speakers for expansion. Providing a selection menu makes it easy to select the optimal speaker system according to the viewer's preferences and the viewing environment at the time, improving convenience for the viewer.
 When using the above speaker systems, the arrangement information of FIG. 16 is required. This arrangement information may be input by the viewer, or it may be downloaded from the site of the receiver manufacturer or the speaker manufacturer. The receiver may perform this download upon receiving input of identification information such as the model number of the speaker system. Alternatively, arrangement information recorded in the speaker itself may be transmitted to the receiver. Furthermore, through a cooperative operation between the receiver and the speakers, the actual arrangement state of the speakers may be measured to create and correct the arrangement information; the arrangement state may be measured using, for example, a camera provided in the receiver, the speakers, or both, or a ranging device such as UWB (Ultra Wideband). When the arrangement information is changed by measurement or user input, the superposition coefficients may be changed even during viewing, or the change may wait for a break in the program, a break in the audio output, or a channel change. Appropriately setting and processing the arrangement information of the speaker system results in good audio reproduction.
 Furthermore, when an external speaker system is used, the weighting coefficients used for the format conversion may be given from outside. These weighting coefficients may be input by the user, input to the receiver by communication from the speaker system, or acquired by the receiver from a server.
 In the case of an external speaker system, the speaker system itself may have an audio conversion function. In that case, the receiver may adjust the signals output to the speaker system in response to a request from the speaker system. Adjustments to the output signals include, for example, adjustments to the number of channels of the channel-based signals, the number of object-based signals, the number of HOA signals, and the range of metadata necessary for their reproduction. When the signals that can be output are limited by the program (for example, the number of channels), the output signals may be adjusted for each program. When the external speaker system also includes an audio bitstream decoder, the audio bitstream may be output to the externally connected speaker system through the bitstream output controller 10106 shown in FIG. 15B. Appropriately adjusting the output signals guarantees the operation of the audio output.
 [Object-based signal processing]
 Next, the processing of object-based signals will be explained. Object-based signals are per-sound-source signals, and the core decoder 10101 separates the bitstream into a signal for each sound source. Next, based on the accompanying playback position information, the object renderer 10103 calculates the output signal for each speaker, also taking the arrangement information of the actual speaker system into consideration. The output signal for each speaker may be calculated directly, or the output signals for a standard 22.2ch speaker system may be calculated first and then converted into the output signals for the actual speaker system in accordance with the format conversion of the channel-based signals. This allows part of the processing to be shared.
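Because both rendering stages (object to the standard 22.2ch layout, then 22.2ch to the actual speakers) are linear gain mappings, the two-stage path amounts to a product of gain matrices, which is one reason part of the processing can be shared with the channel-based path. A toy-sized sketch (all gain values and matrix sizes are illustrative):

```python
def matmul(a, b):
    """Plain nested-list matrix product, to keep the sketch dependency-free."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# Toy sizes: 1 object -> 3 intermediate channels -> 2 actual speakers.
obj_to_std = [[0.6], [0.8], [0.0]]           # object gains into the standard layout
std_to_spk = [[1.0, 0.0, 0.5],               # format-conversion gains g^(ch)_mn
              [0.0, 1.0, 0.5]]
obj_to_spk = matmul(std_to_spk, obj_to_std)  # combined per-speaker object gains
```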
 When external speakers are used, the weighting coefficients for calculating the output signals to the speakers may be given from outside. These weighting coefficients may be set by the user, acquired through communication with the speaker system, or acquired from a server. Since the weighting coefficients in this case also vary with the sound source position, they are given in table form or in function form. Alternatively, on the premise that processing up to the 22.2ch signals, which takes the sound source positions into account, is performed inside the receiver, only the conversion coefficients for converting the 22.2ch signals into the signals for the external speakers may be given.
 Object-based signals are divided into those for which the viewer can specify the playback position of the sound source and those for which the viewer cannot. Whether the position can be specified and whether the specified position can be changed are transmitted as metadata for each sound source, and the receiver changes its processing according to this specification. A parameter describing, for each object-based signal, whether the user is permitted to set the playback position is transmitted to the receiver. FIG. 19 is an example of the metadata, showing a case with three types of sound sources: narration, vocals, and guitar.
 In this example, the position of the narration audio can be changed, and replacement sound sources are also provided. The replacement sound sources are distinguished by sub-ID: for example, sub-ID a is Japanese, b is English, and c is French. Japanese is the default, and it is replaced with another language upon the user's instruction. The user instruction may be given for each program, or a setting may be made to always replace it with a specific language. The replacement signals may be transmitted using a general-purpose user area.
 The default playback position of the narration audio may be set by the output level of each speaker of a standard speaker system, such as a 22.2ch system, or may be set as a direction as seen from the standard viewer position. FIGS. 20A, 20B, and 20C show examples of playback position data. First, a data type indicating the format in which the playback position is expressed is set. The playback position may be expressed by speaker output levels (FIG. 20A), by polar coordinates with the standard viewing position as the origin (FIG. 20B), by orthogonal coordinates with the standard viewing position as the origin (FIG. 20C), and so on. FIG. 20A shows the output levels of two speakers, but the output levels of three or more speakers may be set. The polar coordinate representation of FIG. 20B is a combination of the azimuth angle, the elevation/depression angle, and the distance from the origin: the azimuth angle takes the direction from the origin toward the center of the screen as 0°, with leftward rotation as positive angles, and the elevation/depression angle means the vertical angle as seen from the origin, where positive angles are elevation and negative angles depression. In the orthogonal coordinate representation of FIG. 20C, the rightward direction parallel to the screen is the positive X direction, the direction from the origin perpendicular to and toward the screen is the positive Y direction, and the vertically upward direction is the positive Z direction. Next, the number of preset data items is set, and the preset data are set. Preset data 1 is the default playback position, and it is changed to other preset data by user instruction. Alternatively, a configuration may be adopted in which a playback position defined by the user can be used in response to a user instruction.
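The polar form of FIG. 20B and the orthogonal form of FIG. 20C describe the same position, so a receiver can convert between them. A sketch following the axis conventions just described (the formula itself is an implementation assumption, not given in the document):

```python
import math

def polar_to_xyz(azimuth_deg, elevation_deg, distance):
    """Convert the FIG. 20B polar form to the FIG. 20C axes: X positive to
    the right (parallel to the screen), Y positive from the origin toward
    the screen, Z positive vertically upward. Azimuth 0 faces the screen
    center and positive azimuth turns left, so leftward positions map to
    negative X."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = -distance * math.cos(el) * math.sin(az)
    y = distance * math.cos(el) * math.cos(az)
    z = distance * math.sin(el)
    return x, y, z
```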
 As an example of setting the playback position of an object-based signal, FIG. 21 shows the user selecting the playback position of the narration audio. On the setting screen, for example, the preset positions are indicated by buttons and selected with the arrow keys of the remote control; in the example of FIG. 21, the left position is selected. The playback position of the narration audio may also be selectable in the near/far direction. A narration sound corresponding to the button selection state may be played. Furthermore, the volume of the narration may be settable independently, for example with the volume buttons of the remote control on a narration volume setting screen.
 FIG. 22 shows an example of the user setting the narration position, in this case not by selecting from preset positions but by setting the playback position freely within a predetermined range. The black dot in the figure represents the set position, which the user adjusts with the arrow buttons of the remote control. The narration volume setting is the same as in FIG. 21. The range the user can set may be defined for each type of sound source, and the display of the setting scale may be limited to that settable range. If there is a possibility of excessive volume depending on the playback position, the volume setting range may also be restricted. Being able to change the playback position and volume of object-based signals in this way realizes audio reproduction better suited to the user.
 Next, sound sources such as vocals and guitar, which are preferably reproduced at playback positions matching the image, will be explained. Changing the playback positions of these sound sources by the user is prohibited, and the flag is set accordingly. The playback positions of these sound sources are transmitted as stream data recording the sound position at each playback time. When the playback position is given as stream data in this way, the playback position information of FIG. 19 becomes information indicating the stream data (FIG. 23). The data type in FIG. 23 represents the expression format of the playback position, and "4", "5", and "6" indicate that the playback position is given as stream data: "4" indicates that speaker output levels are used to specify the playback position, "5" polar coordinate representation, and "6" orthogonal coordinate representation. FIG. 24A is an example of stream data specifying the playback position by speaker output levels, FIG. 24B by polar coordinates, and FIG. 24C by orthogonal coordinates. The playback times in the stream data may be set arbitrarily or synchronized with the frames of the program image. The volume of each sound source may also be adjusted separately from the overall volume adjustment. By specifying the playback position and volume for each sound source in this way, audio reproduction with excellent sound image localization and matching the user's preferences becomes possible.
 [HOA method signal processing]
 Next, the processing of HOA signals will be explained. An HOA signal is a signal in which the sound field is expanded in spherical harmonics from order 0 up to a certain order. The order of a spherical harmonic is a non-negative integer, and there are 2n+1 spherical harmonics of order n; therefore, the number of signals of an HOA representation expanded in spherical harmonics up to order n is (n+1)^2. FIG. 25 summarizes the highest order of the spherical harmonics used in the expansion and the corresponding number of HOA signals; here, the highest order used in the expansion is called the order of the HOA signal. As the figure shows, the number of HOA signals grows rapidly as the order increases. Since the number of channel-based signals corresponding to a 22.2ch speaker system is 24, a fourth-order HOA signal consisting of 25 signals carries roughly a comparable amount of information.
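The signal count (n+1)^2 follows from summing the 2n+1 harmonics of each order from 0 through n. A small sketch reproducing the counts summarized in FIG. 25:

```python
def hoa_signal_count(order):
    """Number of expansion signals when spherical harmonics up to `order`
    are used: the sum of (2n+1) for n = 0..order, which equals (order+1)**2."""
    assert order >= 0
    return (order + 1) ** 2

# A 4th-order HOA stream has 25 signals, roughly matching the 24 signals
# of a 22.2ch channel-based configuration.
counts = {n: hoa_signal_count(n) for n in range(5)}  # {0: 1, 1: 4, 2: 9, 3: 16, 4: 25}
```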
 An HOA signal is first separated by the core decoder 10101 into signals in the spherical-harmonic expanded form, and then converted into the signals output to the speakers by the HOA-dedicated decoder 10104 based on the speaker arrangement information. This conversion may generate the output signals for the speaker system to be used directly, or may first convert into 22.2ch signals and then convert in accordance with the conversion of the channel-based signals. This allows part of the processing to be shared.
 When external speakers are used, the weighting coefficients for calculating the output signals to the speakers may be given from outside. These weighting coefficients may be set by the user, acquired through communication with the speaker system, or acquired from a server.
 The difference between HOA signals and channel-based signals lies in the density of information in the viewing space. The information density of an HOA signal is isotropic and uniform, whereas that of a channel-based signal is high in the frontal direction, where many speakers are arranged. In a normal viewing environment there is no problem with audio reproduction from channel-based signals, but with a speaker system that differs greatly from the standard speaker arrangement, or with headphones whose direction in space can change greatly, more suitable reproduced audio can be enjoyed using HOA signals. In particular, for programs viewed on a head-mounted display, it is desirable for the amount of audio information to be isotropic in order to view video in a 360° space, so the use of HOA signals is desirable.
 It is desirable that either the HOA signal or the channel-based signal can be selected depending on the viewing system and the user's preferences. FIG. 26 shows a selection screen for the audio signal for each output device; in this example, the channel-based signal is expressed as "front-oriented" and the HOA signal as "omnidirectional". The example of FIG. 26 shows that the channel-based signal is selected for the audio output from the speakers, and the HOA signal for the audio output from the headphones. This setting may be made for each program, or may be a common setting regardless of the program. As a default setting, the channel-based signal may be used for the speakers and the HOA signal for the headphones. When the program does not provide the signal set for an output device, another signal may be used instead.
 With an HOA-format signal, the higher the order, the more precisely the sound field can be reproduced; however, because broadcast transmission capacity is limited, a method may be used in which, in the spherical-harmonic expansion, the low-order signals are carried by broadcast while the high-order signals are delivered via the Internet. For example, orders 0 through 4 may be transmitted by broadcast and orders 5 and above via the Internet. Even if the Internet-delivered signals are interrupted for some reason, the audio does not disappear entirely, so this is not a serious failure. The Internet-delivered signals may also be downloaded to the receiver in a batch before the broadcast starts; doing so eliminates the effect of any Internet outage during viewing.
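 The channel budget of such a split can be sketched as follows. The sketch assumes the standard spherical-harmonic count, in which order n contributes 2n+1 coefficient channels and orders 0 through N total (N+1)² channels; this matches the HOA25 (order-4) signal defined later in this description, but the specific maximum orders are illustrative.

```python
# Broadcast/Internet split of HOA coefficient channels.
# Order n contributes 2*n + 1 spherical-harmonic coefficients,
# so orders 0..N carry (N + 1)**2 channels in total.

def hoa_channels(order):
    """Number of coefficient channels contributed by a single HOA order."""
    return 2 * order + 1

def split_channels(max_order, broadcast_max_order=4):
    """Channels carried by broadcast (orders 0..broadcast_max_order)
    versus the Internet (remaining orders up to max_order)."""
    broadcast = sum(hoa_channels(n) for n in range(broadcast_max_order + 1))
    internet = sum(hoa_channels(n)
                   for n in range(broadcast_max_order + 1, max_order + 1))
    return broadcast, internet

# Orders 0-4 by broadcast, orders 5-6 via the Internet:
print(split_channels(6))  # (25, 24)
```

The figures show why the split is attractive: almost half of an order-6 signal's channels can be moved off the broadcast channel while the broadcast portion alone remains a complete order-4 sound field.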
 Alternatively, the HOA-format signal may not be transmitted by broadcast at all and may instead be delivered only via the Internet. If the HOA-format signal is interrupted by an Internet failure, the channel-based signal can be used in its place, so the audio does not disappear entirely. In this case as well, the Internet-delivered signals may be downloaded to the receiver in a batch before the broadcast starts, eliminating the effect of any Internet outage during viewing.
 The scheme of transmitting part of the audio signal via the Internet may also be applied to channel-based and object-based signals. For channel-based signals, for example, the 24 signals corresponding to 22.2 ch may be transmitted by broadcast and additional channels delivered over the Internet. This makes the information density more isotropic and can also serve as a substitute for an HOA-format signal. For object-based signals, some of the per-source signals may likewise be delivered over the Internet. The various transmission patterns described above make it possible to accommodate a variety of viewing environments and user preferences and to achieve suitable audio reproduction.
 [Accompanying output to linked devices]
 The audio output described so far has assumed output to the primarily used system, whether an internal or an external speaker system. When several people use the receiver together, however, an individual user may wish to listen to a different sub-audio alone. To meet such a wish, audio may also be reproduced on another output device in addition to the main system. For example, the user may listen to the sub-audio on a linked device such as his or her own smartphone. The output device in that case may be chosen as appropriate: a speaker, ordinary earphones, open-air earphones, and so on.
 FIG. 27 shows an example in which the audio to be reproduced on a smartphone 10300, a linked device, is selected. "On" denotes reproduction and "off" denotes non-reproduction, switched according to the user's preference. Here, the overall audio is the audio of the channel-based signal, and the individual audio is the audio of the object-based signal. In a sports program, for example, it is possible to choose between neutral play-by-play commentary and commentary from the standpoint of either team's fans, each transmitted as an object-based signal. The overall audio, such as crowd sound, may be reproduced on the receiver or on the individual linked device according to the user's preference. The selectable audio settings are not limited to this example and may be chosen as appropriate. Being able to freely select audio reproduction on a playback device near the user in this way enables audio reproduction better matched to the user's preference.
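 A FIG. 27 style selection can be represented as per-device on/off flags over the available audio components, as in the sketch below. The component and device names are illustrative assumptions; only the on/off structure comes from the description.

```python
# Per-device on/off selection of audio components, as in FIG. 27.
# True = "on" (reproduce), False = "off" (do not reproduce).

selection = {
    "receiver":   {"overall": True,  "commentary_neutral": True},
    "smartphone": {"overall": False, "commentary_team_a": True},
}

def components_for(device):
    """Return the components switched 'on' for the given device."""
    return [name for name, on in selection.get(device, {}).items() if on]

print(components_for("smartphone"))  # ['commentary_team_a']
```

The receiver's mixer would then route only the listed components to each device's output.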
 [Display relating to audio signals]
 Display examples relating to the transmitted audio signals are described next. First, FIG. 28A shows an example of parameters concerning the number of transmitted signals and where they are obtained. For each of the channel-based, object-based, and HOA-format signals there are parameters describing the number of signals and, for Internet delivery, the acquisition address (URL). Combining Internet delivery in this way makes it possible to supplement broadcast transmission, whose capacity is limited.
 FIG. 29A shows an example in which the transmitted audio signals are displayed for each program in the electronic program guide, giving the number of signals for each signal type; signals shown in parentheses are those obtained via the Internet. If information cannot be obtained from the Internet, the display should make that apparent. FIG. 29B shows an example in which, while information cannot be obtained from the Internet, the audio signals delivered via the Internet are displayed with a strikethrough superimposed. Alternatively, when the information cannot be obtained, the Internet-delivered audio signals may simply not be displayed. The electronic program guide may further carry explanatory information about the sound sources the user can select for reproduction. Displaying the audio-signal configuration of each program in this way lets the user see how rich a program's audio is and serves as auxiliary information for program selection. It may also be possible, before a program starts, to reserve the audio-reproduction settings to be used when viewing it. Furthermore, frequently used settings may be recorded in the receiver and recalled, so that the audio-reproduction settings for viewing or for a reservation can be made easily. This improves usability for the user.
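 The labelling of FIGS. 29A and 29B can be sketched as a small formatting routine: broadcast signal counts shown plainly, Internet-delivered counts in parentheses, and the parenthesized part struck through when the Internet is unreachable. Rendering the strikethrough with Unicode combining characters is purely an illustrative choice; an actual EPG would style the text through its UI toolkit.

```python
# EPG-style label for one audio signal type, as in FIG. 29A/29B.

def strike(text):
    """Apply a strikethrough using the Unicode combining long stroke."""
    return "".join(ch + "\u0336" for ch in text)

def epg_label(kind, broadcast_count, internet_count, internet_ok=True):
    label = f"{kind}: {broadcast_count}"
    if internet_count:
        extra = f"(+{internet_count})"           # Internet-delivered portion
        label += extra if internet_ok else strike(extra)
    return label

print(epg_label("HOA", 25, 24))                  # HOA: 25(+24)
print(epg_label("HOA", 25, 24, internet_ok=False))
```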
 The selected speaker system, the type of signal source being reproduced, and the number of signals may be shown on the screen (FIG. 30). The display may be shown at all times, or only for a predetermined time at power-on, at a channel change, or at a change of state. The user may also be allowed to suppress this display. Showing the current state in this way lets the user grasp the state appropriately.
 [Viewing with headphones]
 Some special headphones apply processing to the audio signal that takes into account the sound-transmission characteristics of the user's head. This processing may be performed by an external device, or by the receiver if the mixer and distributor 10105 shown in FIG. 15B is given the processing function. The programs, parameters, and the like needed for the processing may be obtained from the headphone manufacturer's server. This makes more precise audio reproduction possible.
 [Processing during an emergency broadcast]
 During an emergency broadcast, so that the user reliably recognizes its content, all user settings relating to audio reproduction are invalidated at the moment the emergency broadcast starts and are replaced with settings for viewing the emergency broadcast. After the emergency broadcast ends, the user settings are restored to those in effect before it started. This allows the content of the emergency broadcast to be conveyed reliably while keeping changes to the user's preferences to a minimum.
 [Exception handling]
 Several kinds of exception handling are described. First, at receiver power-on or at a channel change, the speaker system used as standard may be used, or the speaker system in use before power-off or before the channel change may be used. Furthermore, if a speaker system has been set for the program received at power-on or at the channel change, that speaker system is used. The speaker system used as standard may be one specific to the receiver or one set by the user.
 When an external speaker system is to be used but is detected to be powered off, that fact is shown on the screen together with the speaker system switched to, such as the internal speakers. Defining exception handling in this way keeps the user's viewing experience in good condition.
 [Copyright protection processing]
 Next, copyright protection processing (content protection processing) in the broadcast system according to this embodiment is described.
 [Copyright protection function]
 In the digital broadcast system of this embodiment, control information indicating copy control of content, such as first control information (digital_recording_control_data) and second control information (copy_restriction_mode), is stored in control information such as the program arrangement information of the MPEG-2 TS (for example, the PMT) or the MPT of MMT, and is transmitted from the broadcast station side to the broadcast receiving device 100. The first control information (digital_recording_control_data) is, for example, 2-bit data and may be configured so that 00 indicates "copying permitted without constraint", 10 indicates "copying permitted for one generation only", and 11 indicates "copying prohibited". When the first control information is 00, "copying permitted without constraint", it may further be combined with another 1-bit item of control information to distinguish two states: "copying permitted without constraint, with encryption required at storage and output" and "copying permitted without constraint, with no encryption required at storage and output". The second control information (copy_restriction_mode) is, for example, 1-bit data and may be configured so that 0 indicates "copying permitted for one generation only" and 1 indicates "copying permitted a limited number of times". "Copying permitted a limited number of times" is a copy-control state that permits a predetermined number of copies; for example, nine copies plus one move corresponds to the so-called "dubbing 10".
 The second control information (copy_restriction_mode) functions only when the first control information (digital_recording_control_data) is 10, "copying permitted for one generation only"; that is, it is control information for identifying, in that case, whether copy control of the content is "copying permitted a limited number of times" or "copying permitted for one generation only". Accordingly, when the first control information is 00, "copying permitted without constraint", or 11, "copying prohibited", the broadcast receiving device 100 does not use the second control information for identifying copy control, whichever of 0 and 1 it indicates; in those cases the second control information is treated as a don't-care state in the broadcast receiving device 100. The broadcast receiving device 100 of this embodiment may be configured to control, in accordance with the first and second control information, accumulation of the content in the storage (accumulation) unit 110, recording to removable recording media, output to external devices, copying to external devices, move processing to external devices, and so on.
 The target of accumulation processing is not limited to the storage (accumulation) unit 110 inside the broadcast receiving device 100 and may include recordings to which protection processing such as encryption has been applied so that they can be reproduced only by the broadcast receiving device 100. Specifically, it includes, among external recording devices and the like, those placed in a state in which recording and reproduction are possible only through the broadcast receiving device 100.
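 The decision logic above can be sketched as follows. The bit values and the "don't-care" handling come from the description; the returned state labels, and the treatment of the unmentioned value 01 as reserved, are illustrative assumptions.

```python
# Interpretation of the two copy-control fields described above.
# digital_recording_control_data: 2-bit first control information.
# copy_restriction_mode: 1-bit second control information.

def copy_control(digital_recording_control_data, copy_restriction_mode):
    if digital_recording_control_data == 0b00:
        return "copy_free"             # second field is don't-care
    if digital_recording_control_data == 0b11:
        return "copy_never"            # second field is don't-care
    if digital_recording_control_data == 0b10:
        # Only here does copy_restriction_mode take effect:
        # 1 = limited-number copying (e.g. "dubbing 10"), 0 = one generation.
        if copy_restriction_mode == 1:
            return "dubbing10"
        return "copy_one_generation"
    # 0b01 is not defined in this description; treated as reserved here.
    raise ValueError("reserved digital_recording_control_data value")

print(copy_control(0b10, 1))  # dubbing10
print(copy_control(0b00, 1))  # copy_free (second field ignored)
```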
 Specific examples of processing of video content based on control information such as the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) are described below.
 First, for content whose first control information (digital_recording_control_data) in the PMT of the MPEG-2 TS or the MPT of MMT indicates "copying permitted without constraint", the broadcast receiving device 100 of this embodiment may perform, without restriction, accumulation in the storage (accumulation) unit 110, recording to removable recording media, output to external devices, copying to external devices, and move processing to external devices. However, where "copying permitted without constraint, with encryption required at storage and output" and "copying permitted without constraint, with no encryption required at storage and output" are distinguished, then in the former case accumulation in the storage (accumulation) unit 110, recording to removable recording media, output to external devices, copying to external devices, and move processing to external devices may each be performed an unlimited number of times, but encryption processing must be applied in every case.
 For content that the combination of the first control information (digital_recording_control_data) and second control information (copy_restriction_mode) in the PMT of the MPEG-2 TS or the MPT of MMT indicates as "copying permitted for one generation only", the broadcast receiving device 100 of this embodiment permits encrypted accumulation in the storage (accumulation) unit 110, but when outputting the accumulated content to an external device for viewing, it encrypts the content and outputs it together with copy-control information indicating "re-copying prohibited". However, so-called move processing to an external device (copying the content to the external device and rendering the content in the storage (accumulation) unit 110 of the broadcast receiving device 100 unplayable, for example by erasure) is permitted.
 For content that the combination of the first control information (digital_recording_control_data) and second control information (copy_restriction_mode) in the PMT of the MPEG-2 TS or the MPT of MMT indicates as "copying permitted a limited number of times", the broadcast receiving device 100 of this embodiment permits encrypted accumulation in the storage (accumulation) unit 110, but when outputting the accumulated content to an external device for viewing, it encrypts the content and outputs it together with copy-control information indicating "re-copying prohibited". However, a predetermined number of copies and moves to external devices may be permitted; under the so-called "dubbing 10" rule, nine copies and one move to external devices may be performed.
 For content whose first control information (digital_recording_control_data) in the PMT of the MPEG-2 TS or the MPT of MMT indicates "copying prohibited", the broadcast receiving device 100 of this embodiment prohibits copying to the storage (accumulation) unit 110. However, when the broadcast receiving device 100 is configured with a "temporary accumulation" mode that allows retention in the storage (accumulation) unit 110 only for a predetermined time, or for a time specified by control information contained in the broadcast signal, temporary retention of the content in the storage (accumulation) unit 110 is permitted even when the first control information indicates "copying prohibited". When content whose first control information indicates "copying prohibited" is output to an external device for viewing, it is encrypted and output together with copy-control information indicating "copying prohibited".
 The above-described output to an external device for viewing may be performed via the video output unit 193 of FIG. 2A, or via the digital I/F unit 125, the LAN communication unit 121, or the like. The above-described copy or move processing to an external device may be performed via the digital I/F unit 125, the LAN communication unit 121, or the like, of FIG. 2A.
 According to the processing described above, appropriate content protection can be achieved in accordance with the copy-control information associated with the video content.
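 The per-state handling of the preceding paragraphs can be summarized as data, one entry per copy-control state. All field names are illustrative; `copies`/`moves` of `None` stand for "unlimited", and the `copy_never` entry reflects that only temporary retention is possible when a "temporary accumulation" mode exists.

```python
# Data-oriented summary of the copy-control handling described above.

OUTPUT_POLICY = {
    # copy free, no encryption required
    "copy_free":           {"accumulate": True,  "encrypt": False, "copies": None, "moves": None},
    # copy free, but encryption required at storage and output
    "copy_free_encrypted": {"accumulate": True,  "encrypt": True,  "copies": None, "moves": None},
    # encrypted accumulation; viewing output marked "re-copying prohibited"
    "copy_one_generation": {"accumulate": True,  "encrypt": True,  "copies": 0,    "moves": 1},
    "dubbing10":           {"accumulate": True,  "encrypt": True,  "copies": 9,    "moves": 1},
    # no accumulation (only temporary retention, if such a mode exists)
    "copy_never":          {"accumulate": False, "encrypt": True,  "copies": 0,    "moves": 0},
}

print(OUTPUT_POLICY["dubbing10"])
```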
 Next, specific examples of processing of audio content based on the copy-control information are described below. As already explained, the digital broadcast system of this embodiment can transmit and receive audio content containing advanced audio signals. It is therefore desirable that the broadcast receiving device 100 of this embodiment be equipped with a function to perform copyright protection processing (content protection processing) better suited to audio content containing such advanced audio signals.
 First, FIG. 28B shows the data structure of the audio component descriptor (audio_component_descriptor) contained in the PMT of the MPEG-2 TS or the MPT of MMT in the digital broadcast system of this embodiment. The audio component descriptor of FIG. 28B stores data describing various information about the audio components of content transmitted by the digital broadcast system of this embodiment. Of these, the component_type (component type) data shown in FIG. 28B can store information on the type of audio component contained in the target content. Although "audio" does not appear in the notation of component_type shown in FIG. 28B, this component_type is stored in the audio component descriptor, so it is data storing information that indicates the audio component type.
 Next, the values of the audio component type data that can be transmitted in the digital broadcast system of this embodiment are described with reference to FIG. 28C.
 FIG. 28C lists the types of audio signal that can be indicated by some of the bits of the audio component type data. For example, for 00000 through 10001, definitions similar to those used in conventional digital broadcasting are used. These include definitions of comparatively common audio signals such as single mono (single monaural), stereo, 5.1 ch, 7.1 ch, and 22.2 ch. LFE (Low Frequency Effect) in FIG. 28C denotes a bass-enhancement channel.
 Furthermore, in the audio component type data according to this embodiment, definitions 10010 through 11010 are newly added as audio-signal types that can be indicated by some of the bits of the audio component type data. Specifically, as shown in FIG. 28C, the data value 10010 defines 7.1.4 ch, i.e. 7.1 ch plus four channels placed above; 10011 defines 7.1.4ch+4obj, i.e. 7.1.4 ch including four object signals; 10100 defines 7.1.4ch+6obj, i.e. 7.1.4 ch including six object signals; 10101 defines 22.2ch+4obj, i.e. 22.2 ch including four object signals; 10110 defines HOA1, an HOA-format signal with one signal; 10111 defines HOA4, an HOA-format signal with four signals; 11000 defines HOA9, an HOA-format signal with nine signals; 11001 defines HOA16, an HOA-format signal with 16 signals; and 11010 defines HOA25, an HOA-format signal with 25 signals.
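 The newly added values above can be written out as a lookup table. The table entries come directly from the description; the observation that the HOA entries follow (order + 1)² signals per maximum order (HOA1 = order 0 through HOA25 = order 4) is inferred from the counts, not stated in FIG. 28C itself.

```python
# Newly added component_type values (FIG. 28C) as a lookup table.

NEW_COMPONENT_TYPES = {
    0b10010: "7.1.4ch",
    0b10011: "7.1.4ch+4obj",
    0b10100: "7.1.4ch+6obj",
    0b10101: "22.2ch+4obj",
    0b10110: "HOA1",
    0b10111: "HOA4",
    0b11000: "HOA9",
    0b11001: "HOA16",
    0b11010: "HOA25",
}

# The HOA signal counts match (order + 1)**2 for orders 0..4:
for order, value in enumerate(range(0b10110, 0b11010 + 1)):
    assert NEW_COMPONENT_TYPES[value] == f"HOA{(order + 1) ** 2}"
```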
 As already explained in this embodiment, the digital broadcast system of this embodiment can transmit and receive audio content containing advanced audio signals. Among the audio-signal types shown in FIG. 28C, the audio signals from the single monaural signal with audio component type data 00001 through the 7.1.4 ch signal with data 10010 are channel-based signals. In contrast, the audio signals from the 7.1.4ch+4obj signal with data 10011 through the 22.2ch+4obj signal with data 10101 are object-based signals, which are included among the advanced audio signals of this embodiment. The audio signals from the HOA1 signal with data 10110 through the HOA25 signal with data 11010 are HOA-format signals, which are likewise included among the advanced audio signals of this embodiment.
 Therefore, in the digital broadcast system of this embodiment, by storing the audio component type data in the audio component descriptor contained in the PMT of the MPEG-2 TS or the MPT of MMT when transmitting content, and transmitting it from the broadcast station side to the broadcast receiving device 100, the broadcast receiving device 100 can identify whether or not the audio signal of the transmitted content is an advanced audio signal.
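 The identification just described reduces to a range check over the component_type value, as sketched below. The range boundaries follow the preceding paragraph; treating values outside those ranges as "undefined" matches the undefined-value handling mentioned for FIG. 31A, and the labels themselves are illustrative.

```python
# Classify a component_type value into the signal families of FIG. 28C,
# and flag the "advanced" families (object-based and HOA).

def classify(component_type):
    if 0b00001 <= component_type <= 0b10010:
        return "channel_based"
    if 0b10011 <= component_type <= 0b10101:
        return "object_based"
    if 0b10110 <= component_type <= 0b11010:
        return "hoa"
    return "undefined"

def is_advanced(component_type):
    """Advanced audio signals are the object-based and HOA families."""
    return classify(component_type) in ("object_based", "hoa")

print(classify(0b10010), is_advanced(0b10010))  # channel_based False
print(classify(0b11010), is_advanced(0b11010))  # hoa True
```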
 Control examples of copyright protection processing (content protection processing) for audio content in the broadcast receiving device 100 are now described with reference to FIGS. 31A to 32B.
 The first control information (digital_recording_control_data) and second control information (copy_restriction_mode) shown in FIGS. 31A to 32B are control information indicating copy control of the content, and their definitions for the content in question are as already explained.
 The decoded audio output (analog) shown in FIGS. 31A to 32B is output processing of the audio signal in a decoded, analog-signal state. For example, the audio signal decoded by the core decoder of FIG. 15B is converted into an analog signal by a D/A converter (digital-to-analog converter), not shown, and is output in analog form to an external device or the like via an analog video/audio interface provided in the expansion interface unit 124 of FIG. 2A.
Furthermore, the decoded audio output (digital) shown in FIGS. 31A to 32B is output processing of an audio signal in a decoded digital signal state. For example, the audio signal decoded by the core decoder in FIG. 15B is output in a digital signal state to an external device or the like via the audio output unit 196 in FIG. 2A. The decoded audio signal may also be output in a digital signal state to an external device or the like via the digital I/F section 125, the LAN communication section 121, or the like.
Furthermore, the stream output (IP interface) shown in FIGS. 31A to 32B is output processing of an audio signal in a stream-format digital signal state. For example, the audio signal is output to an external device in a stream-format digital signal state via the bitstream output controller 10106, without being decoded by the core decoder in FIG. 15B. In this case, the output from the broadcast receiving device 100 to the external device may be performed, for example, as an IP interface output via the LAN communication unit 121 in FIG. 2A.
Here, FIG. 31A is a table showing a control example of copyright protection processing in the case where the broadcast receiving device 100 outputs the audio component of content as-is without storing it. The definitions of the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) for the content are as already explained. The third control information (audio component_type) is the audio component type information described with reference to FIG. 28C; based on this information, the broadcast receiving device 100 identifies whether the audio signal of the audio component included in the content is a channel-based signal, an object-based signal, or an HOA-format signal. The broadcast receiving device 100 also identifies the case where the audio component type information indicates a value corresponding to "undefined".
Here, in the control example of FIG. 31A, regardless of the states of the first, second, and third control information, the decoded audio output in the analog signal state is controlled to be output in a state in which copying is permitted without restriction. This is because an analog signal degrades more readily through copy processing than a digital signal, so copying does not need to be restricted as strictly as for a digital signal.
Here, in the control example of FIG. 31A, for decoded audio output in the digital signal state, when the third control information indicates an advanced audio signal, that is, when the third control information indicates an object-based signal or an HOA-format signal, the output is controlled to be in a copy-prohibited state regardless of the combination of the values of the first control information and the second control information. Content consisting of an advanced audio signal is more valuable than content consisting of a channel-based audio signal. Therefore, in the control example of FIG. 31A, when the combination of the values of the first and second control information indicates "copying permitted without constraint", "one-generation copy permitted", "copy permitted up to a limited number", and so on, the copy control for decoded audio output in the digital signal state is made stricter for content of object-based or HOA-format audio signals than for content of channel-based audio signals. This makes it possible to realize copy control corresponding to the value of the content according to the type of audio signal. For decoded audio output in the digital signal state, content for which the third control information indicates a channel-based audio signal and content for which it indicates "undefined" are controlled as shown in FIG. 31A: when the first control information indicates "copying permitted without constraint", the output is in a state in which copying is permitted without restriction; when the first control information indicates "one-generation copy permitted", the output is again in a one-generation-copy-permitted state regardless of the value of the second control information; and when the first control information indicates "copy prohibited", the output is in a copy-prohibited state.
Note that the control example for decoded audio output in the digital signal state shown in FIG. 31A applies when the output unit for decoded audio output supports a copy control scheme such as SCMS (Serial Copy Management System). When the output unit for decoded audio output does not support copy control, output without copy restriction is permitted for any combination of the values of the first, second, and third control information.
The setting of the copy control state in the control example for audio signal output in the stream-format digital signal state via the IP interface, shown in FIG. 31A, is the same as in the control example for decoded audio output in the digital signal state, so repeated description is omitted. In this control example as well, when the combination of the values of the first and second control information indicates "copying permitted without constraint", "one-generation copy permitted", "copy permitted up to a limited number", and so on, the copy control for content of object-based or HOA-format audio signals is made stricter than for content of channel-based audio signals. This makes it possible to realize copy control corresponding to the value of the content according to the type of audio signal. Note that for audio signal output in the stream-format digital signal state via the IP interface shown in FIG. 31A, DTCP or DTCP2 is used as the copy control scheme. For content for which the values of the first and second control information indicate a copy restriction such as "one-generation copy permitted", "copy permitted up to a limited number", or "copy prohibited", output to an external device that supports neither DTCP nor DTCP2 is prohibited. This point differs from the control example for decoded audio output in the digital signal state.
According to the control example of FIG. 31A of this embodiment described above, it is possible to realize more suitable copy control corresponding to the value of the content according to the type of audio signal.
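The FIG. 31A rules for non-stored output described above can be sketched as follows. The value names are paraphrased from the description; the actual encoded values are carried in the PMT/MPT descriptors and are not reproduced here.

```python
def fig31a_decoded_digital(first: str, second: str, audio_type: str) -> str:
    """Decoded audio output in the digital signal state (FIG. 31A).
    first/second: paraphrased first/second control information.
    audio_type: 'channel', 'object', 'hoa', or 'undefined'."""
    if audio_type in ("object", "hoa"):
        # Advanced audio: copy-prohibited regardless of first/second.
        return "copy_prohibited"
    if first == "copy_free":
        return "copy_free"
    if first == "one_generation":
        # The second control information does not change this case.
        return "one_generation"
    return "copy_prohibited"  # first == 'copy prohibited'

def fig31a_analog(first: str, second: str, audio_type: str) -> str:
    """Decoded audio output in the analog signal state: always copy-free."""
    return "copy_free"
```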
Next, the control example of FIG. 31B will be described. The control example of FIG. 31B is a control example of copyright protection processing in the case where the broadcast receiving device 100 outputs the audio component of content as-is without storing it, and is a modification of the control example of FIG. 31A. To avoid repetition, description of points that are the same as in the control example of FIG. 31A is omitted, and only points that differ are described.
In the control example of FIG. 31B, the control of decoded audio output in the analog signal state and the control of audio signal output in the stream-format digital signal state via the IP interface are the same as in the control example of FIG. 31A. In contrast, in the control example of FIG. 31B, the control of decoded audio output in the digital signal state differs from the control example of FIG. 31A.
Specifically, in the control of decoded audio output in the digital signal state in the control example of FIG. 31B, when the combination of the values of the first and second control information indicates "copying permitted without constraint", the decoded audio output in the digital signal state can be output in a state in which copying is permitted without restriction, regardless of the value of the third control information. When the combination of the values of the first and second control information indicates either "one-generation copy permitted" or "copy permitted up to a limited number", the decoded audio output in the digital signal state is output in a one-generation-copy-permitted state, regardless of the value of the third control information. When the combination of the values of the first and second control information indicates "copy prohibited", the decoded audio output in the digital signal state is output in a copy-prohibited state, regardless of the value of the third control information. That is, in the control of decoded audio output in the digital signal state in the control example of FIG. 31B, no difference in copy restriction at audio output is made between the copy control for content of channel-based audio signals and the copy control for content of advanced audio signals, that is, object-based or HOA-format audio signals. This is because, even for decoded audio output in the digital signal state of content of an object-based or HOA-format audio signal, if the output has passed through, for example, the processing of the mixer and distributor 10105 in FIG. 15B, it is considered difficult for the external device at the output destination to reproduce the state of the audio signal content as received by the broadcast receiving device 100. Based on this idea, even for decoded audio output in the digital signal state of advanced audio signal content, copy control equivalent to that for decoded audio output in the digital signal state of channel-based audio signal content is considered sufficient. In contrast, audio signal output in the stream-format digital signal state via the IP interface has not passed through the processing of the mixer and distributor 10105, so the external device at the output destination can obtain the advanced audio signal content in the state in which it was received by the broadcast receiving device 100. Therefore, as shown in the control example of FIG. 31B, for audio signal output in the stream-format digital signal state via the IP interface, it is preferable to make the copy control for content of object-based or HOA-format audio signals stricter than for content of channel-based audio signals.
The control example of FIG. 31B of this embodiment described above also makes it possible to realize more suitable copy control corresponding to the value of the content according to the type of audio signal.
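The FIG. 31B variant described above can be sketched as follows: for decoded audio output in the digital signal state the audio type no longer matters (the output has passed through the mixer/distributor), while stream output over the IP interface keeps the stricter rule for advanced audio. Value names are paraphrased assumptions, not the encoded descriptor values.

```python
def fig31b_decoded_digital(first: str, second: str) -> str:
    """Decoded audio output in the digital signal state (FIG. 31B):
    depends only on the first/second control information."""
    if first == "copy_free":
        return "copy_free"
    if first == "one_generation":
        # Covers both 'one-generation' and 'limited-number' second values.
        return "one_generation"
    return "copy_prohibited"  # first == 'copy prohibited'

def fig31b_stream_output(first: str, second: str, audio_type: str) -> str:
    """Stream output (IP interface) in FIG. 31B: unchanged from FIG. 31A,
    i.e. advanced audio is always copy-prohibited."""
    if audio_type in ("object", "hoa"):
        return "copy_prohibited"
    return fig31b_decoded_digital(first, second)
```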
Next, FIG. 32A is a table showing a control example of copyright protection processing in the case where the broadcast receiving device 100 stores the audio component of content and outputs it afterward. The storage processing is performed by storing the content including the audio component in the storage unit 110 of the broadcast receiving device 100.
Here, in the control example of FIG. 32A, the copy control state of the content in the stored state is controlled to differ according to the combination of the values of the first and second control information. Specifically, when the first control information indicates "copying permitted without constraint", the content is stored in a state in which copying is permitted without restriction, regardless of the value of the second control information; in this case, the content may be stored in that state regardless of the value of the third control information. When the first control information indicates "one-generation copy permitted" and the second control information indicates "one-generation copy permitted", the content is stored in a re-copy-prohibited state; in this case, the content may be stored in that state regardless of the value of the third control information. When the first control information indicates "one-generation copy permitted" and the second control information indicates "copy permitted up to a limited number", the content is stored in a limited-number-copy-permitted state; in this case, the content may be stored in that state regardless of the value of the third control information. When the first control information indicates "copy prohibited", the device may be configured so that the content can be stored in a temporary storage state, regardless of the value of the second control information; in this case, the content may be stored in that state regardless of the value of the third control information. Next, in the control example of FIG. 32A, although the copy control state of the content in the stored state differs according to the combination of the values of the first and second control information, decoded audio output in the analog signal state of the stored content is controlled to be output in a state in which copying is permitted without restriction, regardless of the copy control state in the stored state and regardless of the state of the third control information. This is because an analog signal degrades more readily through copy processing than a digital signal, so copying does not need to be restricted as strictly as for a digital signal.
In the control example of FIG. 32A, for decoded audio output in the digital signal state of stored content, when the third control information indicates an advanced audio signal, that is, an object-based signal or an HOA-format signal, the output is controlled to be in a copy-prohibited state regardless of the copy control state in the stored state. Content consisting of an advanced audio signal is more valuable than content consisting of a channel-based audio signal. Therefore, in the control example of FIG. 32A, when the copy control state in the stored state indicates copying permitted without restriction, re-copy prohibited, limited-number copy permitted, and so on, the copy control for decoded audio output in the digital signal state of stored content is made stricter for content of object-based or HOA-format audio signals than for content of channel-based audio signals. This makes it possible to realize copy control corresponding to the value of the content according to the type of audio signal. For decoded audio output in the digital signal state of stored content, content for which the third control information indicates a channel-based audio signal and content for which it indicates "undefined" are controlled as shown in FIG. 32A: when the copy control state in the stored state is copying permitted without restriction, the output is in a state in which copying is permitted without restriction; when the copy control state in the stored state is re-copy prohibited or limited-number copy permitted, the output is in a one-generation-copy-permitted state; and when the copy control state in the stored state is temporary storage, the output is in a copy-prohibited state.
Note that the control example for decoded audio output in the digital signal state of stored content shown in FIG. 32A applies when the output unit for decoded audio output supports a copy control scheme such as SCMS (Serial Copy Management System). When the output unit for decoded audio output does not support copy control, output without copy restriction is permitted regardless of the copy control state in the stored state and regardless of the value of the third control information.
Next, in the control example for audio signal output in the stream-format digital signal state via the IP interface for stored content shown in FIG. 32A, when the copy control state in the stored state is a copy-restricted control state such as re-copy prohibited, limited-number copy permitted, or temporary storage, the output is performed in a re-copy-prohibited state regardless of the value of the third control information. When the copy control state in the stored state is copying permitted without restriction, the copy control state at output is switched according to the third control information. Specifically, content for which the third control information indicates a channel-based audio signal or "undefined" is permitted to be output in a state in which copying is permitted without restriction. In contrast, content for which the third control information indicates an advanced audio signal, that is, an object-based or HOA-format audio signal, is controlled to be output in a re-copy-prohibited state.
Note that for audio signal output in the stream-format digital signal state via the IP interface for stored content shown in FIG. 32A, DTCP or DTCP2 is used as the copy control scheme. For content whose copy control state in the stored state is a copy-restricted control state such as re-copy prohibited, limited-number copy permitted, or temporary storage, output to an external device that supports neither DTCP nor DTCP2 is prohibited.
According to the control example of FIG. 32A of this embodiment described above, also in the output of stored content, it is possible to realize more suitable copy control corresponding to the value of the content according to the type of audio signal.
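The FIG. 32A rules for stored content described above can be sketched as follows: the stored-state copy control is derived from the first/second control information, and the stream output over the IP interface then switches on the third control information only when the stored state is unrestricted. Value names are paraphrased assumptions.

```python
def fig32a_stored_state(first: str, second: str) -> str:
    """Copy control state in the stored state (FIG. 32A)."""
    if first == "copy_free":
        return "copy_free"
    if first == "one_generation":
        # 'one-generation' second value -> re-copy prohibited;
        # 'limited-number' second value -> limited-number copies.
        return "recopy_prohibited" if second == "one_generation" else "limited_copies"
    return "temporary"  # first == 'copy prohibited' -> temporary storage

def fig32a_stream_output(stored: str, audio_type: str) -> str:
    """Stream output (IP interface) of stored content (FIG. 32A)."""
    if stored in ("recopy_prohibited", "limited_copies", "temporary"):
        # Any copy-restricted stored state -> re-copy prohibited output.
        return "recopy_prohibited"
    # stored == "copy_free": switch on the third control information.
    if audio_type in ("channel", "undefined"):
        return "copy_free"
    return "recopy_prohibited"  # advanced audio (object-based / HOA)
```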
Next, the control example of FIG. 32B will be described. The control example of FIG. 32B is a control example of copyright protection processing in the case where the broadcast receiving device 100 stores the audio component of content and outputs it afterward, and is a modification of the control example of FIG. 32A. To avoid repetition, description of points that are the same as in the control example of FIG. 32A is omitted, and only points that differ are described.
In the control example of FIG. 32B, the control of decoded audio output in the analog signal state of stored content and the control of audio signal output in the stream-format digital signal state via the IP interface for stored content are the same as in the control example of FIG. 32A. In contrast, in the control example of FIG. 32B, the control of decoded audio output in the digital signal state of stored content differs from the control example of FIG. 32A. Specifically, in the control of decoded audio output in the digital signal state of stored content in the control example of FIG. 32B, when the copy control state in the stored state is copying permitted without restriction, the decoded audio output in the digital signal state of the stored content can be output in a state in which copying is permitted without restriction, regardless of the value of the third control information. When the copy control state in the stored state is re-copy prohibited or limited-number copy permitted, the decoded audio output in the digital signal state of the stored content is output in a one-generation-copy-permitted state, regardless of the value of the third control information. When the copy control state in the stored state is temporary storage, the decoded audio output in the digital signal state of the stored content is output in a copy-prohibited state, regardless of the value of the third control information. That is, in the control of decoded audio output in the digital signal state in the control example of FIG. 32B, no difference in copy restriction at audio output is made between the copy control for content of channel-based audio signals and the copy control for content of advanced audio signals, that is, object-based or HOA-format audio signals. The reason for this control is the same as the reason already described for the control of decoded audio output in the digital signal state in the control example of FIG. 31B, so repeated description is omitted.
The control example of FIG. 32B of this embodiment described above also makes it possible, in the output of stored content, to realize more suitable copy control corresponding to the value of the content according to the type of audio signal.
According to the processing described above, appropriate content protection can be realized in accordance with the copy control information associated with the audio content.
Note that for both video content and audio content, the following control may be performed in addition to the control described above when outputting via the IP interface. Specifically, for content for which the first control information (digital_recording_control_data) and the second control information (copy_restriction_mode) included in the PMT of the MPEG-2 TS or the MPT of MMT indicate a copy restriction such as "one-generation copy permitted", "copy permitted up to a limited number", or "copy prohibited", copy processing to an external device via the LAN communication unit 121 may be permitted only when the IP address of the external device that is the destination of the transmission packet from the broadcast receiving device 100 is within the same subnet as the IP address of the broadcast receiving device 100, and may be prohibited when the IP address of the external device is outside that subnet. Content whose copy control information indicates "copying permitted without constraint, with encryption processing required at storage and output" may be handled in the same way.
 Similarly, for the process of once storing content whose copy control information indicates a copy restriction such as "copy one generation only", "copy a limited number of times", or "copyable without restriction but requiring encryption at storage and output" in the storage unit 110 and then moving it to an external device via the LAN communication unit 121, the move may be permitted only when the IP address of the external device that is the destination of the transmission packet from the broadcast receiving device 100 is within the same subnet as the IP address of the broadcast receiving device 100, and may be prohibited when the IP address of the external device is outside that subnet.
 As a general rule, video and audio output for viewing of content stored in the storage unit 110 of the broadcast receiving device 100 is permitted only when the IP address of the external device that is the destination of the transmission packet from the broadcast receiving device 100 is within the same subnet as the IP address of the broadcast receiving device 100, and is prohibited when the IP address of the external device is outside that subnet. However, for a device that has been connected within the same subnet as the IP address of the broadcast receiving device 100 within a predetermined period, and that has been registered (paired) as a device permitted to view from outside that subnet, viewing video and audio output of the content stored in the storage unit 110 of the broadcast receiving device 100 may be enabled even when the IP address of the external device is outside the subnet. In this case, the content is encrypted for the viewing video and audio output.
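The subnet-based permission rule described above can be sketched with Python's standard `ipaddress` module. The function name, the `paired_at` argument, and the 30-day value for the "predetermined period" are hypothetical, illustrative choices, not part of this specification:

```python
import ipaddress
from datetime import datetime, timedelta

# Hypothetical value for the "predetermined period" during which pairing remains valid.
PAIRING_VALID_PERIOD = timedelta(days=30)

def may_output_to(receiver_ip, netmask, device_ip, paired_at=None):
    """Decide whether viewing output to an external device is permitted.

    Output is permitted when the destination device is on the same subnet as
    the broadcast receiving device, or when the device was registered
    (paired) within the predetermined period; off-subnet output additionally
    requires the content to be encrypted.
    """
    subnet = ipaddress.ip_network(f"{receiver_ip}/{netmask}", strict=False)
    if ipaddress.ip_address(device_ip) in subnet:
        return True
    return paired_at is not None and datetime.now() - paired_at <= PAIRING_VALID_PERIOD
```

A real implementation would derive the subnet from the receiver's actual network configuration and apply encryption to any permitted off-subnet output.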
 [Reverberation sound addition function]
 Next, the reverberation sound addition function of the broadcast receiving device 100 will be described. FIG. 33 is a diagram illustrating an example of a reverberation processing flow when headphones are used. In the example shown in FIG. 33, the broadcast receiving device 100 obtains a reverberation signal for each sound based on the sound field characteristics, generates reverberation-added audio signals in which the reverberation is added to the original sounds, mixes the reverberation-added audio signals of the individual sound sources, and transmits the result to the headphones for reproduction. Here, the sound field characteristics are the acoustic characteristics of the sound field in which the audio was recorded, that is, characteristics expressed by, for example, the spatial transfer function of the sound including reverberation. The sound field characteristics are not limited to a direct representation of the spatial transfer function; they may also be reference data for estimating the spatial transfer function, such as the shapes of structures such as walls at the recording location and the sound reflection and absorption characteristics of each structure.
 In this control example, information representing the sound field characteristics and information representing the position of each sound source in the sound field are described in broadcast metadata included in the broadcast wave, for example, in the metadata of each object-based audio signal included in the broadcast wave. The broadcast receiving device 100 acquires the sound field characteristics from the broadcast metadata included in the received broadcast wave, reflects them during audio rendering to generate the reverberation-added audio signal, and thereby enables sound reproduction with a high sense of presence. When the audio of a concert or live performance is recorded on an object basis and broadcast, the audio reproduced from the received broadcast wave tends not to reflect the reverberation characteristics of the concert hall or theater and may lack a sense of realism. This control example is particularly effective in solving such a problem.
 Note that the broadcast receiving device 100 may obtain the sound field characteristics via a network such as the Internet, using as a clue information that can identify the name of the recording location, such as a hall or theater, obtained from the broadcast program name and its additional information. For example, information in which hall or theater names are associated with their sound field characteristics is stored in advance in the information server 900 shown in FIG. 1. The broadcast receiving device 100 acquires the sound field characteristics associated with the identified hall or theater name from the information server 900 via the network through the communication unit. The broadcast receiving device 100 may also receive the reverberation as an object-based sound source, adjust its intensity, frequency characteristics, and the like according to the viewer's preferences and viewing environment, and synthesize it with the other sounds.
 Note that headphones are one example of an audio output device having a plurality of speakers. The audio output device may be a head mounted display, a speaker system, or the like, as described later. The recording location is the place where the audio was converted into electrical signals by microphones, pickups, or the like and recorded, for example, a concert hall or theater where a performance or play is held.
 The reverberation processing flow shown in FIG. 33 will now be described in detail. Block 20010 corresponds to the object renderer 10103. The processing when using headphones described above in (Embodiment 2) [Advanced audio signals] corresponds to the spatial position corrections 20012 and 20022. In addition, reverberation additions 20013 and 20023 and head-related transfer corrections 20014 and 20024 are performed.
 The individual sound 20011 is recorded with an individual microphone or the like and contains little reverberation, but the background sound 20021 may already contain reverberation. Therefore, in this control example, compared with the reverberation added to the individual sound 20011, the amount of reverberation added to the background sound 20021 may be made smaller, or reverberation addition to the background sound may be omitted.
 The reverberation additions 20013 and 20023 are performed by computation based on the sound field characteristics extracted from broadcast metadata, for example, the metadata of the object audio (individual sound for each sound source) signals, or by computation using a spatial transfer function simulated from the structure of the recording location. More specifically, the reverberation additions 20013 and 20023 are performed by, for example, convolution of an impulse response with each object audio or background sound, or filtering with frequency attenuation characteristics for each delay time. If the sound field characteristics are not described in the metadata of the object audio signals, the theater name may be extracted from the additional information of the program and the sound field characteristics of that theater may be obtained from the information server 900.
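As a minimal sketch of the impulse-response convolution mentioned above, the following pure-Python code convolves a dry signal with a toy room impulse response; the tap values are illustrative only, not taken from any actual sound field characteristics:

```python
def convolve(signal, impulse_response):
    """Discrete convolution: y[n] = sum over k of h[k] * x[n - k]."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for n, x in enumerate(signal):
        for k, h in enumerate(impulse_response):
            out[n + k] += x * h
    return out

def add_reverb(dry, impulse_response):
    """Reverberation-added signal = dry sound convolved with the room's
    impulse response (a direct-path tap followed by delayed, attenuated echoes)."""
    return convolve(dry, impulse_response)

# Toy impulse response: direct sound, then two attenuated echoes at later taps.
ir = [1.0, 0.0, 0.5, 0.25]
wet = add_reverb([1.0, 0.0, 0.0, 0.0], ir)
```

In practice the impulse response would come from the metadata-supplied sound field characteristics and an FFT-based convolution would be used for efficiency.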
 The broadcast receiving device 100 acquires sound source position information within the theater from the metadata of the object audio signals, and when a viewing position, that is, the position of a virtual audience member in the theater, is specified, it can determine the spatial transfer function from the sound source positions in the theater and the viewing position. Based on the determined spatial transfer function, the broadcast receiving device 100 generates the reverberation-added audio signal so that the reverberation-added audio approaches the sound that would be heard at the viewing position for each sound source. Generating the reverberation-added audio signal in this way makes it possible to reproduce audio with a higher sense of presence. If the sound field characteristics cannot be obtained directly, the spatial transfer function may be determined by simulation from information on the internal structure and materials of the theater. The internal structure and material information of the theater includes, for example, elements such as the size and shape of the room, the ceiling height, and the interior materials.
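Assuming a simple two-dimensional geometry, the direct-path component of such a spatial transfer function can be approximated from the source position and the viewing (listening) position alone, as in this illustrative sketch; a full estimate would also model wall reflections from the theater's internal structure:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at room temperature

def direct_path(source_pos, listening_pos):
    """Approximate the direct-path part of a spatial transfer function from
    geometry alone: propagation delay and 1/r amplitude attenuation."""
    dx = source_pos[0] - listening_pos[0]
    dy = source_pos[1] - listening_pos[1]
    r = math.hypot(dx, dy)                 # source-to-listener distance
    delay_s = r / SPEED_OF_SOUND           # arrival delay in seconds
    gain = 1.0 / max(r, 1.0)               # clamp to avoid blow-up near the source
    return delay_s, gain
```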
 The head-related transfer function expresses the relationship between the sound arriving from each direction and the sound reaching the entrance of the ear canal of each ear. Head-related transfer characteristics vary between listeners depending on the shape of the pinna and other factors. If no personally registered head-related transfer function is available, the head-related transfer correction 20014 converts the audio based on a standard head-related transfer function.
 The sound emitted from the headphones reaches the eardrum through the ear canal, and the listener perceives the sound. The ear canal transfer function from the headphones to the eardrum may vary not only between individual listeners but also depending on, for example, whether the headphones are a closed type covering the pinna or an earphone type inserted into the ear canal. The head-related transfer correction 20014 may perform not only correction processing based on the head-related transfer function but also correction processing based on the ear canal transfer function. In this case, it is desirable to identify the applicable head-related transfer function by, for example, acquiring the type information of the headphones and the identification information of the listener, and to perform correction processing based on the identified head-related transfer function. In case this information cannot be obtained, a plurality of typical head-related transfer functions may be prepared in advance, and the listener may audition and compare audio corrected with each head-related transfer function to select the one considered most suitable.
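The selection logic described here, falling back to a standard head-related transfer function when no personal registration is available, can be sketched as follows; the registry contents and profile names are hypothetical placeholders:

```python
STANDARD_HRTF = "standard"  # fallback profile used when no personal data exists

# Hypothetical registry: (listener_id, headphone_type) -> registered HRTF profile.
HRTF_REGISTRY = {
    ("alice", "closed"): "alice_closed",
    ("alice", "earbud"): "alice_earbud",
}

def select_hrtf(listener_id=None, headphone_type=None):
    """Pick a personally registered head-related transfer function when both
    the listener and the headphone type are known; otherwise fall back to
    the standard profile, as described in the text above."""
    if listener_id is not None and headphone_type is not None:
        return HRTF_REGISTRY.get((listener_id, headphone_type), STANDARD_HRTF)
    return STANDARD_HRTF
```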
 Note that FIG. 33 illustrates the case where there is one individual sound and one background sound; when there are a plurality of individual sounds or background sounds, similar processing may be performed for each, and the processed sounds may then be mixed. For an individual narration sound, the strength or presence of the reverberation addition processing may be changed depending on the type of individual sound, for example, by reducing or eliminating the reverberation to make it easier to hear. Although an example has been described in which the spatial position correction 20012, the reverberation addition 20013, and the head-related transfer correction 20014 are processed in sequence, a composite transfer function combining them may instead be calculated and used to process the individual sound. If the sound signal maintains directional information, for example, a sound signal in the HOA format, the head-related transfer correction may be performed collectively after the mixing 20015, or may be performed in the headphones 910. When it is performed in the headphones 910, the broadcast receiving device 100 does not need to know the type and orientation of the headphones 910, which makes it easier to handle each viewer individually when there are a plurality of viewers.
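The remark that the sequential corrections can be merged into one composite transfer function follows from the fact that cascaded linear time-invariant filters are equivalent to a single filter whose impulse response is the convolution of the stage impulse responses. A small sketch with illustrative tap values:

```python
def convolve(x, h):
    """Discrete convolution of two finite sequences."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n, xv in enumerate(x):
        for k, hv in enumerate(h):
            y[n + k] += xv * hv
    return y

def composite_ir(*stage_irs):
    """Cascaded LTI stages (position correction, reverberation, head-related
    transfer) collapse into one filter: convolve their impulse responses."""
    combined = [1.0]  # identity filter
    for ir in stage_irs:
        combined = convolve(combined, ir)
    return combined

spatial = [0.5]           # toy position correction: a simple gain
reverb = [1.0, 0.0, 0.5]  # toy reverberation impulse response
combined = composite_ir(spatial, reverb)
```

Processing a signal stage by stage and processing it once with `combined` give the same result, which is the basis for the optimization mentioned above.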
 Note that the theater walls that give rise to the reverberation are often relatively far from the sound sources and the audience. Considering this, the influence of the sound source positions on the reverberation can be regarded as small. Therefore, instead of transmitting the sound field characteristics as metadata, the reverberation may be approximated and transmitted as one sound source of the object audio. The intensity of the reverberation may also be made adjustable separately from the background sound according to the viewer's preference.
 FIG. 34A is a diagram showing an example of an audio output setting menu. In the audio setting menu of FIG. 34A, each tab indicates a setting target. By selecting a tab, the viewer can open the setting menu associated with that tab. The tab "Built-in SP" corresponds to the settings of the TV's built-in speakers, the tab "Optical IF" to the settings of the optical digital interface, the tab "ARC" to the settings of the HDMI Audio Return Channel, the tab "HP1" to the settings of Head Phone 1 (Bluetooth (registered trademark) connection), and the tab "HP2" to the settings of Head Phone 2 (assuming a 3.5φ mini jack). FIG. 34A shows a state in which the tab "HP1" is selected and the setting menu 20032 for Head Phone 1 is open.
 In the setting menu 20032 for HP1, "Model number hhh" represents the model number of the headphones. Based on this headphone model number, the broadcast receiving device 100 may obtain from the information server 900 the standard head-related transfer function information used for the head-related transfer correction 20014 in FIG. 33. "Channel-based On" indicates that the channel-based audio signal is enabled. The "▼" mark is for On/Off setting; when the cursor is placed on this mark, the options "On" and "Off" appear, allowing the viewer to make the On/Off setting. ">Details" is a button for opening a detailed setting menu; pressing this button opens the detailed setting menu for the corresponding setting target.
 "Object-based On" indicates that the object-based audio signal is enabled. When the object-based audio signal is On, a summary is displayed showing the total number of object sounds and how many of them are On. In the example of FIG. 34A, "3 of 5 objects selected" is displayed, indicating that there are five object sounds in total and that three of them are On. When "Object-based Off" is displayed, it means that all object sounds are Off.
 An example of the detailed setting menu for object-based signals is the detailed menu shown in FIG. 27. In the detailed menu shown in FIG. 27, On/Off can be set for each individual sound.
 "Reverberation On" indicates that reverberation addition is enabled. "Theater AAA" is the name of the theater whose reverberation is applied.
 FIG. 34B is a diagram showing an example of a detailed menu for reverberation settings. In the detailed reverberation setting menu, detailed settings can be made for the reverberation level and the listening point (the position at which the sound is heard). It is natural for the listening point, that is, the position at which the sound is heard, to coincide with the position from which the video is viewed when viewing content. However, the two may differ, for example, when the display size changes. Hereinafter, in the description of sound reproduction, the term listening point is used instead of viewing position.
 In the example of the detailed reverberation menu shown in FIG. 34B, a reverberation detail setting field 20033 is displayed on the left side of the screen, and a seating chart of the theater where the performance takes place, viewed from above, is displayed on the right side. The reverberation detail setting field 20033 includes a reverberation setting area for configuring the reverberation and a listening point setting area for configuring the listening point.
 In the reverberation setting area, the name of the selected theater and the effect strength of the reverberation are displayed, and the desired option can be selected from the candidates with the "▲" and "▼" marks. When the theater name can be extracted from program information, for example, program metadata, it is selected automatically based on that program information. FIG. 34B shows an example in which theater AAA is set. An alternative theater other than theater AAA can also be selected. An alternative theater is a theater other than the AAA theater where the performance is being held, for example, the BBB theater. When an alternative theater is selected, the acoustic effect of that alternative theater is simulated and reproduced. Since the sound field characteristics of an alternative theater, for example the BBB theater, cannot be obtained from the broadcast program, the theater names available from the information server 900 can be searched and set.
 The effect strength of the reverberation is initially set to the default or to the stored previous setting. The effect strength can be selected in, for example, ten steps, with the default value recommended by the program or audio metadata being 5. An effect strength of 0 means no reverberation. FIG. 34B shows an example in which the effect strength is set to 5. If the metadata of the reverberation information cannot be obtained, "-" is displayed, for example.
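One plausible mapping of the ten-step effect strength onto the rendering, with 0 muting the reverberation and the recommended value 5 giving the nominal level, is a wet/dry mix such as the following; the linear scaling rule is an assumption for illustration, not specified by the document:

```python
def mix_reverb(dry, wet, strength, default=5):
    """Mix the reverberant (wet) component into the dry signal, scaled by the
    menu's effect strength: 0 mutes the reverb, the recommended default of 5
    gives the nominal level, and 10 doubles it (linear scaling assumed)."""
    if not 0 <= strength <= 10:
        raise ValueError("strength must be in 0..10")
    g = strength / default
    n = max(len(dry), len(wet))
    dry = dry + [0.0] * (n - len(dry))   # zero-pad to a common length
    wet = wet + [0.0] * (n - len(wet))
    return [d + g * w for d, w in zip(dry, wet)]
```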
 In the listening point setting area, the setting of the selected listening point is displayed. The listening point setting can be selected from, for example, three options: "Standard (A)", "Synchronize with video", and "(B) setting". "Standard (A)" selects the default setting indicated by the program information or audio metadata. "Synchronize with video" is a setting in which the sound field is also synchronized when a portion of the video is enlarged, as shown in FIG. 34D described later. Note that "Synchronize with video" is also referred to as the "image and sound image linkage" mode. "(B) setting" is a setting in which the user determines the listening point by moving the cursor (B) on the seating chart on the right. FIG. 34B shows an example in which the default listening point 20034 (represented by mark A) is displayed at the center of the seating chart. FIG. 34B also shows an example in which "(B) setting" is selected and the listening point 20035 in this case (represented by mark B) is displayed toward the front of the seating chart. The listening point is moved, for example, by operating the "▲" and "▼" marks displayed near the listening point cursor.
 FIG. 34C is a diagram showing an example of a banner display indicating the reverberation processing state. FIG. 34C shows an example in which the entire stage of a program called the Maxell performance is displayed as the broadcast image, with a banner display 20041 shown when reverberation assuming the AAA theater is added.
 FIG. 34D is a diagram showing an example of the banner display when a portion of the broadcast image is enlarged by a predetermined remote control operation or the like. FIG. 34D shows an example in which the broadcast image of FIG. 34C is partially enlarged so that a dancer included in the broadcast image is displayed larger. The enlarged portion may be moved with the up, down, left, and right buttons of the remote control. When "Synchronize with video" is not selected as the listening point setting, that is, when the "image and sound image linkage" mode is Off, the spatial position correction of the sound is not linked even if a portion of the broadcast image is enlarged, and the sound field does not change. On the other hand, when "Synchronize with video" is selected as the listening point setting, that is, when the "image and sound image linkage" mode is On, the spatial position correction is linked to the partial enlargement of the broadcast image. For example, when a portion of the broadcast image is enlarged twofold at a constant field of view, processing is performed in which the distance between the listening point and the sound source is halved and the direction and level of the sound are also changed.
 Therefore, when the "image and sound image linkage" mode is On, the direction and level of the sound change as the image is enlarged, so the viewer gets, for example, the sensation of approaching the stage. On the other hand, when the "image and sound image linkage" mode is Off, the direction and level of the sound do not change even when the image is enlarged, so the viewer gets, for example, the sensation of watching through binoculars while remaining seated.
 Note that the "image and sound image linkage" mode may be settable independently for partial image enlargement and for full image display. It may also be possible to set the "image and sound image linkage" mode to On at or below a predetermined magnification and to Off at or above a predetermined magnification (for example, 2×).
 FIG. 34E is a diagram for explaining the viewing conditions before and after partial enlargement of the broadcast image. FIG. 34E shows an example of the performance venue of the broadcast program viewed from above. As an example, FIG. 34E shows an area 20051 displayed on the screen before partial enlargement and an area 20061 displayed on the screen after partial enlargement. Note that the performers 20052 are drawn as seen from the front to make the explanation easier to understand.
 Taking the displayed performers 20052 as a reference, the screen area 20061 after partial enlargement apparently corresponds to a smaller display area in real space than the screen area 20051 before partial enlargement, and the viewer can be regarded as having moved closer, from viewing point 20053 (represented by mark C) to viewing point 20063 (represented by mark D). That is, the distance from the viewer to the position corresponding to the display area can be regarded as having shrunk from the distance 20054 before partial enlargement to the distance 20064 after partial enlargement.
 Considering the viewing angle with respect to the rightmost of the performers 20052, the viewing angle 20065 after partial enlargement of the broadcast image is larger than the viewing angle 20055 before partial enlargement. Here, the state of being in a processing mode that changes the direction from which the sound comes in accordance with this change in viewing angle is referred to as the "image and sound image linkage" mode being On. When the "image and sound image linkage" mode is On, the performers appear larger through the partial enlargement of the broadcast image, and the direction from which the sound comes changes in conjunction. That is, in this case, the viewer can get the sensation of approaching the stage, that is, the performers, reproducing the experience of moving within the theater.
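The geometry above, where zooming in at a constant field of view shrinks the on-axis distance while lateral offsets remain, so that the viewing angle grows, can be sketched as follows; the model of moving the virtual listening point along the viewing axis and the 1/r gain rule are illustrative assumptions:

```python
import math

def zoomed_source(listener, source, zoom):
    """Approximate the 'image and sound image linkage' mode: at a constant
    field of view, a zoom factor z moves the virtual listening point forward
    so the on-axis distance shrinks by 1/z; the source's azimuth and level
    are then recomputed from the new geometry (1/r gain assumed)."""
    dx = source[0] - listener[0]                 # lateral offset (unchanged)
    dy = (source[1] - listener[1]) / zoom        # forward distance shrinks by 1/zoom
    r = math.hypot(dx, dy)
    azimuth = math.degrees(math.atan2(dx, dy))   # angle off the viewing axis
    gain = 1.0 / max(r, 1.0)
    return azimuth, gain
```

With a 2× zoom the off-axis angle of a stage-edge source grows and its level rises, matching the "approaching the stage" sensation described above.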
 On the other hand, the state of not being in a processing mode that changes the direction from which the sound comes in accordance with the change in viewing angle is referred to as the "image and sound image linkage" mode being Off. When the "image and sound image linkage" mode is Off, the performers appear larger through the partial enlargement of the broadcast image, but the direction from which the sound comes does not change. That is, in this case, the viewer can get the sensation of watching the stage through binoculars, reproducing the experience at a fixed position in the theater.
 Note that switching the "image and sound image linkage" mode between On and Off may be performed directly with a remote control button, or through a menu setting.
 <Speaker sound output>
 The description so far has assumed that sound is output from the headphones 910, but similar processing can be performed when sound with added reverberation is output from speakers. When sound is output from speakers, the sound that reaches the viewer's ears is the sound obtained through the head-related transfer function based on the sound of each speaker and the direction of that speaker. Therefore, even when sound is output from speakers, processing similar to that of FIG. 33 can be performed by applying the head-related transfer corrections 20014 and 20024 so that the desired sound reaches the viewer's ears. Furthermore, when the reverberation of the viewing environment cannot be ignored, the sound field characteristics of the viewing environment may be measured, and a correction that suppresses the reverberation of the viewing environment based on the measured sound field characteristics may be performed in the head-related transfer correction, thereby further enhancing the sense of being in the theater.
<HMD viewing>
Similar processing can also be performed when viewing with an HMD. When the viewer watches with an HMD, the HMD 920 receives video and audio data from the broadcast receiving device 100 and reproduces the video and sound, providing a viewing environment as if the viewer were watching the monitor unit 192 of the broadcast receiving device 100 and listening through the headphones 910. When the viewer puts on the HMD 920, or when the viewer initializes the viewing settings, the standard viewing state of television broadcasting may be established, that is, the state in which the video is displayed directly in front of the viewer.
The broadcast receiving device 100 may provide two modes for setting the viewer's viewing state: a mode in which it obtains attitude information of the HMD 920 from the HMD 920, generates video based on that attitude information, and transmits it to the HMD 920; and a mode in which the viewing state is fixed to the standard viewing state. In this case, the mode may be switched through a menu setting or a remote control button instruction. The broadcast receiving device 100 may also provide a mode that synchronizes the display of the monitor unit 192 with the HMD display, switching between the synchronous and asynchronous modes according to menu operations.
The broadcast receiving device 100 may also receive position information or acceleration information from the HMD 920 and detect changes in the position of the HMD 920, thereby grasping the viewer's movements and enlarging or shrinking the displayed video according to those movements to bring out a sense of presence. For example, when a movement of the viewer approaching the displayed video or stepping forward is detected, the displayed video is enlarged; when a movement of the viewer moving away from the displayed video or stepping back is detected, the displayed video is shrunk.
When performing such enlargement/reduction control of the displayed video, the user may be allowed to set the enlargement/reduction ratio of the displayed video according to the distance by which the viewer approaches or moves away from it. A user interface (for example, a remote control button or gesture detection) may also be provided for stopping the enlargement/reduction, or for restoring the default enlargement/reduction ratio, when the viewer moves to change the viewing location.
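The enlargement/reduction control described above can be sketched as follows. This is a minimal illustration under assumptions not stated in the embodiment: the change in viewer-to-display distance, estimated from the HMD position data, is mapped linearly to a zoom factor through a user-settable gain and clamped to a sensible range. The function and parameter names are hypothetical.

```python
def zoom_from_movement(d_ref: float, d_now: float, gain: float = 1.0,
                       z_min: float = 0.5, z_max: float = 4.0) -> float:
    """Map the viewer's forward/backward movement to a display zoom factor.

    d_ref: reference (initial) viewer-to-display distance in metres.
    d_now: current distance estimated from the HMD 920's position data.
    gain:  user-settable sensitivity (enlargement per unit of approach),
           corresponding to the user-configurable ratio described above.
    Moving closer (d_now < d_ref) enlarges the image; moving away shrinks
    it; the result is clamped to [z_min, z_max].
    """
    zoom = 1.0 + gain * (d_ref - d_now) / d_ref
    return max(z_min, min(z_max, zoom))
```

A "stop zoom" user-interface action would simply freeze the returned factor, and "restore default" would reset it to 1.0.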
The broadcast receiving device 100 may further provide a mode that changes the direction and perceived distance of the sound in accordance with the enlargement/reduction of the displayed video, that is, a mode that links the video and the sound image, together with a user interface for switching this mode On and Off.
The conversion processing of the video and sound image according to the position information, acceleration information, attitude information, and the like of the HMD 920 may also be performed on the HMD 920 side rather than on the broadcast receiving device 100 side. If the HMD 920 performs the video and sound image conversion, then even when the broadcast receiving device 100 provides the same video and audio data to the HMDs 920 of multiple viewers, each viewer can independently view video and sound matched to that viewer's own position and attitude.
(Example 3)
[Switching the sound image localization mode]
This example concerns the handling of switching of the sound image localization mode. In the examples described above, the broadcast receiving device 100 takes in information on the front direction of the user's face and performs audio processing so that the sound emitted from an audio output unit worn by the user, such as an HMD (Head Mounted Display), earphones, or headphones, localizes the sound image to the image shown on the screen of the broadcast receiving device 100 in response to changes in the front direction of the user's face. In this example, the mode in which the sound image is localized in this way, following changes in the front direction of the user's face, is called the variable sound image localization mode. Conversely, the mode in which the broadcast receiving device 100 performs audio output processing on the assumption that the front direction of the user's face is fixed is called the fixed sound image localization mode.
FIG. 14A is a diagram showing the planar positional relationship among the broadcast receiving device 100, an audio output unit such as an HMD, headphones, or earphones, and the user's head. The midpoint of the line segment connecting the left and right audio output units is the standard viewing position in this case. The forward direction of the straight line that passes through that midpoint and is perpendicular to the segment connecting the left and right audio output units is the front direction of the user's face.
FIG. 14B shows the case where the standard viewing position lies on the line (Y-axis) that passes through the center of the screen of the broadcast receiving device 100 and is orthogonal to the longitudinal direction (X-axis) of the screen. "θ" is the angle between the front direction of the user's face described with FIG. 14A and the direction of the screen center of the broadcast receiving device 100, and indicates how far the user's face is turned sideways from the screen center direction. Note that "θ" is an example of the "azimuth angle" in the present invention.
When the sound image localization mode described above is variable, the broadcast receiving device 100 takes in the position information of the left and right audio output units, computes "θ", and adjusts the sound information, including audio information, output from the left and right audio output units. When the sound image localization mode described above is fixed, the broadcast receiving device 100 performs the computation with "θ" set to a fixed value (for example, "θ" = 0) and adjusts the sound information, including audio information, output from the left and right audio output units.
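The geometry of FIGS. 14A and 14B can be sketched in code. The following is a minimal two-dimensional sketch whose function name and coordinate conventions are assumptions, not part of the embodiment: the standard viewing position is the midpoint of the segment connecting the left and right audio output units, the facing direction is the perpendicular to that segment, and θ is the signed angle between the facing direction and the direction of the screen center; in the fixed mode, θ is forced to 0.

```python
import math

def compute_theta(left_pos, right_pos, screen_center,
                  fixed: bool = False) -> float:
    """Estimate the azimuth θ (degrees) between the user's facing
    direction and the direction of the screen centre (FIG. 14B geometry).

    left_pos/right_pos: (x, y) positions of the left/right audio output units.
    screen_center:      (x, y) position of the display centre.
    With the sound image localization mode fixed, θ is forced to 0.
    """
    if fixed:
        return 0.0
    # Standard viewing position: midpoint of the inter-unit segment.
    mx = (left_pos[0] + right_pos[0]) / 2.0
    my = (left_pos[1] + right_pos[1]) / 2.0
    # Facing direction: 90-degree rotation of the left->right vector.
    ex, ey = right_pos[0] - left_pos[0], right_pos[1] - left_pos[1]
    fx, fy = -ey, ex
    # Direction from the standard viewing position to the screen centre.
    sx, sy = screen_center[0] - mx, screen_center[1] - my
    theta = math.degrees(math.atan2(sy, sx) - math.atan2(fy, fx))
    # Wrap into (-180, 180].
    return (theta + 180.0) % 360.0 - 180.0
```

When the user faces the screen center directly the function returns 0; turning the head sideways by 90 degrees yields |θ| = 90, matching the definition of the azimuth angle above.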
Here, the position of the audio output unit such as an HMD, earphones, or headphones can be obtained, for example, by recording, through user input, the position at which the user faces the center of the receiver screen, and then detecting subsequent changes in the direction of the user's face with a gyro sensor or the like mounted on the headphones. The position of the audio output unit can also be obtained by the broadcast receiving device 100 receiving, via the left and right audio output units, position information from a system such as GPS and performing the computation itself. Alternatively, the audio output unit may compute its own position from the position information of a system such as GPS and output the computed position information to the broadcast receiving device 100. The broadcast receiving device 100 and the audio output unit can exchange this position information by each being equipped with a communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
However, when an error occurs in the position information of the audio output unit, the sound may not be emitted appropriately for the direction of the user's face. For example, if the user's face is turned to the right of the screen front and the user perceives the sound as coming strongly from the right audio output unit, the user may find the direction of the emitted sound unnatural. In this example, therefore, when the user feels such unnaturalness, the broadcast receiving device 100 can set "θ" of FIG. 14B to zero degrees, compute the sound information including audio information, and output it. That is, on the assumption that the front direction of the user's face is toward the screen center, the broadcast receiving device 100 computes the sound information including audio information and outputs the computed sound information to the audio output unit such as earphones or headphones.
Note that for an object-based signal and an HOA-format signal, the signal output to the audio output unit is determined by setting "θ" to a predetermined fixed value (for example "0") and executing the computations given by the equations described above.
FIG. 35 is a flowchart showing the operation for switching the sound image localization mode in this example.
In step S201, it is determined whether the angle "θ" formed by the screen center direction and the front direction of the user's face is fixed or non-fixed. If the angle "θ" is fixed (step S201: Yes), the process proceeds to step S202; if the angle "θ" is non-fixed (step S201: No), the process proceeds to step S204.
Whether the angle "θ" formed by the screen center direction and the front direction of the user's face is fixed or non-fixed can be set and determined by various methods, which are described in detail later.
In step S202, as one example of fixing the angle "θ" formed by the screen center direction and the front direction of the user's face, the broadcast receiving device 100 receives information setting "θ" = "0" and computes the sound information, including audio information, with "θ" = "0". This computation may be performed in the audio decoders 146S and 146U of FIGS. 2F and 2G in the broadcast receiving device 100, and more specifically in the audio decoder 10100 of FIG. 15B, which is to be incorporated into the audio decoders 146S and 146U.
In step S203, the broadcast receiving device 100 outputs the information computed in the audio decoders 146S, 146U, etc. to the audio output unit such as an HMD, earphones, or headphones, and the audio output unit emits the computed information as sound to the user.
In step S204, the broadcast receiving device 100 receives information setting the angle "θ" formed by the screen center direction and the front direction of the user's face to be non-fixed; it then computes "θ", adds the computed value of "θ" to the attribute information of the sound information including audio information, and computes the information to be output to the audio output unit such as earphones or headphones. As described above, this computation may be performed in the audio decoders 146S and 146U of FIGS. 2F and 2G in the broadcast receiving device 100, and more specifically in the audio decoder 10100 of FIG. 15B, which is to be incorporated into the audio decoders 146S and 146U.
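As a minimal illustration of how the fixed/non-fixed decision of steps S201, S202, and S204 feeds the decoder computation, the sketch below substitutes a simple constant-power stereo panner for the actual rendering performed in the audio decoders 146S/146U; the function name and the panning law are assumptions for illustration only and do not represent the decoder's real signal processing.

```python
import math

def pan_gains(theta_deg: float, theta_fixed: bool):
    """Illustrative stand-in for the computation of steps S202/S204:
    derive constant-power left/right gains from the azimuth θ.

    S201: `theta_fixed` carries the fixed/non-fixed decision.
    S202: in fixed mode θ is replaced by 0 before the computation,
          so the sound image stays centred on the screen.
    S204: in non-fixed mode the measured θ is used as-is.
    """
    if theta_fixed:          # S202: θ := 0 (the predetermined fixed value)
        theta_deg = 0.0
    # Map θ in [-90, 90] degrees to a pan position t in [0, 1] and apply
    # constant-power (cos/sin) panning; gains satisfy L^2 + R^2 = 1.
    t = (max(-90.0, min(90.0, theta_deg)) + 90.0) / 180.0
    return math.cos(t * math.pi / 2.0), math.sin(t * math.pi / 2.0)
```

Step S203, emitting the computed signal through the audio output unit, lies outside this sketch.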
[Setting the sound image localization mode]
As an example of how to set the sound image localization mode, the case of using the remote control 180R will be described. FIG. 36A shows an example of the external appearance of the remote control 180R used to input a sound image localization mode setting instruction to the broadcast receiving device 100 of this example. The remote control 180R shown in FIG. 36A has the same key layout as the remote control 180R shown in FIG. 12B. However, the remote control 180R of FIG. 36A according to this example has a function of bringing up a mode screen for selecting whether the angle "θ" formed by the screen center direction and the front direction of the user's face is fixed or non-fixed.
The remote control 180R includes a power key 180R1 for turning the broadcast receiving device 100 on and off (standby on/off), cursor keys (up, down, left, right) 180R2 for moving the cursor up, down, left, and right, an enter key 180R3 for confirming the item at the cursor position as the selected item, and a return key 180R4. To switch the audio mode, the menu key 180RA is pressed.
FIG. 36B shows an example of a banner display for setting the sound image localization mode, shown on the monitor unit 192 of the broadcast receiving device 100 when the menu key 180RA is pressed. When the menu key 180RA is pressed, a banner display 192B1 indicating the menu categories appears on the monitor unit 192. The shape of the banner display 192B1, its display position on the monitor unit 192, and its items can be set arbitrarily by the broadcast receiving device 100. However, the items of the banner display 192B1 include at least an item for setting the sound image localization mode (as one example, an item labeled "Audio settings"; the name is not limiting).
With the banner display 192B1 shown, moving the cursor with the cursor keys (up, down, left, right) 180R2 to select "Audio settings" and pressing the enter key 180R3 brings up the banner display 192B2. The banner display 192B2 shows items related to audio settings and includes at least an item for setting the sound image localization mode (as one example, an item labeled "Sound image localization"; the name is not limiting). With the banner display 192B2 shown, moving the cursor with the cursor keys 180R2 to select "Sound image localization" and pressing the enter key 180R3 brings up the banner display 192B3. The banner display 192B3 shows items related to sound image localization and includes at least items for setting the sound image localization mode to "fixed" or "non-fixed" (as one example, items labeled "Fixed" and "Non-fixed"; the names are not limiting; for example, "Non-fixed" could be renamed "Variable" while the function of the sound image localization mode remains the same).
With the banner display 192B3 shown, moving the cursor with the cursor keys (up, down, left, right) 180R2 to select "Fixed" and pressing the enter key 180R3 causes the remote control 180R to perform the following operation: the remote control 180R transmits to the broadcast receiving device 100 information fixing the angle "θ" formed by the screen center direction and the front direction of the user's face (hereinafter, "fixed information"). The "fixed information" can be communicated because the remote control 180R and the broadcast receiving device 100 are each equipped with a communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
Likewise, with the banner display 192B3 shown, moving the cursor with the cursor keys (up, down, left, right) 180R2 to select "Non-fixed" and pressing the enter key 180R3 causes the remote control 180R to perform the following operation: the remote control 180R transmits to the broadcast receiving device 100 information making the angle "θ" formed by the screen center direction and the front direction of the user's face non-fixed (hereinafter, "non-fixed information"). The "non-fixed information" can be communicated because the remote control 180R and the broadcast receiving device 100 are each equipped with a communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
Note that the "fixed information" and "non-fixed information" may be assigned to the color keys (blue, red, green, yellow) 180RD, so that each press of a color key 180RD toggles between "fixed information" and "non-fixed information".
On receiving the "fixed information" or "non-fixed information", the broadcast receiving device 100 either uses a fixed value (for example, 0 degrees) for the angle "θ" formed by the screen center direction and the front direction of the user's face or computes the angle "θ", and then computes the information to be output to the audio output unit such as an HMD, earphones, or headphones. As described above, this computation may be performed in the audio decoders 146S and 146U of FIGS. 2F and 2G in the broadcast receiving device 100, and more specifically in the audio decoder 10100 of FIG. 15B, which is to be incorporated into the audio decoders 146S and 146U.
According to the advanced digital broadcast service system including the broadcast receiving device 100, which has some or all of the functions of the examples of the present disclosure described above, the broadcast receiving device 100 can fix "θ" of FIG. 14B (for example, to zero degrees) and compute and output the sound information including audio information, regardless of which direction the user's face is actually pointing. It is therefore possible to fix the sound image localization direction when the front direction of the user's face cannot be obtained accurately, or when the user finds the direction of the emitted sound unnatural.
(Modification of Example 3)
[Correcting θ when the sound image localization mode is non-fixed]
This modification concerns the setting of θ when the sound image localization mode is non-fixed. In the non-fixed case of the example described above, the broadcast receiving device 100 takes in information on the front direction of the user's face and performs audio processing so that the sound emitted from an audio output unit such as earphones or headphones localizes the sound image to the image shown on the screen of the broadcast receiving device 100 in response to changes in the front direction of the user's face. However, when the front direction of the user's face is computed with a gyro sensor, a wrongly set initial value causes the computed front direction to remain wrong thereafter. This modification therefore describes a method of setting "θ" by taking the actual direction of the user's face as the new front direction.
FIG. 37 is a flowchart showing the operation for correcting θ when the sound image localization mode of this modification is non-fixed.
In step S301, the broadcast receiving device 100 determines whether the reset button has been pressed. If the reset button has been pressed (step S301: Yes), the broadcast receiving device 100 proceeds to step S302; if not (step S301: No), it proceeds to step S303. Here, a reset button may be newly provided on the remote control 180R (not shown). Alternatively, as described above, pressing the menu key 180RA, selecting a reset area (not shown) displayed on the monitor unit 192 of the broadcast receiving device 100 with the cursor keys (up, down, left, right) 180R2, and pressing the enter key 180R3 may be treated as a press of the reset button. A color key (blue, red, green, yellow) 180RD may also be configured as the reset button. Furthermore, the reset button may be provided on the HMD, earphones, or headphones, with the information that it has been pressed output to the broadcast receiving device 100. Communicating that the reset button has been pressed is realized by equipping each device with a communication unit such as a Bluetooth (registered trademark) communication unit, an NFC communication unit, or an infrared communication unit.
In step S302, the broadcast receiving device 100 sets "θ" by taking the direction of the user's face at the moment the reset button was pressed as the new front direction of the user's face.
In step S303, the broadcast receiving device 100 computes the audio information in the audio decoders 146S, 146U, etc. using the "θ" based on the newly set front direction of the user's face, and outputs the computed information to the audio output unit such as an HMD, earphones, or headphones; the audio output unit emits the computed information as sound to the user.
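The reset operation of steps S301 to S303 can be sketched as follows. This is a hypothetical illustration (the class and method names are assumed, not part of the embodiment) of taking the gyro yaw reading at the moment of the reset press as the new front direction, after which θ is measured relative to that stored direction.

```python
class FrontDirectionTracker:
    """Track the user's facing direction from a gyro-style yaw reading
    and support the reset operation of FIG. 37 (steps S301 and S302).

    Pressing reset stores the current raw yaw as the new front direction,
    so the θ handed to the decoder (step S303) becomes 0 at that instant.
    """

    def __init__(self) -> None:
        # Raw yaw initially taken as "facing the screen centre" (degrees).
        self.front_yaw = 0.0

    def reset(self, raw_yaw_deg: float) -> None:
        # S302: the direction at the moment of the reset press becomes
        # the new front direction.
        self.front_yaw = raw_yaw_deg

    def theta(self, raw_yaw_deg: float) -> float:
        # S303: θ relative to the (possibly re-set) front direction,
        # wrapped into (-180, 180].
        t = raw_yaw_deg - self.front_yaw
        return (t + 180.0) % 360.0 - 180.0
```

If the gyro's initial value was wrong, every subsequent θ is offset by the same error; one reset press removes that offset, which is exactly the correction this modification provides.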
According to the advanced digital broadcast service system including the broadcast receiving device 100, which has some or all of the functions of the modification of the present disclosure described above, even when the broadcast receiving device 100 has misrecognized the front direction of the user's face, the reset button makes it possible to correct the front direction of the user's face to the direction the user is actually facing.
As described above, according to this control example, the broadcast receiving device identifies the location where the program's audio was recorded from the program metadata and content, obtains the sound field information of that location from a server, generates reverberation, produces a synthesized sound in which that reverberation is added to the original audio, and outputs it to the audio output device. Through this operation, the viewer can experience the same sense of presence as if present at the recording location. Further, when a listening point within the recording location is designated, the broadcast receiving device generates a synthesized sound corresponding to listening to the sound source at that point, so the viewer can reproduce the sound heard from a preferred seat position. In addition, the broadcast receiving device moves the listening point within the recording location in accordance with partial enlargement operations on the broadcast image, virtually changing the direction from which the sound arrives, so the viewer can enjoy a more immersive experience in which video and audio are synchronized.
Examples of embodiments of the present invention have been described above using Examples 1, 2, and 3, but the configurations that realize the technique of the present invention are not limited to these examples, and various modifications are conceivable. For example, part of the configuration of one example can be replaced with the configuration of another example, and the configuration of another example can be added to the configuration of one example. All of these belong to the scope of the present invention. The numerical values, messages, and the like appearing in the text and drawings are also merely examples, and using different ones does not impair the effects of the present invention.
 Some or all of the functions of the present invention described above may be realized in hardware, for example by designing them as integrated circuits. They may also be realized in software by having a microprocessor unit or the like interpret and execute operating programs that realize the respective functions. Hardware and software may be used together.
 Note that the software that controls the broadcast receiving device 100 may be stored in advance in the ROM 103 and/or the storage unit 110 of the broadcast receiving device 100 at the time of product shipment. It may also be acquired after product shipment from a server device on the Internet 800 via the LAN communication unit 121, or acquired from a memory card, optical disc, or the like via the expansion interface unit 124 or the like. Similarly, the software that controls the mobile information terminal 700 may be stored in advance in the ROM 703 and/or the storage unit 710 of the mobile information terminal 700 at the time of product shipment. It may also be acquired after product shipment from a server device on the Internet 800 via the LAN communication unit 721 or the mobile telephone network communication unit 722, or acquired from a memory card, optical disc, or the like via the expansion interface unit 724 or the like.
 The control lines and information lines shown in the drawings are those considered necessary for the explanation, and do not necessarily represent all control lines and information lines in a product. In practice, almost all components may be considered to be interconnected.
 100: broadcast receiving device, 101: main control unit, 102: system bus, 103: ROM, 104: RAM, 110: storage unit, 121: LAN communication unit, 124: expansion interface unit, 125: digital interface unit, 130C, 130T, 130L, 130B: tuner/demodulation unit, 140S, 140U: decoder unit, 180: operation input unit, 180R: remote controller, 191: video selection unit, 192: monitor unit, 193: video output unit, 194: audio selection unit, 195: speaker unit, 196: audio output unit, 200, 200C, 200T, 200S, 200L, 200B: antenna, 201T, 201L, 201B: conversion unit, 300, 300T, 300S, 300L: radio tower, 400C: cable television station head end, 400: broadcast station server, 500: service provider server, 600: mobile telephone communication server, 600B: base station, 700: mobile information terminal, 800: Internet, 800R: router device, 900: information server, 910: headphones, 920: HMD

Claims (26)

  1.  A broadcast receiving device capable of receiving per-sound-source signals via broadcast waves, the broadcast receiving device comprising:
     a broadcast receiving unit that receives the broadcast waves;
     an audio output unit that comprises a plurality of speakers and outputs audio in accordance with the per-sound-source signals transmitted from the broadcast receiving device; and
     a control unit,
     wherein the control unit:
     determines playback positions of the audio of the per-sound-source signals in accordance with arrangement information of the plurality of speakers; and
     calculates signals corresponding to 22.2ch audio channels from the per-sound-source signals, and then converts the signals corresponding to the 22.2ch audio channels into signals for the audio output unit.
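The two-stage path in claim 1 (render per-sound-source signals onto a fixed intermediate layout, then fold that layout down to the actual output speakers) can be illustrated with gain computations on a toy layout. This is a hypothetical sketch: the claim's 22.2ch layout is stood in for by a four-speaker ring, and the downmix weights are illustrative only.

```python
# Hypothetical sketch of claim 1's two-stage rendering: object -> intermediate
# ring layout (stand-in for 22.2ch) -> actual output speakers (stereo here).

RING_AZ = [0.0, 90.0, 180.0, 270.0]   # azimuths of the intermediate layout

def pan_to_ring(azimuth_deg):
    """Linear amplitude panning of one source between the two nearest ring speakers."""
    n = len(RING_AZ)
    step = 360.0 / n
    az = azimuth_deg % 360.0
    i = int(az // step)
    frac = (az - i * step) / step
    gains = [0.0] * n
    gains[i] = 1.0 - frac
    gains[(i + 1) % n] = frac
    return gains

# Illustrative fold-down weights: ring channel -> (left, right) output speaker.
DOWNMIX = {0: (0.5, 0.5), 1: (0.0, 1.0), 2: (0.5, 0.5), 3: (1.0, 0.0)}

def ring_to_stereo(ring_gains):
    left = sum(g * DOWNMIX[c][0] for c, g in enumerate(ring_gains))
    right = sum(g * DOWNMIX[c][1] for c, g in enumerate(ring_gains))
    return left, right

# A source at 45 degrees (front-right) splits evenly between ring channels 0 and 1,
# and the fold-down then weights it toward the right output speaker:
g = pan_to_ring(45.0)
print(g)                  # [0.5, 0.5, 0.0, 0.0]
print(ring_to_stereo(g))  # (0.25, 0.75)
```

An actual receiver would perform the same kind of computation per audio sample and per 22.2ch channel, using the speaker arrangement information the claim refers to.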
  2.  The broadcast receiving device according to claim 1, wherein
     the broadcast waves convey to the broadcast receiving unit a parameter describing, for each per-sound-source signal, whether the user is permitted to set a playback position, and
     the playback position setting is selected by the control unit from preset positions transmitted via the broadcast waves, or is set by the user within a limited range.
  3.  The broadcast receiving device according to claim 1, further comprising
     an operation input unit that receives a control signal from a remote controller,
     wherein the user can set the playback positions of the audio of the per-sound-source signals using the remote controller.
  4.  The broadcast receiving device according to claim 1, further comprising
     a communication unit capable of receiving data via a network,
     wherein the communication unit receives, via the network, an additional audio signal different from the per-sound-source signals, and
     the control unit reproduces the audio of the additional audio signal together with the audio of the per-sound-source signals received via the broadcast waves.
  5.  The broadcast receiving device according to claim 1, wherein,
     when the broadcast receiving unit receives an emergency broadcast, the control unit invalidates the audio playback settings made by the user and changes the audio playback settings to emergency broadcast settings.
  6.  A content protection method in a receiving device that receives a digital broadcast in which content including first control information, second control information, third control information, and an audio signal is transmitted, wherein
     a combination of the first control information and the second control information indicates a copy control state of the content, and
     the third control information indicates the type of the audio signal included in the content,
     the method comprising:
     a receiving step of receiving the digital broadcast; and
     an output step of outputting content included in the received digital broadcast to an external device,
     wherein the copy control state applied when the content is output in the output step is determined according to the copy control state of the content when received in the receiving step and the type of the audio signal included in the content.
  7.  A content protection method in a receiving device that receives a digital broadcast in which content including first control information, second control information, third control information, and an audio signal is transmitted, wherein
     a combination of the first control information and the second control information indicates a copy control state of the content, and
     the third control information indicates the type of the audio signal included in the content,
     the method comprising:
     a receiving step of receiving the digital broadcast; and
     an output step of outputting content included in the received digital broadcast to an external device,
     wherein, for content received in a state in which the combination of the first control information and the second control information indicates a "copy one generation" copy control state or a "copy with count limitation" copy control state and the third control information indicates that an object-based audio signal is included, when stream output is performed via an IP interface of the receiving device in the output step, the content is output under a copy control state stricter than the copy control state applied when, in the output step, the receiving device decodes the audio signal of the content and performs digital audio output.
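Claims 6 and 7 describe an output-side copy control decision that depends on both the received copy control state and the audio signal type, with an IP stream output of object-based audio treated more strictly than decoded digital audio output. A hypothetical policy function might look like the following; the state names, their ordering, and the "tighten by one step" rule are illustrative assumptions, not taken from any broadcast standard.

```python
# Hypothetical sketch of the output-side copy control decision in claims 6-7.
# State names and the strictness ordering are assumptions for illustration.

STRICTNESS = ["copy_free", "copy_one_generation", "copy_count_limited", "no_more_copies"]

def output_copy_state(received_state, has_object_audio, via_ip_stream):
    """Return the copy control state applied on output.

    For "copy one generation" / count-limited content that carries object-based
    audio, an IP stream output is tightened by one step relative to the state
    used for decoded digital audio output of the same content.
    """
    state = received_state
    if (received_state in ("copy_one_generation", "copy_count_limited")
            and has_object_audio and via_ip_stream):
        next_idx = min(STRICTNESS.index(received_state) + 1, len(STRICTNESS) - 1)
        state = STRICTNESS[next_idx]
    return state

# Decoded digital audio output keeps the received state...
print(output_copy_state("copy_one_generation", True, via_ip_stream=False))
# ...while an IP stream output of the same content is stricter:
print(output_copy_state("copy_one_generation", True, via_ip_stream=True))
```

The point of the sketch is only the ordering constraint the claim states: the IP-stream path must never end up with a looser state than the decoded-audio path.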
  8.  The content protection method according to claim 7, further comprising
     an accumulation step of accumulating the content received in the receiving step,
     wherein the output step is performed on the content after it has been accumulated in the accumulation step.
  9.  A broadcast receiving device capable of receiving broadcast waves including per-sound-source signals, the broadcast receiving device comprising:
     a broadcast receiving unit that receives the broadcast waves;
     a transmission unit that transmits an audio signal, based on the per-sound-source signals included in the received broadcast waves, to an audio output device having a plurality of speakers; and
     a control unit,
     wherein the control unit:
     obtains, based on information included in the broadcast waves, sound field characteristics that are the acoustic characteristics of the location where the audio of the sound sources was recorded;
     generates per-sound-source reverberant sound signals based on the per-sound-source signals and the sound field characteristics; and
     generates, based on the per-sound-source signals and the per-sound-source reverberant sound signals, reverberation-added audio signals in which reverberant sound is added to the audio of the per-sound-source signals, and
     the transmission unit transmits the reverberation-added audio signals to the audio output device as the audio signal.
  10.  The broadcast receiving device according to claim 9, wherein
     the control unit obtains the sound field characteristics from information included in the broadcast waves.
  11.  The broadcast receiving device according to claim 9, further comprising
     a communication unit that receives data via a network,
     wherein the control unit identifies the recording location based on information included in the broadcast waves, and obtains the sound field characteristics of the recording location from a server via the communication unit and the network.
  12.  The broadcast receiving device according to claim 9, wherein
     the control unit is given a designated listening position within the recording location, and
     generates the reverberation-added audio signals so that the reverberation-added audio approaches the audio that would be heard if the per-sound-source audio were listened to at the listening position.
  13.  The broadcast receiving device according to claim 12, wherein
     the control unit changes the listening position in accordance with a partial-enlargement display operation on a broadcast image.
  14.  The broadcast receiving device according to claim 9, wherein
     the control unit generates the reverberation-added audio signals so that the reverberation-added audio is corrected based on the head-related transfer characteristics of the viewer.
  15.  The broadcast receiving device according to claim 9, wherein
     the recording location is a hall or a theater.
  16.  The broadcast receiving device according to claim 9, wherein
     the audio output device is headphones or a head-mounted display.
  17.  A reverberation addition processing method in a broadcast receiving device capable of receiving broadcast waves including per-sound-source signals, the method comprising:
     receiving the broadcast waves;
     obtaining, based on information included in the received broadcast waves, sound field characteristics that are the acoustic characteristics of the location where the audio of the sound sources was recorded;
     generating per-sound-source reverberant sound signals based on the per-sound-source signals and the sound field characteristics; and
     generating, based on the per-sound-source signals and the per-sound-source reverberant sound signals, reverberation-added audio signals in which reverberant sound is added to the audio of the per-sound-source signals.
  18.  A broadcast receiving device capable of receiving object-based signals or HOA (Higher Order Ambisonics) signals via broadcast waves, the broadcast receiving device comprising:
     a broadcast receiving unit that receives the broadcast waves;
     a transmission unit that transmits, to an audio output unit, an audio signal generated in accordance with the object-based signals or HOA signals included in the broadcast waves received by the broadcast receiving unit; and
     a control unit,
     wherein the control unit converts the object-based signals or HOA signals into the audio signal transmitted to the audio output unit in accordance with arrangement information of the audio output unit.
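One standard way to convert an Ambisonics signal into speaker feeds in accordance with a speaker arrangement, which is the kind of conversion claim 18's control unit performs for HOA input, is a basic decoder that evaluates the sound field in each speaker's direction. The sketch below is an assumption-laden illustration using first-order Ambisonics (W, X, Y) and a simple sampling decoder; a real receiver would use higher orders and a more elaborate decoder design.

```python
import math

# Illustrative first-order Ambisonics (W, X, Y) sampling decoder: each
# speaker's gain is the sound field evaluated in that speaker's direction.
# This is a minimal sketch, not the decoder mandated by any broadcast spec.

def decode_foa(w, x, y, speaker_azimuths_deg):
    gains = []
    for az in speaker_azimuths_deg:
        a = math.radians(az)
        gains.append(0.5 * (w + x * math.cos(a) + y * math.sin(a)))
    return gains

# A plane wave arriving from the front (0 degrees) encodes as W=1, X=1, Y=0.
# Decoded to speakers at 0 and 180 degrees, the front speaker gets (almost)
# full gain and the rear speaker (almost) none:
print(decode_foa(1.0, 1.0, 0.0, [0.0, 180.0]))   # approximately [1.0, 0.0]
```

With head-mounted output (claim 19 and the azimuth-angle claims below), the speaker azimuths passed to such a decoder would come from the arrangement information reported by the audio output unit.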
  19.  The broadcast receiving device according to claim 18, wherein
     the audio output unit is one of earphones, headphones, and an HMD (Head Mounted Display).
  20.  The broadcast receiving device according to claim 18, wherein
     the arrangement information is output from the audio output unit and input to the broadcast receiving device.
  21.  The broadcast receiving device according to claim 18, wherein
     the arrangement information includes an azimuth angle formed between the frontal direction of the face of the user wearing the audio output unit and the direction of the screen center of the broadcast receiving device.
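The azimuth angle of claim 21, between the user's facing direction and the screen-center direction, can be used to counter-rotate the rendered sound scene so that sources stay anchored to the screen as the head turns. A minimal sketch follows, with hypothetical names and a hypothetical sign convention (positive azimuth to the listener's right):

```python
def source_azimuth_in_head_frame(source_az_deg, head_azimuth_deg):
    """Rotate a screen-relative source direction into the listener's head frame,
    given the azimuth angle between the user's facing direction and the screen
    center (claim 21). Angles in degrees, result normalized to [0, 360).
    The name and sign convention are illustrative assumptions.
    """
    return (source_az_deg - head_azimuth_deg) % 360.0

# Head turned 30 degrees to the left of the screen center: a source at the
# screen center should now be rendered 30 degrees to the listener's right...
print(source_azimuth_in_head_frame(0.0, -30.0))   # 30.0
# ...whereas with the azimuth angle fixed at zero (claim 23) it stays in front:
print(source_azimuth_in_head_frame(0.0, 0.0))     # 0.0
```

The variable-azimuth mode of claims 22 and 24 corresponds to feeding a live head angle into such a rotation, while the fixed-at-zero mode bypasses it.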
  22.  The broadcast receiving device according to claim 21, further comprising
     an operation input unit that receives a control signal from a remote controller,
     wherein the user can use the remote controller to set whether the azimuth angle is fixed or variable.
  23.  The broadcast receiving device according to claim 22, wherein,
     when the azimuth angle is fixed, the angle is zero degrees.
  24.  The broadcast receiving device according to claim 22, having
     an azimuth angle setting mode in which, when the azimuth angle is variable, the frontal direction of the face of the user wearing the audio output unit is newly set and the azimuth angle is newly set.
  25.  The broadcast receiving device according to claim 24, wherein
     the azimuth angle setting mode is set from the remote controller or from the audio output unit.
  26.  A method for controlling a broadcast receiving device capable of receiving object-based signals or HOA (Higher Order Ambisonics) signals via broadcast waves, the method comprising:
     a broadcast receiving step of receiving the broadcast waves;
     a transmitting step of transmitting, to an audio output unit, an audio signal generated in accordance with the object-based signals or HOA signals included in the broadcast waves received in the broadcast receiving step; and
     a control step of converting the object-based signals or HOA signals into the audio signal transmitted to the audio output unit in accordance with arrangement information of the audio output unit.
PCT/JP2023/029279 2022-08-03 2023-08-10 Broadcast reception device, content protection method, processing method for adding reverberation sound, and control method for broadcast reception device WO2024029634A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2022-123799 2022-08-03
JP2022123799A JP2024021157A (en) 2022-08-03 2022-08-03 Broadcast receiving device and content protection method
JP2022151952A JP2024046518A (en) 2022-09-22 2022-09-22 Broadcast receiving device and reverberation processing method
JP2022-151952 2022-09-22
JP2022181025 2022-11-11
JP2022-181025 2022-11-11

Publications (1)

Publication Number Publication Date
WO2024029634A1 true WO2024029634A1 (en) 2024-02-08

Family

ID=89849523

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/029279 WO2024029634A1 (en) 2022-08-03 2023-08-10 Broadcast reception device, content protection method, processing method for adding reverberation sound, and control method for broadcast reception device

Country Status (1)

Country Link
WO (1) WO2024029634A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6229327A (en) * 1985-07-31 1987-02-07 Toshiba Electric Equip Corp Information system of multiple dwelling house
JP2003255955A (en) * 2002-02-28 2003-09-10 Pioneer Electronic Corp Method and system for sound field control
JP2013157680A (en) * 2012-01-27 2013-08-15 Hitachi Consumer Electronics Co Ltd Receiving device and digital broadcast transmitting and receiving system
JP2014045282A (en) * 2012-08-24 2014-03-13 Nippon Hoso Kyokai <Nhk> Reverberation adding device, reverberation adding program
WO2016002738A1 (en) * 2014-06-30 2016-01-07 ソニー株式会社 Information processor and information-processing method
JP2018046319A (en) * 2016-09-12 2018-03-22 マクセル株式会社 Broadcast receiver
JP2019161672A (en) * 2019-06-27 2019-09-19 マクセル株式会社 system
JP2021136465A (en) * 2020-02-21 2021-09-13 日本放送協会 Receiver, content transfer system, and program

Similar Documents

Publication Publication Date Title
JP2023126310A (en) Broadcast receiving apparatus
JP2024023593A (en) Transmission wave processing method
JP2024026573A (en) Display control method
JP2024014961A (en) Digital broadcasting modulated wave transmission method
JP2024026572A (en) Display control method
JP2024026228A (en) Broadcast receiving device
WO2019221080A1 (en) Broadcast reception device and method for processing carrier wave
WO2024029634A1 (en) Broadcast reception device, content protection method, processing method for adding reverberation sound, and control method for broadcast reception device
JP2022132341A (en) Transmission wave processing method
JP2024046518A (en) Broadcast receiving device and reverberation processing method
JP2024021157A (en) Broadcast receiving device and content protection method
JP2024053891A (en) Broadcasting system, broadcast receiving device, broadcasting station device, and broadcasting method
JP2019201330A (en) Transmission wave processing method
JP7428507B2 (en) Broadcast receiving device
JP7460740B2 (en) Transmission method of modulated digital broadcasting waves
JP7448340B2 (en) Time control method
JP7388891B2 (en) Display control method
WO2023112666A1 (en) Broadcast reception device, setting method, transmission method, display control method, and recording medium
JP7414489B2 (en) Display control method
JP7405579B2 (en) Display control method
WO2022168525A1 (en) Digital broadcast reception device
JP7197288B2 (en) Broadcast receiver
JP2022174771A (en) Digital broadcast receiver and display method
JP2022163274A (en) Display device and reproduction method
JP2023133369A (en) system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23850187

Country of ref document: EP

Kind code of ref document: A1